The biofeedback system used in this study consists of four modules: a torso tilt measurement module, a data processing and control module, a visual biofeedback module, and a kinesthetic biofeedback module. The torso tilt measurement module is composed of a smartphone attached to the patient by means of a dedicated leather belt worn around the waist at the L2-L4 lumbar spine region. This measurement module has been shown to be a reliable tool for assessing body sway parameters during quiet stance and gait [24, 25]. The smartphone (Pantech Vega IM-A850L [26]) continuously runs a dedicated Android application that measures trunk tilt as mediolateral (ML) and anteroposterior (AP) angles and streams the data to the data processing and control module over a Wi-Fi socket connection. The data processing and control module consists of a personal computer (PC) running a purpose-built program written in Visual C++. This software decodes the data received from the smartphone and generates the corresponding outputs for visual and haptic biofeedback. The PC is connected to two display screens: one displays information for the operator and the other functions as the visual biofeedback module (Fig. 1). The visual biofeedback screen is placed in front of the test subject and displays a visual cue that helps the subjects balance themselves. The kinesthetic biofeedback module consists of a haptic device connected to the PC; in our balance training system the Phantom Omni®, a commercially available low-cost haptic device, is used [27]. The smartphone used in this research featured a quad-core 1.5 GHz CPU with 2 GB of RAM and ran the Android® (Jelly Bean) operating system. Tilt data were streamed at 100 Hz, and the system provided a torso tilt angle measurement resolution better than 0.1°.
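As an illustration of the smartphone-to-PC data path, the following C++ sketch shows how the PC-side program might decode one tilt sample received over the socket connection. The wire format (a comma-separated "ML,AP,timestamp" line) and all identifiers are assumptions for illustration only; the actual protocol of the smartphone application is not specified here.

```cpp
// Minimal sketch of decoding one tilt sample on the PC side.
// The comma-separated "ML,AP,timestamp" line format is an assumption.
#include <sstream>
#include <string>

struct TiltSample {
    double ml_deg;   // mediolateral trunk tilt (degrees)
    double ap_deg;   // anteroposterior trunk tilt (degrees)
    double t_sec;    // sample timestamp (seconds)
};

// Parse one line received over the Wi-Fi socket (assumed format).
bool parseTiltSample(const std::string& line, TiltSample& out) {
    std::istringstream ss(line);
    char comma1 = 0, comma2 = 0;
    ss >> out.ml_deg >> comma1 >> out.ap_deg >> comma2 >> out.t_sec;
    return static_cast<bool>(ss) && comma1 == ',' && comma2 == ',';
}
```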
The Phantom Omni® can be connected to and controlled by a PC. The interface and implementation of the Omni device for providing kinesthetic haptic biofeedback have been successfully tested [24]. The device can produce directional force along the $X_P$, $Y_P$ and $Z_P$ axes; to create the virtual reference surface required for the concept of light touch, the movement of the device handle along the $Y_P$ axis was restrained, allowing motion of the handle in the $X_P$–$Z_P$ plane only. The output force from the haptic device was always less than 1 N, and the handle was allowed to deviate if a subject exerted a force larger than 1 N. The handle of the Omni device was held at the home position (0, 0, 0) at the beginning of the haptic biofeedback. The ML and AP trunk tilt values were used to calculate the directions and magnitudes of the forces delivered through the handle. The relationships between tilt angle and the magnitude and direction of the output haptic force are given by Eqs. (1) and (2):
$$ F_{X} = -k \times \left( \frac{trunk\;tilt_{ML}}{range\;X_{P}} \right) \quad (\text{N}) $$
(1)
$$ F_{Z} = k \times \left( \frac{trunk\;tilt_{AP}}{range\;Z_{P}} \right) \quad (\text{N}) $$
(2)
where the “trunk tilt” is the ML or AP tilt of the subject, calculated relative to the initial value recorded at the start of the experiment, and the “range” in $X_P$ and $Z_P$ is the maximum permitted workspace of the haptic device (between −60 and +60 mm in both axes) [24]. The stiffness “k” was set to 0.05 N/mm to reduce jerkiness, providing smooth force transfer without affecting body sway. The trunk tilt information was also used to provide visual biofeedback on an LCD screen connected to the PC. The screen was placed in front of the subject at head height at a distance of 1 m, so that the subject could easily and comfortably maintain an upright posture while attending to the feedback on the screen. Before the start of the experimental trials, the subject’s trunk angle feedback was represented as a circle at the center of the screen; during the trials, the visual biofeedback consisted of the motion of this circle in accordance with the trunk tilt variations captured by the waist-mounted smartphone. The AP trunk tilt was mapped to the vertical motion of the circle and the ML trunk tilt to the horizontal motion. The software generated this visual biofeedback at a refresh rate of 50 Hz (approximate latency of 40 ms). To provide multimodal biofeedback, haptic and visual biofeedback were delivered to the subjects simultaneously.
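As a concrete illustration of Eqs. (1) and (2) and of the screen mapping described above, the following C++ sketch computes the haptic force components and the circle position from one tilt sample. The function names, the screen resolution, the display gain, and the exact saturation of the commanded force at 1 N are assumptions for illustration and are not taken verbatim from the system software.

```cpp
// Sketch of the tilt-to-feedback mapping (illustrative assumptions noted).
#include <algorithm>
#include <cmath>

struct HapticForce { double fx_N; double fz_N; };   // forces along X_P and Z_P
struct CirclePos   { int x_px; int y_px; };         // circle centre on the screen

// Eqs. (1) and (2): trunk tilt (relative to the initial value) -> handle force.
// k = 0.05 N/mm and the +/-60 mm workspace are given in the text; the clamp to
// 1 N reflects the stated force limit, but its exact implementation is assumed.
HapticForce tiltToForce(double tiltML, double tiltAP) {
    const double k = 0.05;          // stiffness (N/mm)
    const double range_mm = 60.0;   // workspace limit in X_P and Z_P (mm)
    HapticForce f;
    f.fx_N = -k * (tiltML / range_mm);   // Eq. (1)
    f.fz_N =  k * (tiltAP / range_mm);   // Eq. (2)
    f.fx_N = std::max(-1.0, std::min(1.0, f.fx_N));  // keep |F| below 1 N
    f.fz_N = std::max(-1.0, std::min(1.0, f.fz_N));
    return f;
}

// Visual biofeedback mapping: ML tilt -> horizontal and AP tilt -> vertical
// motion of the circle (updated at 50 Hz). The assumed 1920x1080 screen and
// the pixels-per-degree gain are illustrative values only.
CirclePos tiltToCircle(double tiltML_deg, double tiltAP_deg) {
    const int centreX_px = 960, centreY_px = 540;
    const double gain_px_per_deg = 40.0;
    CirclePos p;
    p.x_px = centreX_px + static_cast<int>(std::lround(gain_px_per_deg * tiltML_deg));
    p.y_px = centreY_px - static_cast<int>(std::lround(gain_px_per_deg * tiltAP_deg));
    return p;
}
```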
Eleven healthy young subjects (9 males and 2 females; age 27.1 ± 3.1 years, weight 78.3 ± 6.6 kg, height 169.9 ± 9.2 cm) were recruited to evaluate the effectiveness of the biofeedback provided by the proposed system. None of the subjects had a history of sensorimotor or neurological disorders, and none had visual deficits other than adequately corrected loss of visual acuity. All subjects gave written informed consent in accordance with the rules of our local Ethics Committee.
To experimentally test the effects of multimodal biofeedback, the subjects were asked to maintain their balance for 30 s while standing barefoot in prescribed postures on a foam platform. The platform measured 600 × 600 × 150 mm; high-resilience foam with a density of 48 kg/m³ and a tensile strength of 83 kPa was used to simulate soft ground conditions. The young healthy subjects were required to assume two distinct postures while standing on the platform: the one-foot stance (P1) and the tandem Romberg stance (P2), as shown in Fig. 2.
Four biofeedback conditions were applied to each subject in each posture. In the no-biofeedback condition (F1), the subjects relied on their natural balancing capabilities to maintain an upright stance. The other three conditions offered different forms of biofeedback: in the haptic biofeedback condition (F2), the subjects held the handle of the Phantom Omni device to obtain balance cues; in the visual biofeedback condition (F3), the subjects used the visual cues provided on the display screen to balance themselves; and in the multimodal biofeedback condition (F4), the subjects used both haptic and visual biofeedback simultaneously.
The appropriate use of the haptic and visual biofeedback for assisting balance control was explained to all subjects. The surrounding environment was kept free of distracting stimuli, and the subjects were instructed to remain silent. A rest period of 60 s was provided between trials of each condition. The order of postures and biofeedback conditions was randomized for the young healthy subjects.
The ML and AP trunk tilt values were analyzed using MATLAB® software. The projection of trunk tilt (PT) was calculated from the trunk tilt angles and the height at which the smartphone was attached, as given by Eqs. (3) and (4):
$$ PT_{ML} = trunk\;tilt_{ML} \times h \quad (\text{cm}) $$
(3)
$$ PT_{AP} = trunk\;tilt_{AP} \times h \quad (\text{cm}) $$
(4)
where “h” is the height of the smartphone’s attachment point on the subject’s trunk, measured from the ground. Since the tilt angles are small, PT can be linearized as in Eqs. (3) and (4). Similar to our approach, other researchers have used the trunk tilt projection derived from an electromagnetic sensor to identify balance and stability behavior and to classify individuals on the basis of age, gender, height and weight [28–31]. Mean velocity displacement (MVD), planar deviation (PD), ML trajectory (MLT) and AP trajectory (APT) were calculated as body sway parameters using Eqs. (5)–(8):
$$ MVD = \frac{1}{n}\sum_{i} \frac{\sqrt{\left(PT_{ML}(i) - PT_{ML}(i-1)\right)^{2} + \left(PT_{AP}(i) - PT_{AP}(i-1)\right)^{2}}}{t_{i} - t_{i-1}} \quad (\text{cm/s}) $$
(5)
$$ PD = \sqrt{\sigma^{2}_{PT_{ML}} + \sigma^{2}_{PT_{AP}}} \quad (\text{cm}) $$
(6)
$$ MLT = \sum_{i} \left| PT_{ML}(i+1) - PT_{ML}(i) \right| \quad (\text{cm}) $$
(7)
$$ APT = \sum_{i} \left| PT_{AP}(i+1) - PT_{AP}(i) \right| \quad (\text{cm}) $$
(8)
where “i” is the index of the tilt data, “n” is the total number of data values and “t” is time. MVD is the mean of all PT velocities; changes in the ML and AP directions are combined to yield a single velocity value. PD is defined as the square root of the sum of the variances (σ²) of the PT displacement in the ML and AP directions; the variance of the PT displacement indicates how far the PT is spread out. Similarly, the sums of the changes in the ML and AP projections of tilt yield MLT and APT, respectively. Larger values of these parameters indicate greater balance difficulty. A two-way analysis of variance (ANOVA) was used to identify the interaction effects of posture (one-foot stance, tandem Romberg stance) and biofeedback (no feedback, haptic, visual, and multimodal) on body sway. Furthermore, the main effects were analyzed using one-way ANOVA, and Tukey’s HSD test was used for post hoc analysis.
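For illustration, the following C++ sketch computes the PT projections of Eqs. (3)–(4) and the sway parameters of Eqs. (5)–(8) from a recorded tilt trace. All names are illustrative assumptions (the original analysis was performed in MATLAB®), and the tilt angles are assumed to be converted from degrees to radians before the small-angle projection is applied.

```cpp
// Sketch of the sway-parameter computation per Eqs. (3)-(8); names are
// illustrative only, since the original analysis was done in MATLAB.
#include <cmath>
#include <cstddef>
#include <vector>

struct SwayParams {
    double mvd_cm_s;  // mean velocity displacement, Eq. (5)
    double pd_cm;     // planar deviation, Eq. (6)
    double mlt_cm;    // ML trajectory, Eq. (7)
    double apt_cm;    // AP trajectory, Eq. (8)
};

// tiltML_deg/tiltAP_deg: trunk tilt angles (degrees), t_sec: timestamps (s),
// h_cm: height of the smartphone attachment above the ground (cm).
SwayParams computeSwayParams(const std::vector<double>& tiltML_deg,
                             const std::vector<double>& tiltAP_deg,
                             const std::vector<double>& t_sec,
                             double h_cm) {
    SwayParams p{0.0, 0.0, 0.0, 0.0};
    const std::size_t n = tiltML_deg.size();
    if (n < 2 || tiltAP_deg.size() != n || t_sec.size() != n) return p;

    // Eqs. (3)-(4): projection of trunk tilt (small-angle linearization,
    // assuming the angles are first converted to radians).
    const double deg2rad = 3.14159265358979323846 / 180.0;
    std::vector<double> ptML(n), ptAP(n);
    for (std::size_t i = 0; i < n; ++i) {
        ptML[i] = tiltML_deg[i] * deg2rad * h_cm;
        ptAP[i] = tiltAP_deg[i] * deg2rad * h_cm;
    }

    double sumVel = 0.0;
    for (std::size_t i = 1; i < n; ++i) {
        const double dML = ptML[i] - ptML[i - 1];
        const double dAP = ptAP[i] - ptAP[i - 1];
        sumVel   += std::sqrt(dML * dML + dAP * dAP) / (t_sec[i] - t_sec[i - 1]);
        p.mlt_cm += std::fabs(dML);   // Eq. (7)
        p.apt_cm += std::fabs(dAP);   // Eq. (8)
    }
    p.mvd_cm_s = sumVel / static_cast<double>(n);   // Eq. (5): sum divided by n

    // Eq. (6): planar deviation from the variances of the PT displacements.
    double meanML = 0.0, meanAP = 0.0;
    for (std::size_t i = 0; i < n; ++i) { meanML += ptML[i]; meanAP += ptAP[i]; }
    meanML /= static_cast<double>(n);
    meanAP /= static_cast<double>(n);
    double varML = 0.0, varAP = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        varML += (ptML[i] - meanML) * (ptML[i] - meanML);
        varAP += (ptAP[i] - meanAP) * (ptAP[i] - meanAP);
    }
    varML /= static_cast<double>(n);
    varAP /= static_cast<double>(n);
    p.pd_cm = std::sqrt(varML + varAP);

    return p;
}
```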