PLoS One. 2020 Oct 29;15(10):e0241479. doi: 10.1371/journal.pone.0241479

Virtual reality as a tool for balance research: Eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

Lorenz Assländer 1,*, Stephan Streuber 1,2
Editor: Eric R Anson
PMCID: PMC7595375  PMID: 33119679

Abstract

Virtual reality (VR) technology is commonly used in balance research due to its ability to simulate real-world experiences under controlled experimental conditions. However, several studies have reported considerable differences in balance behavior in real-world environments as compared to virtual environments presented in a head-mounted display. Most of these studies were conducted more than a decade ago, at a time when VR was still struggling with major technical limitations (delays, limited field of view, etc.). In the meantime, VR technology has progressed considerably, enhancing its capacity to induce the feeling of presence and behavioral realism. In this study, we addressed two questions: Has VR technology now reached a point where balance is similar in real and virtual environments? And does the integration of visual cues for balance depend on the subjective experience of presence? We used a state-of-the-art head-mounted VR system and a custom-made balance platform to compare balance when viewing (1) a real-world environment, (2) a photo-realistic virtual copy of the real-world environment, (3) an abstract virtual environment consisting of only spheres and bars (‘low presence’ VR condition), and, as reference, (4) a condition with eyes closed. Body sway of ten participants was measured in three different support surface conditions: (A) quiet stance, (B) stance on a sway-referenced surface, and (C) surface tilting following a pseudo-random sequence. A two-way repeated measures ANOVA and post hoc analyses revealed no significant differences in body sway between viewing the real-world environment and the photo-realistic virtual copy. In contrast, body sway was increased in the ‘low presence’ abstract scene and further increased with eyes closed. Results were consistent across platform conditions. Our results support the hypothesis that state-of-the-art VR has reached a point of behavioral realism at which balance in photo-realistic VR is similar to balance in a real environment. Presence, as measured by the IPQ presence questionnaire, was lower in the abstract virtual condition than in the photo-realistic condition. Thus, our results indicate that spatial presence may be a moderating factor, but further research is required to confirm this notion. We conclude that virtual reality is a valid tool for balance research, but that the properties of the virtual environment affect the results.

Introduction

Balance control is important for everyday behaviors such as walking, standing, social interactions, dancing, or sports. During these activities, the nervous system controls the body’s center of mass counteracting external perturbations, such as gravity [1]. Age and various pathologies such as Parkinson’s disease or stroke can deteriorate balance control [2], leading to increased risk of falls and injury [3, 4]. Standing balance is often used as an indicator for these pathologies [5, 6]. Studying balance control is important for understanding its underlying mechanisms and for developing tailored interventions [7, 8]. Conventional experimental paradigms used to study balance control—especially sensory integration mechanisms—often require costly technical equipment such as motion platforms or tilt rooms. These facilities are not accessible to most researchers and clinicians. Here, we test the use of consumer-based Virtual Reality (VR) systems as an efficient low-cost alternative for the study of balance control.

Virtual Reality is the presentation of a computer-generated environment that translates the user’s behaviors and actions into sensory experiences by replacing sensory feedback using display technologies [9]. In case of head mounted display VR systems (HMDs), stereoscopic images of the virtual scene are rendered from an egocentric viewing perspective obtained from the user’s head position and orientation in real time. Due to its ability to simulate rich and vivid real-world experiences, VR presented through HMDs has become a promising methodological tool for studying human perception and behavior [10–12]. Unlike traditional experimental paradigms, VR embeds users into a synthetic computer-generated surrounding. Researchers can use VR to simulate real world experiences and, at the same time, precisely control and manipulate perceptual information. These features, unique to VR, open the possibility to conduct behavioral and psychophysical experiments under more natural and yet controlled experimental conditions, enhancing the ecological validity of the results [13].

A key component of VR is the concept of immersion and presence. Immersion refers to the capacity of a VR system to induce an “inclusive, extensive, surrounding and vivid illusion of reality” [14]. Hence, the degree of immersion mainly depends on technical properties of a VR system such as a large field-of-view, high resolution or multi-sensory stimulation. Presence, on the other hand, is a psychological construct that refers to the subjective experience or illusion of ‘being there’ in the virtual environment [14]. Presence is (among other factors) affected by visual properties of the environment where photo-realistic virtual environments lead to higher presence scores compared to rather abstract renderings [15]. High degrees of immersion and presence lead to behavioral realism—users respond similarly in virtual and real environments [16]. This makes VR a powerful tool to study human behavior. One specific application is the experimental analysis of the visual contribution to balance control.

Vision is one of three sensory systems used by the central nervous system (CNS) to maintain balance during upright stance, the others being the vestibular system and the proprioceptive reference to the support surface [1]. State-of-the-art HMD VR technology allows researchers to manipulate the visual scene almost without limitations, providing unique experimental access to the sensory integration of visual cues. However, open questions remain: Is balance viewing a virtual scene comparable to balance viewing a real-world scene? Does the visual contribution to balance depend on the subjective experience of the virtual environment (i.e. immersion, presence, etc.)?

Vision research has a long history of using simulated visual inputs and investigating body sway behavior in dependence on different scene presentations. Three different technologies should be distinguished: 2D screen projections [17–20], 3D projections on a 2D screen using stereo goggles [21–23], and head mounted displays (HMDs). While the difference between 2D and 3D scenes is obvious, 3D visualization on a screen has some fundamental differences to 3D visualization on an HMD [24]. HMDs require online updating of the display when the head is moving, which induces time delays and errors between the head movement and the shift of the scene, but they provide a full 3D environment not restricted to a screen. One major problem is the conflict between vergence of the eyes and accommodation of the lens that is present in all 3D environments. This conflict was shown to affect the integration of visual cues for balance control [25].

Very few studies have compared balance behavior in real and virtual environments using HMDs. Two studies found little or no improvement relative to eyes-closed spontaneous sway when viewing a stationary virtual visual scene [26], independent of the field-of-view limitations of HMDs [24]. However, VR technology is evolving rapidly, increasing the field of view, optical resolution, and tracking accuracy of HMDs, as well as reducing screen update latencies and device weight. Only one study, by Robert et al. [27], compared real and virtual environments using a more recent device. These authors used a filmed 3D representation of a laboratory room as the VR environment. Two balance measures were compared: 1) spontaneous sway, i.e. the small body oscillations present during unperturbed upright stance, and 2) more dynamic tasks from the Berg balance scale, such as standing up from a chair or standing on one foot, which are subjectively rated from zero (unable) to four (independent) [28]. No differences in performance were found between virtual and real-world scenes. The findings of Robert et al. [27] suggest that technology may have evolved to a level where the visual input to balance in virtual reality has effects similar to viewing a real-world visual scene. However, the evidence is still scarce.

In balance research, external perturbations such as support surface tilts are frequently applied to probe the control mechanism [1]. The relation between perturbation and sway response provides detailed information on the control mechanism maintaining balance, exceeding the information content of spontaneous sway and other purely observational measurements [29]. Therefore, we used support surface tilts to perturb balance in addition to spontaneous sway measurements and compared sway behavior in two stationary virtual scenes to eyes closed and eyes open viewing a real-world scene. The first goal of this study was to test whether balance control is similar in VR and in the real world. If it is, we expect similar sway response patterns when participants view the real-world environment and when they view a photo-realistic rendering of the same environment in VR, but different sway response patterns in both of these conditions as compared to standing with the eyes closed. The second aim of this study was to test the effect of the visual richness of the virtual scene on balance control. We hypothesized that different levels of visual richness would modulate the subjective experience of presence, which in turn would affect balance control. Specifically, we hypothesized that a high level of visual richness (e.g. a photo-realistic rendering) would cause an increased feeling of presence, leading to sway responses in VR similar to those in the real world. Conversely, we expected a low feeling of presence for a scene with low visual richness (e.g. an impoverished, abstract scene), leading to deteriorated balance control similar to standing with the eyes closed.

Methods

Subjects and ethics statement

Ten young and healthy subjects (age 25.4 ± 4.7 years; height 171 ± 11 cm; weight 65.5 ± 12.0 kg; 5 female, 5 male) participated in the study after giving written informed consent. Exclusion criteria were orthopedic problems, balance-related problems, concussions, and a history of epilepsy. Subjects were recruited via an online platform where mostly university students are registered. The study was conducted in agreement with the Declaration of Helsinki in its latest revision and was approved by the University of Konstanz ethics board (IRB20KN07-004).

Experimental setup

Subjects wore an HMD (VIVE PRO EYE, HTC Corporation, Taoyuan City, Taiwan) with a resolution of 1440 x 1600 pixels per eye and a diagonal field of view of 110 degrees. The stereoscopic stimulus was rendered on an Nvidia GeForce RTX 2070 graphics card with 8 GB of GDDR memory at an update frequency of 90 Hz. During the experiment, participants stood on a custom-built device with a tiltable support surface, sway rods to measure body sway, and a force sensor to measure the torque applied at the ankle joints (equivalent to center of pressure measurements). The tilt axis was 8.8 cm above the support surface, thus approximately through the ankle joint axis (Fig 1C). Body sway was measured using two sway rods connected to potentiometers on one end and guided by hooks attached at hip and shoulder level. The torque was measured with a force sensor mounted 30 cm below the rotation axis of the tilt platform. With ankle joint and platform axes aligned, ankle torque generated by the subjects resulted in a proportional reaction force at the force sensor. This measurement was only used when the platform was stationary, i.e. when no platform acceleration contributed to the torque measurement. A real-time PC (Education real-time target machine, Speedgoat GmbH, Liebefeld, Switzerland) running Simulink (The MathWorks, Natick, USA) was used to control the device and record the data from the balance device at a 1000 Hz sampling rate.
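
As an illustration of how the force measurement relates to ankle torque and center of pressure, the following MATLAB sketch shows one possible conversion. It is not the authors' actual processing code; the variable names and values are hypothetical, and the COP approximation assumes quiet stance with a vertical ground reaction force close to body weight.

    % Sketch: ankle torque and approximate anterior-posterior COP from the
    % reaction force measured 30 cm below the platform tilt axis (valid only
    % for a stationary platform). Names and values are illustrative.
    lever_arm  = 0.30;                        % m, distance sensor to tilt/ankle axis
    g          = 9.81;                        % m/s^2
    body_mass  = 65.5;                        % kg, example value
    F_reaction = 45;                          % N, example reaction force sample

    torque_ankle = F_reaction * lever_arm;    % N*m, ankle torque
    x_cop = torque_ankle / (body_mass * g);   % m, approximate COP relative to the ankle axis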

Fig 1. Experimental setup.

Fig 1

(A) Virtual laboratory scene (LAB). The scene is a detailed reconstruction of the real-world view (EO). (B) Virtual abstract scene (ABS). (C) Scheme of the experimental setup. Subjects did not wear the HMD in conditions EO and EC.

Visual conditions

Body sway was measured in four different visual conditions: eyes open with a view of the real lab space (EO); a virtual reconstruction of the lab viewed through the HMD (LAB, Fig 1A); an abstract virtual scene viewed through the HMD, containing a cloud of spheres as well as vertical bars in the periphery that provide rich optic flow information and depth cues (ABS, Fig 1B); and eyes closed without wearing the HMD (EC). The virtual laboratory scene was 3D modeled in Blender and Unity3D based on measurements and photographs taken from the real laboratory environment. Subjects’ presence in the virtual environment and potential nausea from wearing the virtual reality system were assessed following each of the two virtual visual conditions using the igroup Presence Questionnaire (IPQ; [30]) and the Simulator Sickness Questionnaire (SSQ; [31]), respectively. The IPQ was conducted with a scale from 1–5 instead of the recommended scale from 1–6.

Platform conditions

Three different platform conditions were tested for each visual condition: 1) fixed support surface, 2) sway referenced support surface, and 3) surface tilting in a pseudo-random sequence.

For the fixed surface, sway was measured for 45 seconds. Sway referencing is a condition in which the platform tilts with the subject [32]. It is achieved by measuring the tilt of the subject’s leg segment using the hip sway rod and by online calculation and control of the platform tilt at each sample time using the real-time PC. The calculation is tuned such that swaying forward and backward does not change the ankle angle as it would on a fixed surface. The proprioceptive reference to the support surface therefore contains virtually no information about body orientation in space and thus does not contribute to balance control, increasing the contribution of visual and vestibular inputs [1]. Body sway during sway referencing was measured for 45 seconds. The sway referencing condition immediately followed the quiet stance condition (i.e. both were measured in the same trial). For the surface tilt condition, the tilt stimulus was based on a 20-s long pseudo-random ternary sequence [1]. The sequence consists of eighty 0.25-s long states with positive (+1.78°/s), negative (-1.78°/s), or zero velocity, constructed in pseudo-random order using shift registers [33]. The platform tilt angle followed the integrated signal, which had a peak-to-peak tilt amplitude of 4°. The subjective experience during these pseudo-random stimuli is that of small toes-up and toes-down movements with unpredictable changes in direction. A total of 13 sequences were concatenated, resulting in a trial length of 260 s.
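
As a rough illustration of how such a tilt stimulus can be constructed, the MATLAB sketch below integrates a ternary velocity sequence into a tilt angle. This is not the authors' implementation: the actual stimulus used a maximal-length shift-register sequence [33], whereas the sketch uses a random ternary sequence as a stand-in and simply rescales the result to 4° peak-to-peak; all names are illustrative.

    % Sketch of a pseudo-random ternary tilt stimulus (stand-in for the
    % shift-register construction of [33]); names and scaling are illustrative.
    fs        = 1000;                       % Hz, sampling rate of the real-time PC
    state_dur = 0.25;                       % s, duration of one velocity state
    n_states  = 80;                         % states per 20-s sequence
    v_peak    = 1.78;                       % deg/s, velocity magnitude

    states   = randi([-1 1], 1, n_states);              % random ternary states (-1, 0, +1)
    velocity = repelem(states * v_peak, state_dur*fs);  % deg/s, sample-by-sample velocity
    tilt     = cumsum(velocity) / fs;                   % deg, integrate velocity to tilt angle
    tilt     = tilt - mean(tilt);                       % center around zero
    tilt     = tilt * 4 / (max(tilt) - min(tilt));      % rescale to 4 deg peak-to-peak

    % A true maximal-length sequence returns to its starting angle, so the 13
    % concatenated repetitions join smoothly; this random stand-in does not.
    trial = repmat(tilt, 1, 13);                        % 13 sequences -> 260-s trial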

Center of mass calibration routine

A 120-second long calibration routine was used to obtain the anterior-posterior whole body center of mass (com) position without assumptions on segment mass distributions [1]. Subjects were instructed to make very slow movements in the ankle and hip joints within the movement range typically seen during the experiment. In these quasi-static conditions, the center of pressure position (xcop), obtained from the torque measurement, can be used as a projection of the com position (xcom; [34]):

xcop(t) ≈ xcom(t)    (1)

A least-squares regression of the center of pressure on the hip and shoulder translations, obtained from the sway rods and trigonometric calculations, was used to calculate calibration values. In conditions with larger and more dynamic body sway, where Eq 1 does not hold anymore, these calibration values can be used to calculate the com position based on the hip and shoulder movements:

xcom(t) = a + b * xhip(t) + c * xsho(t)    (2)

Finally, using an estimate of the body com height (hcom; [35]), angular body com sway (θcom) was calculated and used for all further analyses:

θcom(t) = arcsin(xcom(t) / hcom)    (3)
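
A minimal MATLAB sketch of Eqs 1–3 is given below. It uses synthetic stand-in traces, since the actual calibration data come from the sway rods and the force sensor; all variable names and example values are hypothetical.

    % Sketch of the com calibration (Eqs 1-3); synthetic data for illustration only.
    fs    = 1000;
    t     = (0:1/fs:120)';                        % 120-s calibration trial
    x_hip = 0.02*sin(2*pi*0.05*t);                % m, slow hip translation (stand-in)
    x_sho = 0.03*sin(2*pi*0.05*t + 0.2);          % m, slow shoulder translation (stand-in)
    x_cop = 0.4*x_hip + 0.6*x_sho + 0.001*randn(size(t));  % m, quasi-static COP (Eq 1)
    h_com = 0.95;                                 % m, assumed com height, e.g. from [35]

    % Least-squares fit of x_cop on hip and shoulder translation -> a, b, c (Eq 2)
    A     = [ones(size(x_hip)) x_hip x_sho];
    coeff = A \ x_cop;                            % coeff = [a; b; c]

    % Apply the calibration to a trial (here, for simplicity, the same traces)
    x_com     = coeff(1) + coeff(2)*x_hip + coeff(3)*x_sho;   % Eq 2
    theta_com = asind(x_com / h_com);             % Eq 3, angular com sway in degrees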

Procedures

After obtaining written informed consent, anthropometric measures were taken from subjects. Subjects briefly familiarized themselves with the HMD and both virtual scenes (LAB, ABS) in a seated position prior to the experiments. Subjects were then positioned on the platform and the sway rods were attached. After the calibration routine, subjects familiarized themselves with the different tilt stimulus conditions and the virtual lab scene (LAB) during a 150-s long warm-up session. We confirmed that subjects were comfortable with the setup before running the visual conditions EO, LAB, ABS, and EC in randomized order. Within each visual condition, the two stimulus sequences (pseudo-random tilt, and quiet stance with fixed surface followed by sway-referenced support surface) were run in randomized order. During all experimental conditions including the warm-up, subjects wore noise-canceling headphones and listened to audio books or podcasts to avoid auditory orientation cues and to distract from the balance task. Following each of the two virtual reality conditions, subjects were asked to verbally answer the questionnaires (SSQ and IPQ).

Data analysis and statistics

Recorded data were analyzed using Matlab (The MathWorks, Natick, USA). For quiet stance and sway referencing trials, com trajectories were low-pass filtered to remove measurement noise (2nd-order Butterworth filter; cutoff 5 Hz) and the first 15 seconds were discarded. For the remaining 30 seconds, the anterior-posterior sway path was calculated by summing the absolute sample-to-sample differences. The sway path was then divided by trial time to obtain a measure of average sway velocity (s) for each subject, which was used for statistical analysis.
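
A minimal MATLAB sketch of this processing step is shown below. It assumes a 45-s com sway trace sampled at 1000 Hz; the use of zero-phase filtering (filtfilt) is an assumption, as the paper only specifies a 2nd-order Butterworth filter with a 5 Hz cutoff, and all names and the stand-in trace are illustrative.

    % Sketch of the sway velocity computation for quiet stance and
    % sway-referenced trials; theta_com is a stand-in com sway trace (deg).
    fs        = 1000;
    theta_com = cumsum(randn(45*fs, 1)) / fs;   % stand-in 45-s trace for illustration

    [b, a] = butter(2, 5/(fs/2));               % 2nd-order Butterworth, 5 Hz cutoff
    com_f  = filtfilt(b, a, theta_com);         % zero-phase low-pass filter (assumption)

    com_f         = com_f(15*fs+1:end);         % discard the first 15 s, keep 30 s
    sway_path     = sum(abs(diff(com_f)));      % anterior-posterior sway path (deg)
    sway_velocity = sway_path / 30;             % average sway velocity (deg/s)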

For pseudo-random surface tilt trials, the first of the thirteen sequences was discarded to avoid transients at the beginning of the tilt perturbations, and each cycle was centered around zero (by subtracting its mean). The arithmetic mean and the standard deviation across the remaining 12 sequences were calculated for each subject to obtain a periodic sway component, as a measure of the response to the tilt stimulus, and a random sway component, i.e. the sway not evoked by the stimulus, respectively. Sway power of the periodic and the random component was calculated as the sum of the squared trace.
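
The decomposition into periodic and random components can be sketched in MATLAB as follows, assuming theta_com holds the com sway of one 260-s tilt trial sampled at 1000 Hz (stand-in data and illustrative names, not the authors' code):

    % Sketch: periodic and random sway components of a pseudo-random tilt trial.
    fs        = 1000;
    theta_com = randn(260*fs, 1);            % stand-in 260-s com sway trace (deg)
    seq_len   = 20*fs;                       % samples per 20-s sequence

    x = reshape(theta_com, seq_len, 13);     % one column per sequence repetition
    x = x(:, 2:end);                         % discard the first sequence (transient)
    x = x - mean(x, 1);                      % center each sequence around zero

    periodic = mean(x, 2);                   % sway evoked by the tilt stimulus
    random_c = std(x, 0, 2);                 % sway not evoked by the stimulus

    power_periodic = sum(periodic.^2);       % sway power, periodic component
    power_random   = sum(random_c.^2);       % sway power, random component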

JASP [36] was used for all statistical analyses. Parameters were statistically compared using a two-way repeated measures ANOVA with the factors ‘visual condition’ and ‘platform condition’. Further, differences of the visual conditions with respect to eyes open were tested using simple contrasts. Post hoc comparisons of visual conditions within each platform condition were calculated using one-way repeated measures ANOVAs and simple contrasts with respect to EO.

Results

One subject showed extraordinarily large sway in all platform conditions. The data of this subject are marked red in the plots and were not included in mean values and statistical analyses. Table 1 shows the results of the repeated measures ANOVA. Visual condition, platform condition, and their interaction showed significant effects (p < .001). Table 2 shows the contrasts comparing LAB, ABS, and EC to EO. There was no significant difference between EO and the virtual reality condition LAB; however, there was a difference for ABS and EC (p < .001). In the following, we describe the changes for individual platform conditions and the questionnaire results.

Table 1. Within subjects differences between platform and visual conditions.

  Sum of Squares df Mean Square F p η2
platform condition 3.136 3 1.045 53.775 < .001 0.526
Residual 0.467 24 0.019
visual condition 0.914 3 0.305 35.317 < .001 0.153
Residual 0.207 24 0.009
visual condition ✻ platform condition 0.518 9 0.058 11.929 < .001 0.087
Residual 0.348 72 0.005

Note. repeated measures ANOVA; Type III Sum of Squares.

Table 2. Post-hoc simple contrasts comparing visual conditions.

Comparison Estimate SE t p
LAB—EO 0.023 0.022 1.043 0.307
ABS—EO 0.105 0.022 4.779 < .001
EC—EO 0.203 0.022 9.275 < .001

Note. EO real world scene; EC eyes closed; LAB virtual laboratory scene; ABS virtual abstract scene.

Surface tilt conditions

Fig 2 shows the results of the support surface tilt conditions, with the actual platform movement for one 20-second pseudo-random sequence (Fig 2A) and periodic body sway (averaged across 108 sequence repetitions; 9 subjects × 12 sequences) in the time domain (Fig 2B). Periodic body sway reflects the general shape of the stimulus sequence and shows small differences between EO, LAB, and ABS, and a large difference between EO and EC. The power of sway responses to the tilt stimulus and sway variability during the tilt conditions are shown in Fig 3. Sway power of the periodic component was 2–4 times larger than that of the random sway component. The ANOVA showed significant differences between visual conditions (p < .001). Contrasts showed no significant difference between EO and LAB for periodic sway (p = .450) or variability (p = .258). Both parameters differed from EO for ABS and EC (p < .001).

Fig 2. Example sequences support surface tilt and periodic body sway.

Fig 2

(A) Support surface tilt sequence (repeated 13 times in each trial) and (B) periodic body com sway in response to the stimulus (average across sequence repetitions and subjects) for all visual conditions. Visual conditions are real scene eyes open (EO), virtual laboratory room (LAB), virtual abstract scene (ABS), and eyes closed (EC).

Fig 3. Body sway power during pseudo-random platform tilts.

Fig 3

(A) Periodic sway component. (B) Random sway component. Visual conditions are real scene eyes open (EO), virtual laboratory room (LAB), virtual abstract scene (ABS), and eyes closed (EC). Parameters are shown for individual subjects (grey) and as mean and standard deviation across subjects (black). Red circles indicate outliers (one subject) not included in the average. Values outside the visible range are given in brackets.

Spontaneous sway conditions

Fig 4 shows the results of the two conditions in which spontaneous sway was measured: subjects standing on a fixed surface (left) and on a sway-referenced surface (right). Sway velocity during the sway-referenced conditions was about 3 times larger than during the fixed surface conditions. The ANOVA again showed an effect of visual condition. Contrasts showed no difference between EO and LAB on the fixed surface (p = .506) or the sway-referenced surface (p = .498). There was no significant difference between EO and ABS on the fixed surface (p = .083); however, there was a difference on the sway-referenced surface (p < .01). EC differed significantly from EO (p < .001).

Fig 4. Mean body sway velocity.

Fig 4

(A) Fixed support surface. (B) Sway referenced support surface. Visual conditions are real scene eyes open (EO), virtual laboratory room (LAB), virtual abstract scene (ABS), and eyes closed (EC). Sway velocity is shown for individual subjects (grey circles) and as mean and standard deviation across subjects (black). Red circles indicate outliers (one subject) not included in the average. Values outside the visible range are given in brackets.

Questionnaire results

Table 3 shows the average results of the SSQ and the IPQ across all subjects. The SSQ showed extremely low values, indicating that subjects had no nausea problems in the virtual environment. The IPQ showed a larger general presence (G) in the LAB scene, which was reflected by slightly larger values for the three subcategories spatial presence (SP), involvement (INV), and experienced realism (REAL).

Table 3. Questionnaire results.

VR scene SSQ IPQ G IPQ SP IPQ INV IPQ REAL
LAB 7.5 ± 8.5 3.7 ± 0.9 3.5 ± 0.6 3.2 ± 0.5 2.7 ± 0.2
ABS 15.3 ± 14.8 2.6 ± 1.4 3.3 ± 0.8 3.1 ± 0.8 2.1 ± 0.3

Note. SSQ max Score: 235.62; IPQ Score: 1–5.

Discussion

The results showed very consistent behavior across all support surface conditions. Subjects showed very similar sway when viewing the virtual laboratory scene (LAB) as compared to the real laboratory scene (EO). However, sway was larger when viewing the abstract virtual scene (ABS). This increase was about 50% of the increase found for eyes closed (EC) as compared to eyes open conditions. The similarity of the EO and LAB conditions shows that human subjects use visual inputs as a sensory input to maintain balance also in a virtual environment. However, the difference in sway when viewing the abstract scene indicates that humans do not use every virtual scene in the same way. Below, we discuss two possible factors that may modulate the visual contribution to standing balance.

Presence

Improved balance in the photo-realistic laboratory scene was associated with higher presence scores in the igroup Presence Questionnaire as compared to the abstract scene. This finding is in line with research showing 1) that photo-realistic virtual environments lead to higher presence scores compared to rather abstract renderings [15] and 2) that a high presence score is a predictor of behavioral realism. Thus, subjects might have used the visual reference in the abstract scene less than in the photo-realistic laboratory scene due to different levels of presence. Such a possible relationship between presence and postural stability has been suggested by Menzies et al. [37]. The authors observed small differences when comparing spontaneous sway while viewing the same virtual scene with different technical devices. Menzies et al. [37] found reduced sway using a better technical device and attributed their findings to higher fidelity. While their findings are in line with our results and interpretation, Menzies et al. [37] used a visual scene similar to our abstract scene. Their balance measures were overall close to eyes-closed behavior and very different from eyes-open behavior, a pattern that was closely reproduced in our abstract scene results. Thus, the findings of Menzies et al. [37] might have been much clearer had a more realistic virtual environment been used. Nonetheless, the findings of Menzies et al. [37] and our findings support the idea that postural stability might be a predictive and unbiased behavioral marker for presence. However, further research needs to be conducted to better quantify this relationship.

Cognitive suppression of the visual contribution

One striking aspect of our results is that balance in the abstract virtual scene was only slightly improved as compared to eyes closed, despite a supposedly rich visual input (large-field optic flow, depth and parallax information, etc.) and despite using the same technical device. Above we argue that the difference to the virtual lab scene might be related to the subjective feeling of presence in the scene. However, it is unclear how a feeling of presence could affect the balance control mechanism. One mechanism could be a cognitive suppression of the visual input. It is well known that humans can suppress the visual input to standing balance. For example, Bronstein [38] showed that subjects exhibit a large sway response when exposed to a sudden movement of a (real-world) visual surround for the first time. However, the sway response is largely suppressed during the second and subsequent exposures. The relevance of this suppression mechanism is straightforward: subjects would fall when looking at a moving bus if the input could not be suppressed. A similar suppression could also occur when subjects do not feel present in the scene. In other words, the feeling of presence in the virtual environment could modulate the extent to which a subject ‘trusts’ and therefore uses the visual scene as a reliable visual space reference for standing balance. Such a causal relation between presence and a cognitive suppression of the visual contribution to balance is plausible, but remains speculative at this stage.

Information content of the visual input

The central nervous system extracts a variety of different visual cues from the retinal input, such as optic flow, visual vertical, various depth cues, or landmark information. The abstract scene was constructed to contain a rich visual input. However, the visual input may not have provided the same amount or quality of information in the abstract environment as in the virtual laboratory room. For example, the abstract scene did not contain a floor or walls, which may provide much stronger self-orientation information than the vertical bars in the periphery.

Limitations

In VR, balance control mechanisms might be challenged by technical artifacts such as a limited field of view or delays. A lack of balance control in VR due to these factors might also lead to discomfort, cybersickness, falls, or injury. The restricted field of view of the HMD may have led to an increase in sway responses [39]. Close inspection showed that sway in the laboratory scene (LAB) was slightly larger than sway with eyes open in the real world (EO). Thus, the limited field of view of the HMD may have reduced the visual information content to some extent. However, overall the effect was small and, as discussed above, could also have other causes.

Overall, the study showed very clear results despite the small number of subjects. Nonetheless, investigating the above-discussed relation between presence and the visual contribution to standing balance requires much larger statistical power and needs to resolve the problem of providing the same visual information content in low- and high-presence conditions.

In conclusion, we found that human subjects use visual self-motion cues for standing balance in a virtual environment. However, the visual contribution is reduced in an abstract scene, for which we identified a reduced presence in the virtual environment or a poorer quality of the visual self-motion cues as potential reasons. Our findings indicate that virtual reality is a useful tool for balance research. In addition, balance behavior may also be a good tool to quantify the presence of a subject in a virtual environment.

Supporting information

S1 Data

(ZIP)

Acknowledgments

We would like to thank Amine El-Kaissi, Sandra Wacker, Sharan Gopalan, and Nikolai Killer for their help creating the scenes and during data collection.

Data Availability

All relevant data are within the paper and its Supporting Information files.

Funding Statement

LA, SS Zukunftskolleg "Cooperative Initiatives" Universität Konstanz; https://www.uni-konstanz.de/zukunftskolleg/.

References

  • 1.Peterka RJ. Sensorimotor integration in human postural control. J Neurophysiol. 2002;88: 1097–118. Available: http://www.ncbi.nlm.nih.gov/pubmed/13679407 10.1152/jn.2002.88.3.1097 [DOI] [PubMed] [Google Scholar]
  • 2.Sturnieks DL, St George R, Lord SR. Balance disorders in the elderly. Neurophysiol Clin. 2008;38: 467–78. 10.1016/j.neucli.2008.09.001 [DOI] [PubMed] [Google Scholar]
  • 3.Sterling DA, O’Connor JA, Bonadies J. Geriatric falls: injury severity is high and disproportionate to mechanism. J Trauma. 2001;50: 116–9. 10.1097/00005373-200101000-00021 [DOI] [PubMed] [Google Scholar]
  • 4.WHO. WHO Global Report on Falls Prevention in Older Age. Community Health (Bristol). 2007; 53. ISBN 978 92 4 156353 6 [Google Scholar]
  • 5.Nardone A, Schieppati M. The role of instrumental assessment of balance in clinical decision making. Eur J Phys Rehabil Med. 2010;46: 221–237. [PubMed] [Google Scholar]
  • 6.Horak F, Mancini M. The Relevance of Clinical Balance Assessment Tools to Differentiate Balance Deficits. Eur J Phys Rehabil Med. 2010;46: 239 [PMC free article] [PubMed] [Google Scholar]
  • 7.Carter ND, Kannus P, Khan KM. Exercise in the prevention of falls in older people: a systematic literature review examining the rationale and the evidence. Sports Med. 2001;31: 427–38. 10.2165/00007256-200131060-00003 [DOI] [PubMed] [Google Scholar]
  • 8.Kannus P, Sievänen H, Palvanen M, Järvinen T, Parkkari J. Prevention of falls and consequent injuries in elderly people. Lancet (London, England). 2005;366: 1885–93. 10.1016/S0140-6736(05)67604-0 [DOI] [PubMed] [Google Scholar]
  • 9.Sherman WR, Craig AB. Understanding virtual reality: Interface, application, and design. Morgan Kaufmann; 2018. [Google Scholar]
  • 10.Pine DS, Grun J, Maguire EA, Burgess N, Zarahn E, Koda V, et al. Neurodevelopmental aspects of spatial navigation: a virtual reality fMRI study. Neuroimage. 2002;15: 396–406. 10.1006/nimg.2001.0988 [DOI] [PubMed] [Google Scholar]
  • 11.Patton J, Dawe G, Scharver C, Mussa-Ivaldi F, Kenyon R. Robotics and virtual reality: a perfect marriage for motor control research and rehabilitation. Assist Technol. 2006;18: 181–95. 10.1080/10400435.2006.10131917 [DOI] [PubMed] [Google Scholar]
  • 12.Streuber S, Mohler BJ, Bülthoff HH, De La Rosa S. The influence of visual information on the motor control of table tennis strokes. Presence. 2012;21: 281–294. [Google Scholar]
  • 13.Pan X, Hamilton AF de C. Why and how to use virtual reality to study human social interaction: The challenges of exploring a new research landscape. Br J Psychol. 2018;109: 395–417. 10.1111/bjop.12290 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Slater M, Wilbur S. A Framework for Immersive Virtual Environments (FIVE): Speculations on the Role of Presence in Virtual Environments. Presence Teleoperators Virtual Environ. 1997;6: 603–616. 10.1162/pres.1997.6.6.603 [DOI] [Google Scholar]
  • 15.Balakrishnan B, Oprean D, Martin B, Smith M. Virtual reality: Factors determining spatial presence, comprehension, and memory. Proceedings of the 12th International Conference on Construction Applications of Virtual Reality. 2012. pp. 451–459.
  • 16.Kisker J, Gruber T, Schöne B. Behavioral realism and lifelike psychophysiological responses in virtual reality by the example of a height exposure. Psychol Res. 2019; 10.1007/s00426-019-01244-9 [DOI] [PubMed] [Google Scholar]
  • 17.Logan D, Kiemel T, Jeka JJ. Asymmetric sensory reweighting in human upright stance. PLoS One. 2014;9: 1–10. 10.1371/journal.pone.0100418 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Oie KS, Kiemel T, Jeka JJ. Multisensory fusion: simultaneous re-weighting of vision and touch for the control of human posture. Brain Res Cogn Brain Res. 2002;14: 164–76. Available: http://www.ncbi.nlm.nih.gov/pubmed/12063140 10.1016/s0926-6410(02)00071-x [DOI] [PubMed] [Google Scholar]
  • 19.van Asten WN, Gielen CC, Denier van der Gon JJ. Postural adjustments induced by simulated motion of differently structured environments. Exp Brain Res. 1988;73: 371–83. Available: http://www.ncbi.nlm.nih.gov/pubmed/3215313 10.1007/BF00248230 [DOI] [PubMed] [Google Scholar]
  • 20.Warren WH, Kay BA, Yilmaz EH. Visual Control of Posture during Walking: Functional Specificity. J Exp Psychol Hum Percept Perform. 1996;22: 818–838. 10.1037//0096-1523.22.4.818 [DOI] [PubMed] [Google Scholar]
  • 21.Dijkstra TM, Schöner G, Giese MA, Gielen CC. Frequency dependence of the action-perception cycle for postural control in a moving visual environment: relative phase dynamics. Biol Cybern. 1994;71: 489–501. Available: http://www.ncbi.nlm.nih.gov/pubmed/7999875 10.1007/BF00198467 [DOI] [PubMed] [Google Scholar]
  • 22.Dokka K, Kenyon R V, Keshner E a, Kording KP. Self versus environment motion in postural control. PLoS Comput Biol. 2010;6: e1000680 10.1371/journal.pcbi.1000680 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Mergner T, Schweigart G, Maurer C, Blümle A. Human postural responses to motion of real and virtual visual environments under different support base conditions. Exp Brain Res. 2005;167: 535–556. 10.1007/s00221-005-0065-3 [DOI] [PubMed] [Google Scholar]
  • 24.Kelly JW, Riecke B, Loomis JM, Beall AC. Visual control of posture in real and virtual environments. Percept Psychophys. 2008;70: 158–65. 10.3758/pp.70.1.158 [DOI] [PubMed] [Google Scholar]
  • 25.Cooper N, Cant I, White MD, Meyer GF. Perceptual assessment of environmental stability modulates postural sway. PLoS One. 2018;13: e0206218 10.1371/journal.pone.0206218 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Horlings CGC, Carpenter MG, Küng UM, Honegger F, Wiederhold B, Allum JHJ. Influence of virtual reality on postural stability during movements of quiet stance. Neurosci Lett. 2009;451: 227–231. 10.1016/j.neulet.2008.12.057 [DOI] [PubMed] [Google Scholar]
  • 27.Robert MT, Ballaz L, Lemay M. The effect of viewing a virtual environment through a head-mounted display on balance. Gait Posture. Elsevier B.V.; 2016;48: 261–266. 10.1016/j.gaitpost.2016.06.010 [DOI] [PubMed] [Google Scholar]
  • 28.Berg KO, Wood-Dauphinee SL, Williams JI, Maki B. Measuring balance in the elderly: validation of an instrument. Can J public Heal = Rev Can santé publique. 1992;83 Suppl 2: S7–11. Available: http://www.ncbi.nlm.nih.gov/pubmed/1468055 [PubMed] [Google Scholar]
  • 29.van der Kooij H, van Asseldonk E, van der Helm FCT. Comparison of different methods to identify and quantify balance control. J Neurosci Methods. 2005;145: 175–203. 10.1016/j.jneumeth.2005.01.003 [DOI] [PubMed] [Google Scholar]
  • 30.Schubert T, Friedmann F, Regenbrecht H. The Experience of Presence: Factor Analytic Insights. Presence Teleoperators Virtual Environ. 2001;10: 266–281. 10.1162/105474601300343603 [DOI] [Google Scholar]
  • 31.Kennedy RS, Lane NE, Berbaum KS, Lilienthal MG. Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int J Aviat Psychol. 1993;3: 203–220. 10.1207/s15327108ijap0303_3 [DOI] [Google Scholar]
  • 32.Nashner L, Berthoz A. Visual contribution to rapid motor responses during postural control. Brain Res. 1978;150: 403–7. Available: http://www.ncbi.nlm.nih.gov/pubmed/678978 10.1016/0006-8993(78)90291-3 [DOI] [PubMed] [Google Scholar]
  • 33.Davies W. System Identification for Self-Adaptive Control. London: Wiley-Interscience; 1970. [Google Scholar]
  • 34.Brenière Y. Why we walk the way we do. J Mot Behav. 1996;28: 291–8. Available: http://www.ncbi.nlm.nih.gov/pubmed/14769551 10.1080/00222895.1996.10544598 [DOI] [PubMed] [Google Scholar]
  • 35.Winter DA. Biomechanics and Motor Control of Human Movement [Internet]. Hoboken, NJ, USA: John Wiley & Sons, Inc.; 2009. 10.1002/9780470549148 [DOI] [Google Scholar]
  • 36.JASP Team. JASP (Version 0.11.1)[Computer software] [Internet]. 2019. Available: https://jasp-stats.org/
  • 37.Menzies RJ, Rogers SJ, Phillips AM, Chiarovano E, de Waele C, Verstraten FAJ, et al. An objective measure for the visual fidelity of virtual reality and the risks of falls in a virtual environment. Virtual Real. 2016;20: 173–181. 10.1007/s10055-016-0288-6 [DOI] [Google Scholar]
  • 38.Bronstein AM. Suppression of visually evoked postural responses. Exp Brain Res. 1986;63: 655–658. Available: http://www.ncbi.nlm.nih.gov/pubmed/3489640 10.1007/BF00237488 [DOI] [PubMed] [Google Scholar]
  • 39.Paulus WM, Straube A, Brandt T. Visual stabilization of posture. Physiological stimulus characteristics and clinical aspects. Brain. 1984;107 (Pt 4: 1143–63. Available: http://www.ncbi.nlm.nih.gov/pubmed/6509312 10.1093/brain/107.4.1143 [DOI] [PubMed] [Google Scholar]

Decision Letter 0

Eric R Anson

11 Sep 2020

PONE-D-20-23613

Virtual reality as a tool for balance research: eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

PLOS ONE

Dear Dr. Assländer,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Oct 26 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Eric R. Anson

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please ensure you have thoroughly discussed all potential limitations of this study within the Discussion section, including the small sample size.

3.Thank you for including your ethics statement:  "The  study  was  conducted  in  agreement with  the  declaration  of  Helsinki  in  its  latest  revision  and  in  agreement  with  the  University  of Konstanz ethics regulations".   

Please amend your current ethics statement to confirm that your named institutional review board or ethics committee specifically approved this study.

Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”).

For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: General comments

This manuscript is well written and concerns a highly interesting topic, since VR is use increasingly in rehabilitation. The manuscript is well written. Please find my specific comments below.

Title

Explicit

Abstract

Clear

Introduction

Appropriate.

Aim

Long, but clear

Method

How were the subjects recruited? Were they healthy subjects, students, staff? How many women and men? Differences in response to immersive virtual reality environment has been found between men and women in other studies.

Is there any approval from an ethical review board?

Results

Appropriate

Discussion

Line 341 – 348, “Cognitive suppression of the visual contribution”. The paragraph is interesting but needs to be in relation to the findings in this study.

Only limitations of balance in VR are discussed. Please add a paragraph with strengths and limitations of the study.

Conclusion

Well written

Tables and figures

Table 1 shows within subjects differences in the different conditions, please clarify this in the heading. Also include the statistical analysis used when calculating the differences, preferably in a footnote.

Table 2 also needs a clarifying heading. Abbreviations should be explained in a footnote.

References

Appropriate

Reviewer #2: Virtual reality as a tool for balance research: eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

The study compared the amount of postural control (body sway) between reconstructed VR environment (LAB scene) and the real world, together with comparison against other visual conditions. Authors found that postural sway was in the real world was synonymous to the LAB scene but not with eyes closed or abstract virtual scenes. They concluded that VR is a valid tool for conducting balance research. The study was well performed, well written to clearly communicate the results and has added additional evidence for the usage of VR in balance research. There are, however, a few grammatical considerations and comments needed.

Introduction

Line 39: The “Who” reference should read “WHO”

Line 146: Change “was” to “of”.

Line 146: Rephrase to clarify meaning. Check word(s) omitted in sentence beginning with “During the experiment…”, specifically, “…were while…” and “…instrumentation to measure…”

Line 172: Check word(s) omitted in “from were”.

Discussion

Lines 330 – 339: Conducting further research to unravel why inconsistent results were found in the Menzies’ study is in order. However, the present authors could compare the nature of the visual scenes in the both studies to provide some possible reasons. For instance, the LAB scene may have resulted to a lesser postural instability compare to the ABS scene because it was a reconstruction of the real world. It is clear that this LAB scene is different from the visual disturbances in the Menzies’ study.

Line 344: “(Bronstein, 1986)” should be written as “Bronstein (1986)”.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Kwadwo Osei Appiah-Kubi

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Oct 29;15(10):e0241479. doi: 10.1371/journal.pone.0241479.r002

Author response to Decision Letter 0


14 Sep 2020

We would like to thank both reviewers for the positive feedback and their helpful remarks. Below is a point-by-point response to the comments.

Reviewer #1: General comments

This manuscript is well written and concerns a highly interesting topic, since VR is use increasingly in rehabilitation. The manuscript is well written. Please find my specific comments below.

Title

Explicit

Abstract

Clear

Introduction

Appropriate.

Aim

Long, but clear

Method

How were the subjects recruited? Were they healthy subjects, students, staff? How many women and men? Differences in response to immersive virtual reality environment has been found between men and women in other studies.

Is there any approval from an ethical review board?

We specified the open questions in the manuscript.

We rephrased the ethics statement, now explicitly refering to the approval.

New paragraph: „Ten young and healthy subjects (age 25.4 ± 4.7 years; height 171 ± 11 cm; weight 65.5 ± 12.0 kg; 5 female, 5 male) participated in the study after giving written informed consent. Exclusion criteria were orthopedic problems, balance related problems, concussions, and a history of epilepsy. Subject were recruited via an online platform where mostly University students are registered. The study was conducted in agreement with the declaration of Helsinki in its latest revision and in agreement with was approved by the University of Konstanz ethics board (IRB20KN07-004).“

Results

Appropriate

Discussion

Line 341 – 348, “Cognitive suppression of the visual contribution”. The paragraph is interesting but needs to be in relation to the findings in this study.

Thank you for pointing out that we did not describe our point properly. We clarified our statement and put it into context.

Only limitations of balance in VR are discussed. Please add a paragraph with strengths and limitations of the study.

We generalized the paragraph and added several aspects, such as the small number of subjects (limitation) and the strong effects we found in our results (strength).

Conclusion

Well written

Tables and figures

Table 1 shows within subjects differences in the different conditions, please clarify this in the heading. Also include the statistical analysis used when calculating the differences, preferably in a footnote.

Table 2 also needs a clarifying heading. Abbreviations should be explained in a footnote.

Amended as suggested.

References

Appropriate

Reviewer #2: Virtual reality as a tool for balance research: eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

The study compared the amount of postural control (body sway) between reconstructed VR environment (LAB scene) and the real world, together with comparison against other visual conditions. Authors found that postural sway was in the real world was synonymous to the LAB scene but not with eyes closed or abstract virtual scenes. They concluded that VR is a valid tool for conducting balance research. The study was well performed, well written to clearly communicate the results and has added additional evidence for the usage of VR in balance research. There are, however, a few grammatical considerations and comments needed.

Introduction

Line 39: The “Who” reference should read “WHO”

Corrected. Thank you!

Line 146: Change “was” to “of”.

Corrected. Thank you!

Line 146: Rephrase to clarify meaning. Check word(s) omitted in sentence beginning with “During the experiment…”, specifically, “…were while…” and “…instrumentation to measure…”

We cleaned up the sentence.

Line 172: Check word(s) omitted in “from were”.

Added. Thank you!

Discussion

Lines 330 – 339: Conducting further research to unravel why inconsistent results were found in the Menzies’ study is in order. However, the present authors could compare the nature of the visual scenes in the both studies to provide some possible reasons. For instance, the LAB scene may have resulted to a lesser postural instability compare to the ABS scene because it was a reconstruction of the real world. It is clear that this LAB scene is different from the visual disturbances in the Menzies’ study.

We fully agree with this comment and rewrote the paragraph to elaborate on the similarity of our abstract scene and the abstract scene used by Menzies et al. and the difference to our LAB scene. We now also point out that the Menzies findings may have been much more clear when using a more realistic scene.

Line 344: “(Bronstein, 1986)” should be written as “Bronstein (1986)”.

Corrected. Thank you!

Decision Letter 1

Eric R Anson

21 Sep 2020

PONE-D-20-23613R1

Virtual reality as a tool for balance research: eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

PLOS ONE

Dear Dr. Assländer,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Overall the reviewer responses to the changes made were very positive.  Although much improved, the paragraph "Cognitive suppression of the visual contribution" in the discussion which highlights important elements comes across as too general and disconnected from the results of the study.  In your revision, please more explicitly connect this paragraph to the study results while also maintaining the broader context that is currently presented.

Please submit your revised manuscript by Nov 05 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Eric R. Anson

Academic Editor

PLOS ONE

Journal Requirements:

Additional Editor Comments (if provided):


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for your revised manuscript, it is much improved. The paragraph in the discussion called “Cognitive suppression of the visual contribution” is, as said before, very interesting but needs to be in relation to the findings of your study. As it is written now, it could preferably be in the introduction. Please revise the paragraph once again to better fit in the discussion.

Reviewer #2: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.


Author response to Decision Letter 1


13 Oct 2020

Reviewer #1: Thank you for your revised manuscript, it is much improved. The paragraph in the discussion called “Cognitive suppression of the visual contribution” is, as said before, very interesting but needs to be in relation to the findings of your study. As it is written now, it could preferably be in the introduction. Please revise the paragraph once again to better fit in the discussion.

→ We are sorry for this unnecessary iteration. We were probably mentally stuck in follow-up experiments and hope we have now addressed your justified criticism better. For your convenience, we have copied the revised paragraph below, which is the only changed section in the manuscript.

Cognitive suppression of the visual contribution

One striking aspect of our results is that balance in the abstract virtual scene was only slightly improved compared to eyes closed, despite a supposedly rich visual input (large-field optic flow, depth and parallax information, etc.) and despite using the same technical device. Above we argue that the difference from the virtual lab scene might be related to the subjective feeling of presence in the scene. However, it is unclear how a feeling of presence could affect the balance control mechanism. One possible mechanism is a cognitive suppression of the visual input. It is well known that humans can suppress the visual input to standing balance. For example, Bronstein (1986) showed that subjects exhibit a large sway response when exposed to a sudden movement of a (real world) visual surround for the first time. However, the sway response is largely suppressed during second and subsequent exposures. The relevance of this suppression mechanism is straightforward: subjects would fall when looking at a moving bus if the input could not be suppressed. A similar suppression could also occur when subjects do not feel present in the scene. In other words, the feeling of presence in the virtual environment could modulate the extent to which a subject ‘trusts’ and therefore uses the visual scene as a reliable spatial reference for standing balance. Such a causal relation between presence and a cognitive suppression of the visual contribution to balance is plausible, but remains speculative at this stage.

Decision Letter 2

Eric R Anson

16 Oct 2020

Virtual reality as a tool for balance research: eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

PONE-D-20-23613R2

Dear Dr. Assländer,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Eric R. Anson

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for your revision, all my concerns are met, and the paragraph in the discussion is perfect!

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Eva Ekvall Hansson

Acceptance letter

Eric R Anson

20 Oct 2020

PONE-D-20-23613R2

Virtual reality as a tool for balance research: eyes open body sway is reproduced in photo-realistic, but not in abstract virtual scenes

Dear Dr. Assländer:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Eric R. Anson

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Data

    (ZIP)

    Data Availability Statement

    All relevant data are within the paper and its Supporting Information files.

