Abstract
This data article provides raw head and eye movement data collected with Polhemus (Polhemus Inc.) and Smart Eye (Smart Eye AB) equipment. Subjects holding a driver's license participated in the experiment, which was conducted in a driving simulator whose vehicle motion was controlled by CarSim (Mechanical Simulation Co., Ann Arbor, MI). The dataset contains not only eye and head movement but also eye gaze, pupil diameter, saccades, and related signals. It can be used for parameter identification of the vestibulo-ocular reflex (VOR) model, for simulating eye movement, and for other analyses related to eye movement.
Keywords: Driver distraction, Vestibulo–ocular reflex, Optokinetic reflex
Specifications Table
Subject area | Psychology, transportation
---|---
More specific subject area | Driver distraction evaluation
Type of data | Table, log file, CSV file, video file
How data was acquired | Real-time head and eye movement recording
Data format | Log file, CSV file
Experimental factors | Eye movement simulation based on head movement measurement
Experimental features | Each participant drove the designed course three times: without visual stimulus, with visual stimulus, and with visual stimulus and mental workload (details in the experimental design section).
Data source location | Institute of Innovation for Future Society, Nagoya University, Furo-cho, Chikusa-ku, Nagoya, 464-8601, Japan
Data accessibility | Data available within this article
Value of the data
- Parameter identification for the vestibulo-ocular reflex (VOR) model.
- Parameter identification for the optokinetic reflex (OKR) model.
- The data make it possible to analyze the effect of visual information on eye movement.
- The data also provide information on eye movement while driving under mental workload.
1. Data
+ Smart Eye data: eye-tracking data recorded with the Smart Eye equipment [1] (log file).
The Smart Eye data were collected at a 120 Hz sampling rate and include head tracking, eye position, eye gaze, pupil diameter, saccades, fixations, and more.
+ Polhemus data: head movement measurements.
The Polhemus equipment recorded head movement data at a 60 Hz sampling rate. These data contain head position and velocity, which can be used as inputs to the VOR or OKR model.
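For readers who want to work with both streams together, the following is a minimal Python sketch of one way to load and time-align the two recordings. The file names and column names are placeholders (the actual log/CSV headers may differ), and the alignment simply matches the 120 Hz Smart Eye samples to the 60 Hz Polhemus samples by nearest time.

```python
import pandas as pd

# Minimal sketch (not the authors' code). File names and column names are
# assumptions for illustration; the actual log/CSV headers may differ.
smart_eye = pd.read_csv("smart_eye.log", sep="\t")   # 120 Hz eye/head tracking log
polhemus = pd.read_csv("polhemus_head.csv")          # 60 Hz head position/velocity

# Build time axes from the nominal sampling rates.
smart_eye["t"] = smart_eye.index / 120.0
polhemus["t"] = polhemus.index / 60.0

# Align the two streams on a common 60 Hz time base by nearest-time matching,
# so head velocity (Polhemus) and gaze/pupil signals (Smart Eye) can be compared.
merged = pd.merge_asof(
    polhemus.sort_values("t"),
    smart_eye.sort_values("t"),
    on="t",
    direction="nearest",
    suffixes=("_polhemus", "_smarteye"),
)
print(merged.head())
```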
2. Experimental design, materials and methods
In the experiment, each subject was asked to drive around a simulated course while seated in a driving simulator with six degrees of freedom. The simulator was controlled by CarSim, which simulates the dynamic behavior of a vehicle (Fig. 1). In these experiments, the seat was moved at a fixed frequency in the vertical and horizontal planes, using MATLAB Simulink (MathWorks, Natick, MA) to control CarSim.
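As an illustration only, the sketch below generates the kind of fixed-frequency vertical and horizontal seat displacement profile described above; the frequency, amplitude, and duration values are placeholders, not the settings that were commanded through MATLAB Simulink and CarSim in the experiment.

```python
import numpy as np

# Illustrative fixed-frequency seat motion profile (placeholder values only;
# the actual motion was commanded through MATLAB Simulink and CarSim).
fs = 60.0                   # sample rate of the generated profile [Hz]
duration = 10.0             # length of the profile [s]
t = np.arange(0.0, duration, 1.0 / fs)

freq = 0.5                  # motion frequency [Hz] (placeholder)
amp_vertical = 0.02         # vertical displacement amplitude [m] (placeholder)
amp_horizontal = 0.02       # horizontal displacement amplitude [m] (placeholder)

seat_vertical = amp_vertical * np.sin(2.0 * np.pi * freq * t)
seat_horizontal = amp_horizontal * np.sin(2.0 * np.pi * freq * t)
print(seat_vertical[:5], seat_horizontal[:5])
```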
Subjects holding a driver's license participated in the experiment. Each participant drove the course three times: without visual stimulus, with visual stimulus, and with visual stimulus and mental workload.
- Visual stimulus (VS): simulated trees were positioned alongside the test track in the driving simulator to help induce large amounts of optical flow.
- Driving without VS: the subject was asked to drive around a designed course without any simulated objects alongside the road.
- Driving with VS: the subject was asked to drive around the same course with simulated trees alongside the road.
- Driving with VS and the n-back task: the subject was asked to drive around the same course with simulated trees alongside the road while performing a one-back task, responding within two seconds by pressing the appropriate buttons on the steering wheel (see the sketch after this list).
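For clarity, the following sketch illustrates the one-back (n = 1) matching rule with the two-second response window; the stimulus representation and response handling are simplified assumptions, not the task implementation used in the simulator.

```python
# Minimal sketch of the one-back comparison logic used as the mental-workload
# task: the driver judges whether the current stimulus matches the previous one
# and must answer within 2 seconds. Stimuli and response handling are simplified
# placeholders, not the actual task implementation.
from typing import Optional

RESPONSE_WINDOW_S = 2.0  # maximum time allowed for a response

def one_back_correct(previous: Optional[str], current: str,
                     answered_match: bool, response_time_s: float) -> bool:
    """Return True if the driver's answer is correct and given within the time limit."""
    if response_time_s > RESPONSE_WINDOW_S:
        return False                  # too slow: counted as incorrect
    if previous is None:
        return not answered_match     # the first stimulus has nothing to match
    return answered_match == (current == previous)

# Example stimulus sequence "A", "A", "B":
print(one_back_correct(None, "A", answered_match=False, response_time_s=1.0))  # True
print(one_back_correct("A", "A", answered_match=True, response_time_s=1.5))    # True
print(one_back_correct("A", "B", answered_match=True, response_time_s=0.8))    # False
```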
These data can be used to simulate eye movement from head movement, as in [2], [3], the work of the Obinata group [4], [5], [6], [7], [8], [9], and Anh Son et al. [10], [11], [12], [13], [14], [15]. In addition, the data can be used to examine the effect of a visual stimulus or mental workload on driver performance as well as on eye movement.
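As one simple illustration of how the head and eye signals could feed a reflex eye movement model, the sketch below fits a basic VOR-like relationship (eye velocity approximately equal to minus a gain times latency-shifted head velocity) by nonlinear least squares. It is not the model used in [2]-[15]; it assumes the merged data frame from the earlier loading sketch and hypothetical column names for head and eye velocity.

```python
import numpy as np
from scipy.optimize import least_squares

# Minimal sketch (not the models of refs. [2]-[15]): fit a simple VOR-like
# relationship, eye_vel(t) ≈ -gain * head_vel(t - delay), by nonlinear least
# squares. "merged" is the time-aligned data frame from the earlier loading
# sketch; the velocity column names below are assumptions.
t = merged["t"].to_numpy()
head_vel = merged["head_velocity"].to_numpy()   # assumed Polhemus column name
eye_vel = merged["eye_velocity"].to_numpy()     # assumed Smart Eye column name

def residuals(params, t, head_vel, eye_vel):
    gain, delay = params
    head_shifted = np.interp(t - delay, t, head_vel)  # latency-shifted head velocity
    return eye_vel - (-gain * head_shifted)           # model error to be minimized

fit = least_squares(residuals, x0=[1.0, 0.01], args=(t, head_vel, eye_vel))
gain_hat, delay_hat = fit.x
print(f"estimated VOR gain ≈ {gain_hat:.3f}, latency ≈ {delay_hat * 1e3:.1f} ms")
```

An ideal VOR has a gain close to 1 and a latency on the order of 10-15 ms, so estimates far from those values may indicate contributions from smooth pursuit or the OKR, measurement noise, or mismatched column choices.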
Acknowledgments
This research was supported in part by Toyota Motor Corporation. We are particularly grateful to Goro Obinata (Chubu University), Hiroto Hamada (Toyota Motor Corporation), Kentaro Omura (Nagoya University), and Makoto Inagami (Nagoya University) for providing secondary data and comments.
Footnotes
Supplementary data associated with this article can be found in the online version at 10.1016/j.dib.2018.03.097.
References
1. S.E. AB, User Manual – Smart Eye Pro 5.9, 2006. 〈http://cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Eye-Tracker-Manual_5.0.pdf〉.
2. Angelaki D.E., Wei M., Merfeld D.M. Vestibular discrimination of gravity and translational acceleration. Ann. N. Y. Acad. Sci. 2001;942:114–127. doi: 10.1111/j.1749-6632.2001.tb03739.x.
3. Zupan L.H., Peterka R.J., Merfeld D.M. Neural processing of gravito-inertial cues in humans. I. Influence of the semicircular canals following post-rotatory tilt. J. Physiol. 2003. doi: 10.1152/jn.2000.84.4.2001.
4. N. Shibata, G. Obinata, H. Kodera, H. Hamada, Evaluating the influence of distractions to drivers based on eye movement model, in: Proceedings of the FISITA World Automotive Congress 2006, 2006. 〈http://ci.nii.ac.jp/naid/10025314235/en/〉.
5. Obinata G., Usui T., Shibata N. On-line method for evaluating driver distraction of memory-decision workload based on dynamics of vestibulo-ocular reflex. Rev. Automot. Eng. 2008;29:627–632.
6. Obinata G., Tokuda S., Fukuda K., Hamada H. Quantitative evaluation of mental workload by using model of involuntary eye movement. In: Harris D., editor. Eng. Psychol. Cogn. Ergon. SE-24. Springer Berlin Heidelberg; 2009. pp. 223–232.
7. Obinata G., Fukuda K., Moriyama A., Tokuda S., Kim Y.W. Evaluating the influence of distractions to drivers based on reflex eye movement model. IFAC. 2010.
8. H. Aoki, L. Anh Son, H. Hamada, T. Suzuki, G. Obinata, Quantitative evaluation of mental workload with voluntary eye movements by means of the vestibulo-ocular reflex, in: Proceedings of the 2015 Japan Soc. Automot. Eng. Annual Congress, 2015, pp. 780–785.
9. L. Anh Son, M. Inagami, T. Suzuki, H. Aoki, G. Obinata, Driver's internal state estimation based on the eye movement, in: Proceedings of the 2016 International Conference on Universal Village, 3, 2016.
10. L. Anh Son, H. Aoki, H. Hamada, T. Suzuki, Parameters optimization using genetic algorithm technique for vestibulo-ocular reflex model, in: Futur. Act. Saf. Technol. Towar. Zero Traffic Accid., 2015.
11. L. Anh Son, H. Hamada, M. Inagami, T. Suzuki, H. Aoki, Effect of mental workload and aging on driver distraction based on the involuntary eye movement, in: A.N. Stanton, S. Landry, G. Di Bucchianico, A. Vallicelli (Eds.), Adv. Hum. Asp. Transp.: Proceedings of the AHFE 2016 International Conference on Human Factors in Transportation, July 27–31, 2016, Walt Disney World®, Florida, USA, Springer International Publishing, Cham, 2016, pp. 349–359. doi: 10.1007/978-3-319-41682-3_30.
12. Anh Son L., Makoto I., Hiroto H., Tatsuya S., Hirofumi A. The effect of visual stimulus on voluntary eye movement based on a VOR/OKR model. Int. J. Automot. Eng. 2017;8:37–44.
13. Son L.A., Suzuki T., Aoki H. Evaluation of cognitive distraction in a real vehicle based on the reflex eye movement. Int. J. Automot. Eng. 2018;9:1–8.
14. Anh Son L., Aoki H., Suzuki T. Evaluation of driver distraction with changes in gaze direction based on a vestibulo-ocular reflex model. J. Transp. Technol. 2017;7:336–350.
15. L. Anh Son, H. Hiroto, S. Tatsuya, A. Hirofumi, Driver distraction evaluation using reflex eye movement simulation, in: Futur. Act. Saf. Technol. Towar. Zero Traffic Accid., 2017.