PLOS One. 2020 Oct 6;15(10):e0239984. doi: 10.1371/journal.pone.0239984

Dynamics in typewriting performance reflect mental fatigue during real-life office work

Marlon de Jong 1,¤, Anne M Bonvanie 2, Jacob Jolij 1,3,¤, Monicque M Lorist 1,4,¤,*
Editor: Michael B Steinborn
PMCID: PMC7537853  PMID: 33022017

Abstract

Mental fatigue has repeatedly been associated with decline in task performance in controlled situations, such as the lab, and in less controlled settings, such as the working environment. Given that a large number of factors can influence the course of mental fatigue, it is challenging to objectively and unobtrusively monitor mental fatigue on the work floor. We aimed to provide a proof of principle of a method to monitor mental fatigue in an uncontrolled office environment, and to study how typewriting dynamics change over different time-scales (i.e., time-on-task, time-of-day, day-of-week). To investigate this, typewriting performance of university employees was recorded for 6 consecutive weeks, allowing us not only to examine performance speed, but also providing a natural setting to study error correction. We show that markers derived from typewriting are susceptible to changes in behavior related to mental fatigue. In the morning, workers first maintained typing speed during prolonged task performance, which resulted in an increased number of typing errors they had to correct. During the day, they seemed to readjust this strategy, reflected in a decline in both typing speed and accuracy. Additionally, we found that on Mondays and Fridays, workers adopted a strategy that favored typing speed, while on the other days of the week typing accuracy was higher. Although workers are allowed to take breaks, mental fatigue builds up during the day. Day-to-day patterns show no increase in mental fatigue over days, indicating that office workers are able to recover from work-related demands after a working day.

Introduction

In order to interact with the dynamically changing world around us, we continuously adapt our behavior. During prolonged task performance, however, this adaptation is often insufficient to counter the increasing demands placed on our information processing system. This is typically reflected in a decline in task performance over time: people perform more slowly, make more mistakes, and are less able to correct these mistakes [1, 2]. This decline in performance, commonly known as mental fatigue, occurs in many settings, and might have implications not only for productivity, but also for the safety of employees and their environment [3, 4]. More specifically, long working hours and work-related fatigue have been shown to be predictors of health complaints and sickness absence [5, 6]. Additionally, there is an increased risk of fatigue-related accidents when employees engage in traffic after a long day of work, thereby endangering not only themselves but also others [7]. Considering the impact of mental fatigue in the working environment, it is important to conduct research on how to detect, deal with, or even prevent its effects.

Ever since the beginning of the 19th century, researchers have studied mental fatigue and its effects on performance in controlled experimental settings and in real-life situations, such as the workplace [8–11]. Several theories regarding the cognitive mechanisms behind its manifestation have been proposed, eventually leading to the widely accepted theory that mental fatigue develops as a result of a cost-benefit evaluation of effort [12, 13]. According to this theory, if the costs of performing a task exceed the benefits of finishing it, people come to experience subjective feelings of mental fatigue (e.g., aversion to task performance, low vigilance) and performance deteriorates (i.e., people become slower and less accurate).

Kraepelin was the first to attempt to quantify the course of mental fatigue during task performance. It soon became clear, however, that there is no such thing as a typical decline in performance over time. Mental fatigue and its effects on behavior depend on several personal and environmental factors. For example, people are able to overcome the effects of mental fatigue if they are sufficiently motivated, for instance when they receive a monetary reward based on their performance [14] or when they drink a cup of (caffeinated) coffee [15]. Given that it is hard to define a specific course of mental fatigue over time, and that employees themselves are poor at detecting when they are no longer capable of performing a task at an adequate level [16], it is challenging to effectively monitor and prevent mental fatigue in the working environment. In order to detect this decline in performance, it is necessary to continuously monitor behavior dynamics without interfering with work.

Developments in information technology, however, have made it possible to monitor behavior in novel ways, without interfering with regular work activities. For example, Pimenta and colleagues [17] developed a method for non-invasive measurement of mental fatigue by monitoring a very common behavior of office workers: typewriting. They found that several markers of typing performance were susceptible to the effects of time-of-day. To validate whether changes in these markers were due to mental fatigue specifically, de Jong and colleagues [18] conducted an experiment in which brain activity was recorded using electroencephalography (EEG) during a 2-hour typewriting task. They were specifically interested in the P3 brain potential, whose amplitude is known to decrease with increasing mental fatigue [19–21]. The study showed that both typing speed, reflected in the time between two subsequent keypresses (interkey interval), and typing accuracy, reflected in overall backspace use and incorrectly typed words, declined with prolonged task performance. Moreover, these deteriorations in typewriting performance with time-on-task correlated with neural markers signaling mental fatigue, indicating that monitoring typewriting markers can provide information about the level of mental fatigue, at least in a controlled setting.

Although changes in typewriting have been found to reflect mental fatigue under these standardized conditions, many other variables could influence behavior dynamics under less controlled conditions. For instance, at the workplace, where deteriorations in task performance are particularly problematic, the effects of time-of-day and day-of-week have been found to influence performance as well. While time-on-task effects have mostly been studied in experimental settings, studies concerning the effects of time-of-day and day-of-week are generally performed in real-life settings, investigating self-reports. A study by Linder and colleagues [22], for example, showed that clinicians prescribed unnecessary antibiotics more often in the afternoon than at the beginning of the day. Similar effects have been found over the different days of the week: employees feel more energized after the weekend, resulting in better reported performance at the beginning than at the end of the week [23]. Although these effects work on different time-scales, they all result in changes in (self-reported) performance levels. Moreover, continuously performing a task, and engaging in work for multiple hours or days, requires rest to restore performance to its former level [24]. More specifically, time-on-task effects can be reversed by taking a short (coffee) break [25], time-of-day effects can be reversed by a night's rest [26], and day-of-week effects can be reversed by a weekend break [23]. Although there is substantial evidence that time-on-task, time-of-day, and day-of-week separately influence performance, it is not yet known how these factors interact and subsequently influence behavior dynamics.

Previous experimental studies on mental fatigue mainly focused on the effects of time-on-task on behavioral performance, investigating isolated effects of prolonged task performance on specific cognitive processes (e.g., error processing [2]). In addition, studies in real-life settings focused on specific professions, especially those involving shift-work [27, 28], where the manifestation of fatigue was expected to be potent or even dangerous, given its relationship with serious accidents [29, 30]. Little research, however, has been done on the manifestation of mental fatigue during regular 9-to-5 jobs.

Present study

In order to gain more insight into the manner in which behavioral dynamics in the workplace are influenced by time-on-task, time-of-day, and day-of-week, we first focused on validating a potentially useful method to study mental fatigue on the work floor without interfering with regular working activities. To this end, markers in typewriting that were found to be sensitive to mental fatigue in a lab setting (i.e., interkey interval and backspace use) were recorded for six consecutive weeks during regular office work [18]. Second, we investigated the influence of mental fatigue on these markers at different time-scales (i.e., time-on-task, time-of-day, and day-of-week).

In line with findings in an experimental setting [18], we hypothesized that there would be a main effect of time-on-task on both the interkey interval and the percentage of backspaces, with both measures expected to increase with time-on-task. Second, we hypothesized that the magnitude of the effect of time-on-task on these performance measures would depend on time-of-day; that is, we expected a larger increase in both the interkey interval and the percentage of backspaces with time-on-task in the afternoon than in the morning (interaction). Lastly, we hypothesized that typewriting patterns would change over the course of the week. We expected these changes to manifest in two ways: first, that employees would become slower and less accurate over the week (main effect), and second, that the decline of performance (interkey interval and backspace use) with time-on-task would become larger over the week (interaction).

Materials and methods

Participants

Forty-five office workers gave their written informed consent to participate in a study that was approved by the Ethics Committee of the Faculty of Economics and Business in Groningen. This research complied with the tenets of the Declaration of Helsinki. Participants were employees of the Faculty of Economics and Business of the University of Groningen and were recruited via the health and safety coordinator of the faculty. They were included if they worked at least 0.8 full-time equivalent (32 h a week) and typewriting activities were part of their work. Only datasets containing more than 30 subsets of more than 45 min of continuous typing were included, in order to perform reliable statistical analyses. From now on we will refer to these subsets as tasks. As a result, data of 23 employees were excluded from the analysis, leaving data of 22 employees (12 females, M = 48.1 years, SD = 13.4). Function profiles varied across the included participants (i.e., scientific staff, support staff). Participants who were excluded from the analyses performed working activities during the measurement period that did not include the required amount of typing (e.g., teaching and collecting research data), which was specifically the case for Ph.D. students and (postdoctoral) researchers. In addition, a number of participants worked on multiple workstations during the 6 weeks of data collection, which was reflected in a limited amount of typewriting data recorded at the workstation on which the recording software was installed. Data of these participants were excluded from the analyses as well.
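The inclusion criterion above (more than 30 blocks of more than 45 min of continuous typing) amounts to a simple filter over per-participant task durations. The sketch below is our own illustration; the function name and data format are hypothetical, not part of the study's actual pipeline.

```python
def eligible(task_durations_min, min_tasks=30, min_duration_min=45):
    """Return True if a participant has more than `min_tasks` continuous-typing
    blocks, each longer than `min_duration_min` minutes (hypothetical helper)."""
    long_tasks = [d for d in task_durations_min if d > min_duration_min]
    return len(long_tasks) > min_tasks
```

Applying such a filter to the 45 recruited participants would reproduce the reported split into 22 included and 23 excluded datasets, provided the per-participant task lists match the reported typing activity.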

Apparatus and materials

The experiment was conducted in the natural working environment of the participants at the Faculty of Economics and Business of the University of Groningen. The experimental setup consisted of an office chair behind an adjustable desk, a Windows computer with a QWERTY keyboard, and screen support. The working environment was adjusted according to the occupational health and safety guidelines of the faculty. The percentage of backspaces and the interkey interval were acquired using keylogging software (aXtion).

Typing performance

Previous research by de Jong et al. [18] found backspace use and the interkey interval to be susceptible to the effects of mental fatigue in a controlled lab setting. In order to monitor these typing indices, keylogging software installed on the workstations registered a timestamp at the start of each keystroke. To safeguard the confidentiality of the typed text during the study, only the backspace key was given a unique marker. Each minute, the average interkey interval (the time between two subsequent keystrokes) and the percentage of backspaces over the preceding 15 min were calculated and registered for offline analysis. If the time between two subsequent keystrokes was longer than 5 s, the interval was not included in the average. A series of average values was included in subsequent analyses if more than 45 successive averages were recorded. In the present study, continuous typewriting was defined as typewriting during a block of at least 45 minutes.
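The per-minute computation described above can be sketched as follows. This is an illustrative Python reconstruction under our own assumptions about the event format; it is not the aXtion software's API, and the function name is ours.

```python
def typing_metrics(events, t_end, window_s=900.0, max_iki_s=5.0):
    """Mean interkey interval (ms) and percentage of backspace keystrokes
    over the 15-min window preceding t_end, as described in the text.

    events: list of (timestamp_in_seconds, is_backspace) tuples, sorted by time.
    Interkey intervals longer than max_iki_s (5 s) are excluded from the average.
    """
    # keep only keystrokes inside the sliding window
    window = [(t, bs) for t, bs in events if t_end - window_s <= t <= t_end]
    if not window:
        return None, None
    times = [t for t, _ in window]
    # intervals between successive keystrokes, dropping pauses longer than 5 s
    ikis = [b - a for a, b in zip(times, times[1:]) if b - a <= max_iki_s]
    mean_iki_ms = 1000.0 * sum(ikis) / len(ikis) if ikis else None
    pct_backspace = 100.0 * sum(1 for _, bs in window if bs) / len(window)
    return mean_iki_ms, pct_backspace
```

Re-running the function with `t_end` advanced by 60 s at a time reproduces the once-per-minute sliding-window scheme used in the study.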

Procedure

Typing performance was monitored for 6 weeks in the natural working environment of the participants. Data collection of the first cohort started on the first Monday of May; the second cohort started on the first Monday of November. A week before the start of the monitoring period, the keylogging software was installed on the computers of the participants and the office environment was confirmed to comply with, or adjusted according to, the occupational health and safety guidelines of the faculty. During this week, participants also filled out a questionnaire with demographic and work-related questions (S1 Appendix). Each Monday, starting in the second week of the experiment, participants filled out a questionnaire with general questions about how they experienced the preceding week (S2 Appendix). Each working day, participants received real-time feedback on their performance via text messages on their mobile phones, and an overview was provided via email at the end of the day.

Statistical analysis

Statistical analyses were conducted in R version 3.4.4 [31, 32]. For statistical significance testing, we used a mixed-modelling approach using the lme4 package, version 1.1–21 [33]. The lmerTest package, version 3.0–1, was used to obtain statistical significance by approximating the degrees of freedom using the Satterthwaite approximation [34]. The data provided to the models included the interkey interval and the percentage of backspaces. The models contained a varying intercept per participant. In addition, a varying slope for time-on-task and time-of-day by participant was added to the model if the fit of the model improved, as indicated by the Akaike Information Criterion (AIC) [35]. The models used to statistically test the effects of time-on-task (120 min of continuous typewriting), time-of-day (morning and afternoon), and day-of-week (Monday, Tuesday, Wednesday, Thursday, and Friday) on the dependent typewriting variables (i.e., interkey interval and percentage of backspaces) are listed in Table 1.
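The analysis itself was done in R with lme4/lmerTest. As an illustration of the AIC-based decision to keep or drop a model term, the sketch below compares two simple Gaussian least-squares fits by AIC (lower is better: AIC trades goodness of fit against the number of parameters). This is a deliberately simplified stand-in for the mixed-model comparison; all names are ours.

```python
import math

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, residuals)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    return a, b, resid

def aic(residuals, k):
    """Gaussian AIC up to an additive constant: 2k + n*ln(RSS/n),
    where k counts the estimated parameters."""
    n = len(residuals)
    rss = sum(r * r for r in residuals)
    return 2 * k + n * math.log(rss / n)
```

The richer model (here, the line; in the study, a model with an extra random slope) is kept only if its AIC is lower than that of the simpler one.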

Table 1. Models used to statistically test the effects of time-on-task, time-of-day, and day-of-week on the interkey interval and the percentage of backspace keystrokes.

Dependent variables: Interkey interval_n and Backspace use_n

Equation: β0,j + β1·time-on-task_n + β2·time-of-day_n + β3·day-of-week_n + β5·time-on-task_n × time-of-day_n + β6·time-on-task_n × day-of-week_n + β7·time-of-day_n × day-of-week_n + β8·time-on-task_n × time-of-day_n × day-of-week_n + ε_n

n reflects a time-block (minute) and j reflects a participant. β0 reflects the intercept of the model, β1–8 reflect the regression coefficients, and ε reflects the error term. The notation for these models allows a varying intercept per participant (as indicated by j).

Post-hoc tests were performed to assess the main and interaction effects, with error rates adjusted according to Bonferroni. First, to estimate the difference in the effect of time-on-task between the morning and the afternoon, polynomial contrasts were compared using pairwise comparisons. Second, pairwise comparisons were conducted to compare the interkey interval and backspace use on the different days of the week. Finally, polynomial contrasts were used to estimate the linear and quadratic trends with time-on-task in the morning and the afternoon, and over the different days of the week. Statistical tests were considered significant at p < .05.
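Two ingredients of these post-hoc tests can be sketched minimally: Bonferroni adjustment of p-values, and orthogonal polynomial (linear and quadratic) contrasts over equally spaced level means. The helpers below are our own illustration, using the standard three-level contrast weights, not the study's R code.

```python
def bonferroni(p_values):
    """Bonferroni-adjusted p-values: multiply each p by the number of
    comparisons, capping at 1."""
    m = len(p_values)
    return [min(p * m, 1.0) for p in p_values]

def poly_contrasts(means):
    """Linear and quadratic contrast estimates for three equally spaced
    level means (e.g., early/middle/late time-on-task bins)."""
    linear = [-1, 0, 1]      # standard orthogonal weights, 3 levels
    quadratic = [1, -2, 1]
    lin = sum(w * m for w, m in zip(linear, means))
    quad = sum(w * m for w, m in zip(quadratic, means))
    return lin, quad
```

A purely linear trend yields a nonzero linear contrast and a zero quadratic contrast; curvature shows up in the quadratic term, mirroring the linear and quadratic z-values reported below.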

Speed and accuracy

In order to investigate the relationship between typing speed (interkey interval) and typing accuracy (backspace use) during continuous typewriting, we calculated, for each participant, the regression coefficients describing the effect of time-on-task on the dependent variables in the morning and in the afternoon. For these personalized regression coefficients, we calculated Pearson's correlations to identify whether changes in typing speed and accuracy were related.
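The procedure above can be sketched as: fit a time-on-task slope per participant for each measure, then correlate the two sets of slopes. The following is a pure-Python illustration under our own assumptions; the helper names are ours, not the study's code.

```python
def slope(x, y):
    """Least-squares slope of y on x (the per-participant time-on-task
    regression coefficient)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

# One slope per participant for each measure, then e.g.:
# r = pearson(iki_slopes, backspace_slopes)
```

A positive correlation between the two slope sets, as reported for the afternoon, means participants who slowed down more also corrected more, i.e., not a speed-accuracy trade-off.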

Results

In order to systematically discuss the results, we first report the effects of time-on-task on the interkey interval, reflecting typing speed, and on backspace use, reflecting accuracy. Thereafter, we go into the effects of time-of-day and the interaction of time-of-day with time-on-task on the same measures. Lastly, we report how these typewriting patterns change over the different days of the week. The models that were used to statistically test the effects of prolonged task performance on the different time-scales (i.e., time-on-task, time-of-day, and day-of-week) can be found in Table 1. An overview of the main and interaction effects is provided in Tables 2 and 3, respectively.

Table 2. The main and interaction effects of time-on-task, time-of-day, and day-of-week on the interkey interval.

Main and interaction effects | F-value | Dfs | p-value
Time-on-task² | 17.75 | 2, 94707 | < .001
Time-of-day | 1.07 | 1, 94719 | .302
Day-of-week | 74.15 | 4, 94707 | < .001
Time-on-task² × time-of-day | 6.90 | 2, 94703 | .001
Time-on-task² × day-of-week | 47.57 | 8, 94701 | < .001
Time-of-day × day-of-week | 54.69 | 4, 94705 | < .001
Time-on-task² × time-of-day × day-of-week | 72.08 | 8, 94701 | < .001

Table 3. The main and interaction effects of time-on-task, time-of-day, and day-of-week on the percentage of backspaces, described by F-tests.

Main and interaction effects | F-value | Dfs | p-value
Time-on-task² | 284.28 | 2, 91637 | < .001
Time-of-day | 36.55 | 1, 91656 | < .001
Day-of-week | 53.37 | 4, 91637 | < .001
Time-on-task² × time-of-day | 12.93 | 2, 91634 | < .001
Time-on-task² × day-of-week | 21.85 | 8, 91633 | < .001
Time-of-day × day-of-week | 36.93 | 4, 91635 | < .001
Time-on-task² × time-of-day × day-of-week | 9.54 | 8, 91633 | < .001

Time-on-task

The results showed that both the interkey interval (F(2, 94707) = 17.75, p < .001) and the percentage of backspace keystrokes (F(2, 91637) = 284.28, p < .001) changed with prolonged task performance (i.e., within subsets of > 45 minutes of continuous typewriting). That is, in general, we observed an increase in both the interkey interval and the percentage of backspaces, reflecting a decrease in typing speed and a decline in typing accuracy with time-on-task. However, as expected, these effects were modulated by time-of-day and day-of-week. These modulations are discussed below.

Time-of-day

Although the mean interkey interval did not differ between the morning and the afternoon (main effect time-of-day: F(1, 94719) = 1.07, n.s.), the effect of time-on-task on the interkey interval was modulated by time-of-day (interaction effect time-on-task × time-of-day: F(2, 94703) = 6.90, p = .001; afternoon minus morning, linear: z = 3.29, p = .006; afternoon minus morning, quadratic: z = -3.51, p = .002). That is, post-hoc tests revealed that the interkey interval remained stable during continuous typewriting in the morning, but increased by 11.6 ms over two hours of continuous task performance in the afternoon (see Table 4 and Fig 1A).

Table 4. The effect of time-on-task on the interkey interval and backspace use in the morning and the afternoon.

Dependent variable | Time-of-day | Polynomial | z-value | Mchange (SE)
Interkey interval (ms) | Morning | Linear | -1.92 | 2.94 (1.53)
Interkey interval (ms) | Morning | Quadratic | 1.73 |
Interkey interval (ms) | Afternoon | Linear | 5.42*** | 11.51 (2.12)
Interkey interval (ms) | Afternoon | Quadratic | -3.07* |
Backspace use (% of backspace keystrokes) | Morning | Linear | 19.74*** | 1.57 (0.08)
Backspace use (% of backspace keystrokes) | Morning | Quadratic | -11.11*** |
Backspace use (% of backspace keystrokes) | Afternoon | Linear | 14.99*** | 1.63 (0.11)
Backspace use (% of backspace keystrokes) | Afternoon | Quadratic | -9.84*** |

Mchange reflects the average change in the dependent variable from the 1st to the 120th minute of continuous typewriting. Bolded values are significant. * p < .05; ** p < .01; *** p < .001.

Fig 1. The effect of time-on-task on typewriting changes from the morning to the afternoon.


Time-blocks are calculated based on the preceding 15 min, see method section. (A) the interaction between time-on-task and time-of-day on the average interkey interval. (B) The interaction between time-on-task and time-of-day on backspace use. The confidence intervals reflect the standard errors of the mean.

Backspace use increased from the morning to the afternoon (main effect time-of-day: F(1, 91656) = 36.55, p < .001). Additionally, the effect of time-on-task on backspace use differed between the morning and the afternoon (interaction effect time-on-task × time-of-day: F(2, 93467) = 9.54, p < .001). That is, although the percentage of backspace keystrokes increased by ~1.6% over two hours of prolonged task performance, both in the morning and in the afternoon (afternoon minus morning, linear: z = -0.46, p = 1.0), the increase followed a more quadratic function in the afternoon than in the morning (afternoon minus morning, quadratic: z = -3.95, p < .001; see Table 4 and Fig 1B).

Day-of-week

In addition to the effects of time-on-task and time-of-day, we also looked into changes in typewriting patterns over the workweek. First, we hypothesized that typing performance would decline over the workweek, reflected in an increase in the interkey interval and the percentage of backspaces. Contrary to our expectations, we observed an increase in the interkey interval from 295 ms on Monday to 301 ms on both Tuesday (Mon–Tue: z = -3.93, p < .001) and Wednesday (Mon–Wed: z = -3.99, p < .001), after which the interkey interval decreased to 296 ms on Thursday and 291 ms on Friday (z_linear = -6.09, p < .001), the day on which employees' typewriting was fastest (see Table 5 and Fig 2A).

Table 5. Average typing performance on the different days of the week, reflected by the interkey interval (ms) and backspace use (% of backspace keystrokes).

Day-of-week | Mean interkey interval in ms (SE) | Mean percentage of backspace keystrokes (SE)
Monday | 295 (9.62) | 8.62 (0.67)
Tuesday | 301 (9.60) | 8.40 (0.66)
Wednesday | 301 (9.60) | 7.82 (0.66)
Thursday | 296 (9.63) | 7.83 (0.66)
Friday | 291 (9.65) | 8.00 (0.67)

Fig 2. The course of typewriting performance over the different days of the week.


(A) The effect of day-of-week on the interkey interval. (B) The effect of day-of-week on the percentage of backspace keystrokes. The confidence intervals reflect the standard errors of the mean.

Backspace use was highest on Monday, when 8.6% of keystrokes were backspace keystrokes, followed by Tuesday with 8.4%, compared with the other days of the week (Mon–mean(Wed, Thu, Fri): z = 11.39, p < .001; Tue–mean(Wed, Thu, Fri): z = 8.91, p < .001; see Table 5 and Fig 2B). Backspace use did not significantly differ between Wednesday, Thursday, and Friday (Wed–Thu: z = -0.11, p = 1.0; Wed–Fri: z = 2.11, p = .351; Thu–Fri: z = 1.88, p = .604).

Additionally, we observed that the effect of time-on-task on typing speed differed over the days of the week (see Table 2 for an overview of the main and interaction effects). That is, on Monday afternoon (z = -7.10, p < .001) and Friday morning (z = -8.33, p < .001), the interkey interval decreased with time-on-task. On the other days of the week, the interkey interval either remained stable or increased, the latter reflecting a decrease in typing speed with time-on-task.

We also observed changes in the effect of prolonged task performance on backspace use over the working week (see Table 3 for an overview of the main and interaction effects). The percentage of backspace keystrokes increased with time-on-task in the morning and in the afternoon on all days, with the exception of Friday afternoon, when no change in backspace use was observed (z = 0.82, n.s.).

Speed and accuracy

In order to investigate the relationship between typing speed (interkey interval) and typing accuracy (backspace use) in the morning and the afternoon, we calculated the effect of time-on-task for each part of the day. The correlations between these coefficients revealed that the relation between speed and accuracy differed between the morning and the afternoon. In the morning, we observed no correlation between the effect of time-on-task on typing speed and the effect of time-on-task on the percentage of backspace keystrokes (r = -0.04, n.s.). In the afternoon, however, there was a positive relationship between the increase in the interkey interval and the increase in the percentage of backspaces. More specifically, participants who showed a larger increase in the interkey interval with time-on-task also showed a larger increase in the percentage of backspace keystrokes with time-on-task (r = 0.622, p < .001), indicating that changes in typewriting with time-on-task do not reflect changes in the speed-accuracy trade-off.

Discussion

In the current study we evaluated novel, non-invasive measures based on typewriting to continuously monitor behavior in a working environment. Our aims were, first, to provide a proof of principle of this method, and, second, to study how typewriting dynamics during regular office work change over different time-scales (i.e., time-on-task, time-of-day, day-of-week). Based on earlier findings observed in a controlled environment, we focused on the interkey interval and backspace use as indices of behavior. To investigate these aims, the typewriting markers were recorded for six consecutive weeks during regular office work performed in a university environment. We confirmed that typewriting behavior contains sensitive markers that reflect changes in behavior over time. In addition to general changes in speed and accuracy with time-on-task, we found that the effects of time-on-task, as indexed by our typewriting measures, changed throughout the day. More specifically, the effect of time-on-task on typing speed (i.e., interkey interval) was more pronounced in the afternoon than in the morning. Moreover, on average, office workers used the backspace key more often in the afternoon than in the morning, although the effect of time-on-task on backspace use, reflecting task accuracy, was smaller in the afternoon. Finally, an analysis of day-of-week effects provided no evidence for a general decline in performance over the week.

With regard to our first aim, as hypothesized, the length of the interkey interval and the percentage of backspace keystrokes both increased with time-on-task, replicating previous work suggesting that changes in markers derived from typewriting are sensitive to mental fatigue elicited during continuous task performance [18, 36]. Previously, this type of research was mainly conducted in simulated office environments [37], or focused on self-reported behavior of employees [38], using measures that either interrupted regular office work or relied on subjective measures influenced by the observer's personal judgment [39]. Our findings show that our measures based on typing behavior have practical potential to objectively monitor performance efficiency without disturbing regular work-related activities.

A pattern of results similar to that observed during simulated office work [18] emerged in the present, real-life environment. Moreover, the changes in typewriting performance with time-on-task were even more pronounced in the present study. Relevant in this perspective is that in the present study, compared with a relatively controlled experimental environment, many uncontrollable factors may have influenced performance efficiency due to the dynamic nature of the actual office environment [40, 41]. On the one hand, factors such as interruptions related to the presence of others [42] and uncontrollable requests for action from electronic devices (e.g., online activity, telephone calls) might increase task demands [43], which could in turn increase the effects of mental fatigue on performance efficiency. On the other hand, work motivation [1] and enhanced autonomy with regard to setting one's own schedule and planning work breaks if needed [37] might reduce experienced task demands and related levels of mental fatigue during regular office work compared with the lab setting. Interestingly, despite these noisy conditions, we observed significant changes in the typewriting indices during prolonged task performance. To summarize, with regard to our first aim, we provided a proof of principle of the sensitivity of these measures, confirming that typewriting markers are susceptible to changes in behavior related to the effects of mental fatigue, not only in a controlled experimental setting, but also in an uncontrolled office environment.

Under real-life conditions, the factors that influence our behavior vary from day to day and even from hour to hour, and therefore substantial variability in performance and in the effects of mental fatigue might be expected over time. Our second aim was to investigate how typewriting dynamics during regular office work changed over different time-scales. In general, performance efficiency during real-life activities, such as typewriting, depends on two dimensions: speed and accuracy. In the present study, the interkey interval served as an indicator of typing speed and the percentage of backspace keystrokes was used as an indicator of typing accuracy [18]. Using the backspace key is an indirect measure of typing accuracy, given that it is used to correct mistakes in typewriting. Therefore, while interpreting the results, it is important to keep in mind that an increase in backspace use could originate from different types of behavior. That is, in our study, participants could have corrected more (in)correctly typed letters and/or detected their errors later, which, as a result, required more consecutive backspaces to correct one incorrect keystroke.

The results of the present study showed that, in the morning, typing speed remained relatively stable over time. Simultaneously, typing accuracy declined, which was revealed by an increase in backspace use. In the afternoon, we observed a decline in both dimensions of typing performance. More specifically, typing speed decreased over time, reflected by an increase in the interkey interval, and additionally the quality of typing was reduced, indicated by the increase in the percentage of backspace keystrokes with prolonged task performance. This pattern shows similarities with previous research investigating the effects of mental fatigue on task control in a lab setting [44]. Lorist and colleagues showed that if participants were instructed to perform fast, accuracy steadily declined from the start of the experiment, while participants kept responding at a stable speed. After a while, participants seemed to adjust their strategy. That is, over time, participants performed at a slower pace as well, which was observed in an increase in RTs.

In daily life, people adaptively invoke qualitatively different performance strategies. A strategy that prioritizes speed generally results in a larger number of errors, while a strategy that prioritizes accuracy results in slower performance [45]. People tend to moderate this speed-accuracy trade-off based on external conditions and the time available to complete their work. The results suggest that, in the present study, office workers first tried to maintain typing speed, which resulted in an increased number of typing errors they had to correct. In the afternoon, however, they seemed to readjust their strategy, resulting in a decline in both dimensions of performance. In comparison to the study of Lorist and colleagues [44], who measured prolonged performance during a 2-h session, the pattern we found stretched out over the day, indicating that mental fatigue might build up during a working day. These findings imply that, although office workers are entitled to breaks during a working day, the scheduled breaks might not have been sufficient to fully recover from the demands they encountered during the day [46].

Previous research on the effects of mental fatigue on typewriting strategies showed that people tend to make more typing errors during prolonged task performance [18]. When given the option to correct their mistakes, however, they at least partly do so. Several studies observed a similar increase in correction behavior during prolonged task performance, but people are not able to compensate for the total increase in mistakes, even when these mistakes are in plain sight, that is, clearly visible on the screen in front of them [1, 2]. In order to safeguard the privacy of the employees, however, we did not monitor the identity of any keys apart from the backspace key. For this reason, we were not able to identify and analyze the mistakes that were not corrected during typewriting.

Error corrections indirectly reflect accuracy on a given task, and using error corrections during typewriting as a measure of mental fatigue might therefore provide information on the effects of mental fatigue on the underlying cognitive processes. Monitoring performance requires higher-order mental functions, which are prone to the effects of mental fatigue. Experimental studies showed that erroneous responses are usually followed by a specific brain activation pattern, the error-related negativity (ERN) [47, 48], and result in decreased response speed on the next trial (i.e., post-error slowing) [49]. Lorist and colleagues [1] investigated these behavioral and brain activity patterns during prolonged performance of an Eriksen flanker task. They found that performance monitoring declined over time, reflected in a significant decrease of brain activity related to error processing (the ERN), accompanied by a decrease in post-error slowing. This concurrent decline in the ability to monitor behavior and adapt performance might have resulted in later detection of errors, and therefore in an increase in error corrections in the present study, as was also shown in the controlled lab study of de Jong and colleagues [18].

The present study focused on the dynamics of typewriting during prolonged task performance. Previous research has repeatedly shown that, in addition to a tonic decline in speed and accuracy over time, fatigued participants also experience short-term lapses in performance during which they are unable to process any information [50, 51]. These phasic lapses, so-called mental blocks, are characterized by extremely long reaction times during experimental tasks and can be detected by studying the distribution of reaction times [25]. In the present study, we removed such lapses of attention by excluding interkey intervals longer than 5 seconds. For future research, however, it would be interesting to investigate whether the effects of prolonged task performance on the length or number of short-term lapses during typewriting follow a pattern similar to the tonic effects reported here.
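Such a lapse-oriented analysis can be sketched very simply: instead of discarding interkey intervals above the cutoff, one would count and time them. The helper below (`lapse_stats` is hypothetical, not part of the study's analysis pipeline) illustrates the idea for a list of interkey intervals in seconds.

```python
def lapse_stats(ikis, threshold=5.0):
    """Separate 'normal' interkey intervals from candidate lapses.

    In the present study, intervals above `threshold` seconds were excluded
    as pauses or lapses of attention; a future analysis could instead treat
    their number and duration as outcome measures in their own right.
    """
    lapses = [iki for iki in ikis if iki > threshold]
    return {
        "n_lapses": len(lapses),
        "total_lapse_time": sum(lapses),
        "longest_lapse": max(lapses) if lapses else 0.0,
    }
```

Tracking how `n_lapses` or `longest_lapse` changes with time-on-task would then probe the phasic, rather than tonic, component of mental fatigue.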

Besides the effects of mental fatigue on typewriting dynamics during the day, the present study also provides direct insight into typewriting dynamics during a working week. First, we found no evidence for a general decline in typewriting performance with day-of-week: backspace use remained stable on Wednesday, Thursday, and Friday, while the interkey interval decreased from Wednesday to Friday, reflecting an increase in typing speed. Second, the effects of prolonged task performance on typing speed and accuracy followed a similar pattern over the different days of the week, suggesting that mental fatigue elicited on the previous day, as reflected in the effects of time-on-task and time-of-day, did not influence the course of performance during prolonged task performance on the next day. These results suggest that mental fatigue does not accumulate across the days of the week. Although previous literature does not paint a consistent picture, Persson and colleagues [52] reported that the alertness of construction workers did not decline during a working week. This pattern was shown in construction workers with a regular working schedule (7–15 h), but also in workers with an extended schedule (six days in a row, one day off, five days in a row, nine days off) who stayed at accommodations at the construction site during the working week.

Our findings provide further evidence for office workers’ ability to recover from work-related demands during the week. Nonetheless, typewriting dynamics were subject to daily variations. On Mondays and Fridays, office workers adopted a typewriting strategy that maximized typing speed, while on the other days of the week they either adopted a strategy that maximized accuracy (Wednesdays) or performed both fast and accurately compared to the other days of the week (Thursdays). Although some studies showed that employees recover from regular work-related demands after engaging in pleasurable activities during the weekend [23, 53], other studies revealed that Mondays serve as a transition day from pleasurable activities to the structured, demanding work week, which is reflected in a more negative mood [54], increased stress levels [55], and a decreased ability to recover from work-related demands on Mondays [56]. These factors might also have contributed to the behavioral patterns observed in the present study. Similarly, it could be argued that Fridays serve as a transition day from the work week to the weekend; in contrast to Mondays, however, Fridays have previously been associated with improved mood compared to the rest of the week [57].

This study has implications for real-life working environments, given that a large part of the working population regularly performs computer work; in the Netherlands, for example, 40% of employees perform computer work more than 6 h every day [58]. There are several ways in which monitoring typewriting could support employees during their work. First, personalized real-time feedback based on changes in typing behavior could help users detect when lapses in performance occur and a short break might be beneficial. However, real-time feedback might be biased by dynamics in typewriting performance that are unrelated to lapses in performance; a characteristic of the working environment is its large variability in working conditions, due to, among other factors, changes in work-related tasks, noise in the working environment, and changes in a person’s general state. Our method also allows monitoring performance over a longer period of time, enabling the detection of regularities in working activities. Related to this, a second possibility is to provide feedback at the individual level to help employees realize a work-break schedule that better matches their individual state and specific work-related demands. By comparing behavioral dynamics over several weeks, typing behavior could help decide when, during the workday or workweek, an employee should work on tasks that require high accuracy and when it is better to work on less demanding tasks. A third option is to use changes in typing behavior to evaluate interventions in the working environment; for instance, it might provide relevant information on performance efficiency when evaluating the effectiveness of a 6-hour workday instead of the regular 8-hour workday. Researchers have previously used questionnaires to evaluate this specific intervention; measuring performance directly, and importantly, doing so without interrupting regular activities, could characterize its effects on performance and productivity more objectively.

Conclusions

The typing indices used to describe behavioral dynamics reflect subtle changes in both speed and accuracy during regular office work, not only during the day but also over the week. These findings might be relevant when scheduling different tasks over the day, and could also inform decisions about the number of hours that employees can or should work during a day.

Supporting information

S1 Appendix

(DOCX)

S2 Appendix

(DOCX)

Acknowledgments

The authors would like to thank Aafke Wiekens for recruiting the participants.

Data Availability

Data cannot be shared publicly because participants did not give their consent for sharing the data with other parties than the researchers. Data are available from the Behavioral and Social Sciences Institutional Data Access for researchers who meet the criteria for access to confidential data (research-data-bss@rug.nl).

Funding Statement

This study, part of SPRINT@Work, is part-financed by the European Regional Development Fund, the province and municipality of Groningen, and the province of Drenthe (Grant No: T-3036, 2013). MJ, AB and ML were financed by this funding. The funders did not play a role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. There was no additional external funding received for this study.

References

  • 1.Boksem MAS, Meijman TF, Lorist MM. Mental fatigue, motivation and action monitoring. Biol Psychol. 2006;72(2):123–32. 10.1016/j.biopsycho.2005.08.007 [DOI] [PubMed] [Google Scholar]
  • 2.Lorist MM, Boksem MAS, Ridderinkhof KR. Impaired cognitive control and reduced cingulate activity during mental fatigue. Brain Res Cogn Brain Res. 2005;24(2):199–205. 10.1016/j.cogbrainres.2005.01.018 [DOI] [PubMed] [Google Scholar]
  • 3.McCormick F, Kadzielski J, Landrigan CP, Evans B, Herndon JH, Rubash HE. Surgeon Fatigue. Arch Surg. 2012;147(5):430–5. 10.1001/archsurg.2012.84 [DOI] [PubMed] [Google Scholar]
  • 4.Ricci JA, Chee E, Lorandeau AL, Berger J. Fatigue in the U.S. workforce: Prevalence and implications for lost productive work time. J Occup Environ Med. 2007;49(1):1–10. 10.1097/01.jom.0000249782.60321.2a [DOI] [PubMed] [Google Scholar]
  • 5.Sluiter JK, de Croon EM, Meijman TF, Frings-Dresen MHW. Need for recovery from work related fatigue and its role in the development and prediction of subjective health complaints. Occup Environ Med. 2003;60(suppl 1):i62–i70. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Sparks K, Cooper C, Fried Y, Shirom A. The effects of hours of work on health: A meta-analytic review. J Occup Organ Psychol. 1997;70(1991):391–408. [Google Scholar]
  • 7.Gander P, Purnell H, Garden A, Woodward A. Work patterns and fatigue-related risk among junior doctors. Occup Environ Med. 2007;64(11):733–8. 10.1136/oem.2006.030916 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Bartley S, Chute E. Fatigue and Impairment in Man. Q Rev Biol. 1949. March 1;24(1):68. [Google Scholar]
  • 9.Hockey GR. The psychology of fatigue: Work, effort and control. Cambridge: Cambridge University Press; 2013. [Google Scholar]
  • 10.Muscio B. Feeling-Tone in Industry. Br J Psychol. 1921;12:150–62. [Google Scholar]
  • 11.Thorndike E. Mental fatigue. I. Psychol Rev. 1900;7(6):547–79. [Google Scholar]
  • 12.Boksem M, Tops M. Mental fatigue: Costs and benefits. Brain Res Rev. 2008;59:125–39. 10.1016/j.brainresrev.2008.07.001 [DOI] [PubMed] [Google Scholar]
  • 13.Hockey GRJ. A motivational control theory of cognitive fatigue. In: Ackerman PL, editor. Cognitive fatigue: Multidisciplinary perspectives on current research and future applications [Internet]. Washington, DC, US: American Psychological Association; 2011. p. 167–87. Available from: http://content.apa.org/books/12343-008 [Google Scholar]
  • 14.Boksem MAS, Meijman TF, Lorist MM. Effects of mental fatigue on attention: an ERP study. Brain Res Cogn Brain Res. 2005;25(1):107–16. 10.1016/j.cogbrainres.2005.04.011 [DOI] [PubMed] [Google Scholar]
  • 15.Lorist MM, Snel J, Kok A. Influence of caffeine on information processing stages in well rested and fatigued subjects. Psychopharmacology (Berl). 1994;113(3):411–21. [DOI] [PubMed] [Google Scholar]
  • 16.Zhang Y, Gong J, Miao D, Zhu X, Yang Y, Military B, et al. Subjective evaluation of mental fatigue and characteristics of attention during a driving simulation. Soc Behav Personal an Int J. 2011;39(1):15–20. [Google Scholar]
  • 17.Pimenta A, Carneiro D, Novais P, Neves J. Analysis of Human Performance as a Measure of Mental Fatigue. Hybrid Artif Intell Syst. 2014;389–401. [Google Scholar]
  • 18.de Jong M, Jolij J, Pimenta A, Lorist MM. Age Modulates the Effects of Mental Fatigue on Typewriting. Front Psychol. 2018;9(July):1–15. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Hopstaken JF, van der Linden D, Bakker AB, Kompier MAJ. A multifaceted investigation of the link between mental fatigue and task disengagement. Psychophysiology. 2015;52(3):305–15. 10.1111/psyp.12339 [DOI] [PubMed] [Google Scholar]
  • 20.Lorist MM, Jolij J. Trial history effects in Stroop task performance are independent of top-down control. PLoS One. 2012;7(6):1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Wascher E, Getzmann S. Rapid Mental Fatigue Amplifies Age-Related Attentional Deficits. 2014;28(3):215–24. [Google Scholar]
  • 22.Linder JA, Doctor JN, Friedberg MW, Reyes Nieva H, Birks C, Meeker D, et al. Time of Day and the Decision to Prescribe Antibiotics. JAMA Intern Med. 2014. December 1;174(12):2029–31. 10.1001/jamainternmed.2014.5225 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Binnewies C, Sonnentag S, Mojza EJ. Recovery during the weekend and fluctuations in weekly job performance: A week-level study examining intra-individual relationships. J Occup Organ Psychol. 2010;83(2):419–41. [Google Scholar]
  • 24.Kühnel J, Zacher H, de Bloom J, Bledow R. Take a break! Benefits of sleep and short breaks for daily work engagement. Eur J Work Organ Psychol. 2017;26(4):481–91. [Google Scholar]
  • 25.Steinborn MB, Huestegge L. A Walk Down the Lane Gives Wings to Your Brain. Restorative Benefits of Rest Breaks on Cognition and Self-Control. Appl Cogn Psychol. 2016;30(5):795–805. [Google Scholar]
  • 26.Hooff MLM, Geurts SAE, Kompier MAJ, Taris TW. Workdays, in-between workdays and the weekend: A diary study on effort and recovery. Int Arch Occup Environ Health. 2007;80(7):599–613. 10.1007/s00420-007-0172-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Brown JP, Martin D, Nagaria Z, Verceles AC, Jobe SL, Wickwire EM. Mental Health Consequences of Shift Work: An Updated Review. Curr Psychiatry Rep. 2020;22(2):1–7. [DOI] [PubMed] [Google Scholar]
  • 28.Kecklund G, Axelsson J. Health consequences of shift work and insufficient sleep. BMJ. 2016;355:i5210 10.1136/bmj.i5210 [DOI] [PubMed] [Google Scholar]
  • 29.Swaen GMH, Van Amelsvoort LGPM, Bültmann U, Kant IJ. Fatigue as a risk factor for being injured in an occupational accident: Results from the Maastricht Cohort Study. Occup Environ Med. 2003;60(SUPPL. 1):88–92. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Chan M. Fatigue: The most critical accident risk in oil and gas construction. Constr Manag Econ. 2011;29(4):341–53. [Google Scholar]
  • 31.RStudio Team. RStudio: Integrated Development for R. 2016. [Google Scholar]
  • 32.R Development Core Team. R: A language and environment for statistical computing. Vienna, Austria. 2017.
  • 33.Bates D, Mächler M, Bolker BM, Walker SC. Fitting linear mixed-effects models using lme4. J Stat Softw. 2015;67(1):1–48. [Google Scholar]
  • 34.Kuznetsova A, Brockhoff PB, Christensen RHB. lmerTest Package: Tests in Linear Mixed Effects Models. J Stat Softw. 2017;82(2):1–26. [Google Scholar]
  • 35.Bozdogan H. Model selection and Akaike’s Information Criterion (AIC): The general theory and its analytical extensions. Psychometrika. 1987;52(3):345–70. [Google Scholar]
  • 36.Pimenta A, Carneiro D, Novais P, Neves J. Monitoring mental fatigue through the analysis of keyboard and mouse interaction patterns. Lect Notes Comput Sci (including Subser Lect Notes Artif Intell Lect Notes Bioinformatics). 2013;8073 LNAI:222–31. [Google Scholar]
  • 37.Hockey GRJ, Earle F. Control over the scheduling of simulated office work reduces the impact of workload on mental fatigue and task performance. J Exp Psychol Appl. 2006;12(1):50–65. 10.1037/1076-898X.12.1.50 [DOI] [PubMed] [Google Scholar]
  • 38.Smolders KCHJ, de Kort YAW, Tenner AD, Kaiser FG. Need for recovery in offices: Behavior-based assessment. J Environ Psychol. 2012;32(2):126–34. [Google Scholar]
  • 39.Schwarz N. Self-Reports: How the Questions Shape the Answers. Am Psychol. 1999;54(2):93–105. [Google Scholar]
  • 40.Davis DR. The disorganization of behaviour in fatigue. J Neurol Neurosurg Psychiatry. 1946;9(1):23–9. 10.1136/jnnp.9.1.23 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Smith-Jackson TL, Klein KW. Open-plan offices: Task performance and mental workload. J Environ Psychol. 2009;29(2):279–89. [Google Scholar]
  • 42.Galván VV, Vessal RS, Golley MT. The Effects of Cell Phone Conversations on the Attention and Memory of Bystanders. PLoS One. 2013;8(3):1–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43.Steege LM, Drake DA, Olivas M, Mazza G. Evaluation of physically and mentally fatiguing tasks and sources of fatigue as reported by registered nurses. J Nurs Manag. 2015;23(2):179–89. 10.1111/jonm.12112 [DOI] [PubMed] [Google Scholar]
  • 44.Lorist MM, Klein M, Nieuwenhuis S, De Jong R, Mulder G, Meijman TF. Mental fatigue and task control: planning and preparation. Psychophysiology. 2000;37:614–25. [PubMed] [Google Scholar]
  • 45.Wickens CD, Hollands JG. Engineering Psychology and Human Performance. In: SA Journal of Industrial Psychology. 2000. [Google Scholar]
  • 46.Jansen NWH, Kant I, Van Amelsvoort LGPM, Nijhuis FJN, Van Den Brandt PA. Need for recovery from work: Evaluating short-term effects of working hours, patterns and schedules. Ergonomics. 2003;46(7):664–80. 10.1080/0014013031000085662 [DOI] [PubMed] [Google Scholar]
  • 47.Gehring WJ, Goss B, Coles MGH, Meyer DE, Donchin E. A Neural System for Error Detection and Compensation. Psychol Sci. 1993;4(6):385–90. [Google Scholar]
  • 48.Falkenstein M, Hohnsbein J, Hoormann J, Blanke L. Effects of crossmodal divided attention on late ERP components. II. Error processing in choice reaction tasks. Electroencephalogr Clin Neurophysiol. 1991;78(6):447–55. 10.1016/0013-4694(91)90062-9 [DOI] [PubMed] [Google Scholar]
  • 49.Rabbitt PM. How old and young subjects monitor and control responses for accuracy and speed. Br J Psychol. 1979;305–11. [Google Scholar]
  • 50.Bills AG. Blocking: a new principle of mental fatigue. Am J Psychol. 1931;43:230–45. [Google Scholar]
  • 51.Bertelson P, Joffe R. Blockings in prolonged serial responding. Ergonomics. 1963. April 1;6(2):109–16. [Google Scholar]
  • 52.Persson R, Helene Garde A, Schibye B, Ørbæk P. Building-site camps and extended work hours: A two-week monitoring of self-reported physical exertion, fatigue, and daytime sleepiness. Chronobiol Int. 2006;23(6):1329–45. 10.1080/07420520601058021 [DOI] [PubMed] [Google Scholar]
  • 53.Fritz C, Sonnentag S. Recovery, Health, and Job Performance: Effects of Weekend Experiences. 2005;10(3):187–99. 10.1037/1076-8998.10.3.187 [DOI] [PubMed] [Google Scholar]
  • 54.Croft GP, Walker AE. Are the Monday blues all in the mind? The role of expectancy in the subjective experience of mood. J Appl Soc Psychol. 2001;31(6):1133–45. [Google Scholar]
  • 55.Devereux J, Rydstedt LW, Cropley M. An exploratory study to assess the impact of work demands and the anticipation of work on awakening saliva cortisol. Psychol Rep. 2011;108(1):274–80. 10.2466/09.14.17.PR0.108.1.274-280 [DOI] [PubMed] [Google Scholar]
  • 56.Rook J, Zijlstra F. The contribution of various types of activities to recovery. Eur J Work Organ Psychol. 2006;15:218–40. [Google Scholar]
  • 57.Reis HT, Sheldon KM, Gable SL, Roscoe J, Ryan RM. Daily well-being: The role of autonomy, competence, and relatedness. Personal Soc Psychol Bull. 2000;26(4):419–35. [Google Scholar]
  • 58.Hooftman W., Mars GM., Knops JCM, Janssen BMJ, Pleijers AJSF. Nationale Enquête Arbeidsomstandigheden 2018: Methodologie en globale resultaten. 2019. [Google Scholar]

Decision Letter 0

Michael B Steinborn

11 May 2020

PONE-D-20-10944

Dynamics in typewriting performance reflect mental fatigue during real-life office work

PLOS ONE

Dear Dr. Lorist,

Thank you for submitting your manuscript to PLOS ONE. Two experts commented on your manuscript. As you can see from the reviews, both referees found the general topic addressed in your manuscript interesting and they have a number of nice things to say about the study. At the same time, they have some remarkably constructive and excellently detailed suggestions how to further improve the paper. The comments speak for themselves, but it is obvious that one reoccurring theme is the need for more specificity regarding the hypotheses and a more systematic presentation of the statistics/results. While this will require some extra efforts, I consider it worthwhile. Hence, we invite you to submit a revision of the manuscript that addresses the remaining points together with a cover letter that contains point-by-point replies. Some additional editorial comments are added below. 

Best regards,

Michael B. Steinborn, PhD

Academic Editor

We would appreciate receiving your revised manuscript by Jun 25 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Michael B. Steinborn, PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include additional information regarding the survey or questionnaire used in the study and ensure that you have provided sufficient details that others could replicate the analyses. For instance, if you developed a questionnaire as part of this study and it is not under a copyright more restrictive than CC-BY, please include a copy, in both the original language and English, as Supporting Information.

3. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

Additional Editor Comments:

Line 111-220, methods/statistics: statistical terms (M, SD, F, p, etc.) should be written in italics.

Line 251-262: a distinctive feature is that the typing task requires self-paced responding (using the inter-key interval to index reaction time). Under these conditions, individuals are highly vulnerable to occasional lapses (or blocks) of performance, and this tendency might also increase with time on task. This has important theoretical consequences, as the time-on-task effect would then not be due to a tonic slowdown of performance speed but would originate from an increase in the number of these short-term depletions (or lapses) in performance (which add to the mean performance). Individuals during a mental blockade might be entirely unable to process any information until the blockade dissipates, which can only be demonstrated with advanced performance measurement methods using distributional analysis. My own work is relevant in this regard (Steinborn & Huestegge, 2016), as we have set a methodological benchmark for how to measure these effects accurately in performance settings [Steinborn, M. B., & Huestegge, L. (2016). A walk down the lane gives wings to your brain: Restorative benefits of rest breaks on cognition and self-control. Applied Cognitive Psychology, 30(5), 795-805. doi:10.1002/acp.3255]. While I would not demand further analysis of typical speed variability (I would welcome it, however), I would appreciate if the authors could address this point in the revised version of the manuscript.

Line 273-285: backspace key to index erroneous behavior. In my opinion, this is a highly interesting point that should be given somewhat more weight in the discussion. As most research uses reaction-time-based paradigms in which it is not possible to correct an error, the present study uses a natural setting where an error is completely relevant to the task and where correction is not only enabled but naturally invited by the nature of the task. I suggest elaborating this point a bit further, if possible, as it would increase the impact of the present study.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper examines the effect of time-of-week, time-of-day, and time-on-task on typewriting performance in a natural setting, using a sample of about 50 participants. Typewriting performance was measured using typing speed (interkey interval) and error correction (backspace) as dependent measures. It was hypothesised that the development of task-related mental fatigue (time on task) is modulated by time of week (early weekdays better than late weekdays) and time of day (afternoon times better than morning times). It seems that the results support the hypothesis, although some aspects of the results are not completely clear to me at the moment. My evaluation is positive, and I have a few comments that might be considered in the revision.

#1 Theory

It is clear to me that the authors are "top experts" in this field of research on mental fatigue. I am therefore very much inclined to believe most of what is stated in the introduction. Apart from that, there are some points that could be explained in more detail or with more precision. For example, the underlying processes that are assumed to produce the performance variations at different time scales, over the week or with time on task, could be specified with a bit more precision. In other words, the rationale for expecting interactions between these variables on the relevant performance measures could be explained further. On the other hand, the manuscript is relatively concise at present and there might be little room to include much more detail, so I only ask (as an interested reader) whether the authors could elaborate a bit more on the underlying cognitive mechanisms of mental fatigue in the introduction and also in the discussion.

#2 Hypotheses / Statistics

The hypotheses are so far well formulated; however, they could be more systematically presented with respect to all relevant main effects and interactive effects on performance. Regarding the statistics, I suggest including a table that contains all main and interaction effects on performance. At the moment, I did not fully understand all aspects of the results, and I would therefore appreciate it if this information were more systematically presented in the revised manuscript.

#3 Results

Although the authors are interested in some more specific effects, I would appreciate if they could provide a full model of all main and interaction effects on both performance measures (typing speed, and error correction). For example, there are expected main effects of the factor "time of week", of the factor time of day, and also on "time on task", then three two-way interactions and one three-way interaction, resulting in seven relevant statistical effects. A more integrative presentation of these results would certainly improve the current manuscript.

#4 self-report state

Performance is evidently influenced by momentary states within the individual. Given the evidence of relationships between subjective engagement to a task and objective performance, I wonder whether self-report measures are available for the present research, and if so, whether they could deliver additional information. For example, the Dundee Stress State Questionnaire (DSSQ, Matthews et al., 2002; Langner et al., 2010) seems to be the proper instrument to assess these aspects, but other instruments might also do well in this regard. I would appreciate it if the authors could give a short opinion or outlook on the possibilities of assessing engagement and elaborate somewhat more deeply on potential limitations in this regard in the present study [top references: Langner, R. et al. (2010). Mental fatigue and temporal preparation in simple reaction-time performance. Acta Psychologica, 133(1), 64-72. doi:10.1016/j.actpsy.2009.10.001; Matthews, G. et al. (2002). Fundamental dimensions of subjective state in performance settings: Task engagement, distress, and worry. Emotion, 2(4), 315-340. doi:10.1037//1528-3542.2.4.315].

#5 minor points and typos

line 165-168, table 1

table according to APA rules, provide sufficient information as notes

line 170-end of results

F values should be rounded up (2,97924 to 3.0)

line 185-190, figure 1

if possible, the results should be presented not separately but in one figure so that the reader can evaluate the main and interactive effects of the results simultaneously. At present, the results are distributed across separated figures which makes it difficult to understand the whole picture

line 349, references

check typos

Reviewer #2: Background:

The study examines how mental fatigue, measured by markers of typewriting performance for speed and accuracy, changes over different time scales during regular office work.

To this end, the authors aimed to:

1. provide a proof-of-principle regarding two formerly identified markers of typewriting performance

2. investigate changes in typewriting performance over different time scales

As a result, differences in typewriting performance were found for the different time scales under investigation. These differences resemble those found in an earlier study, where a direct link between typewriting performance and mental fatigue was established.

Evaluation:

Overall, my evaluation is positive, especially if a few points are further clarified. Apart from a few minor exceptions, the manuscript is a pleasurable read. The document is formally well written, mostly to the point, and adequate in length. A few less strong points concern the elaboration of the effective mechanism under investigation and the statistical reporting. I believe there is potential for improvement in this regard. My detailed comments are outlined below. Please note that my comments are aimed at further improving the manuscript and are not meant to criticize the authors' work.

1. Theory

The predictions would benefit from further elaboration. While it is understood that the proof-of-principle character of the study somewhat mitigates this point, it is so far not clear what exactly is predicted for each of the different time-scales. Thus, while the reported results for performance at either time-scale might point to an effect of mental fatigue, they might also be somewhat unrelated to mental fatigue and caused by additional factors. I would therefore expect the authors to provide further clarity regarding their expectations for each outcome measure and time-scale.

2. Statistical reporting

The statistical reporting part lacks a systematic and comprehensive overview of the reported outcome measures. Given this lack, it is thus far not possible to readily understand either the main effects or the interactions found for typewriting performance. Statistics should thus be systematically described in tables, reporting all main and interaction effects. A detailed table of complete results as well as some sort of reporting of the compiled data (means, standard deviations, ... for all days and time-scales separately as well as combined) would also greatly improve the readers' ability to comprehend the reported results.

3. Methods

- potential trade-off between speed and accuracy

As it stands, typewriting markers for speed and accuracy are not measured independently of each other and might thus be confounded. A faster typing speed might result in more backspaces needed to correct a mistake. Similarly, more mistakes and thus more (potentially rather fast) backspaces might lead to an apparently enhanced typing speed. Given the design of the study, both these effects would not be correctly reflected by the current measures. While I understand that an independent measurement might be difficult to achieve, I suggest the authors provide some further analysis of how this problem might affect the found results.

- sample bias

The comparatively large drop-out ratio appears problematic with regard to a potential sample bias, especially given the rather strict criterion of more than 45 minutes of uninterrupted work at least 30 times a week. Several potential solutions come to mind and are highly recommended to provide further clarity regarding this point:

- reconsider the strict criterion or give a more comprehensive explanation for its choice

- discuss potential effects from excluding a large portion of the sample, especially regarding the possibility that mental fatigue might be more pronounced in the dropped-out participants (thus leading to a smaller amount of uninterrupted work)

Line comments:

line 107 (theory): "... expected that typing performance would ..." This hypothesis would especially benefit from further clarification, particularly since no overall measure of performance is given and there is therefore no way to judge whether the hypothesis will be confirmed or rejected if speed and accuracy fail to change in the expected direction

line 112-120 (methods - participants): please provide a more detailed description of the work the participants carry out and on what exactly is typed by them (predominantly email, research articles, ...)

line 118 (methods - participants): please clarify what is meant by continuous typing

line 134-136 (methods - typing performance): it is not easily understood how the series of average values is generated; please provide some more details on this

line 146-147 (methods - procedure): please explain the nature of the feedback in more detail. It might also be beneficial to discuss the implications of the given feedback on performance and the given results, if any effect is expected

line 205 (results - day-of-week): given that most people work fewer hours on Fridays, performance on Friday afternoons might lack data points. I suggest providing more comprehensive and complete results to clarify how potential effects like these were treated.

line 311 (discussion): Given that mental fatigue was measured only on weekdays, but might also occur on Saturdays and Sundays, such a general conclusion might be exaggerated to a certain degree. Thus, the "baseline" measurement on Mondays might not reflect a true baseline for mental fatigue.

line 329 (discussion): typo "lead", should read "led"

line 333 (discussion): typo "subtitle", should read "subtle"

line 336-337 (discussion): please explain in greater detail how the presented results point to information about the amount of hours employees can or should work

Figure 2: please note what is indicated by confidence intervals

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Oct 6;15(10):e0239984. doi: 10.1371/journal.pone.0239984.r002

Author response to Decision Letter 0


24 Aug 2020

Response to reviewer and editor comments

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming.

The style requirements were checked and applied.

2. Please include additional information regarding the survey or questionnaire used in the study and ensure that you have provided sufficient details that others could replicate the analyses. For instance, if you developed a questionnaire as part of this study and it is not under a copyright more restrictive than CC-BY, please include a copy, in both the original language and English, as Supporting Information.

We included two questionnaires in S1 Appendix and S2 Appendix. The questionnaire in S1 Appendix includes demographic and work-related questions. The weekly questionnaire, which is included in S2 Appendix, focused on how participants experienced the preceding week.

3. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

There are legal restrictions on sharing the data:

“The data cannot be publicly shared because the data is considered to be personal data under the GDPR, and cannot be further anonymized. The research participants have not given their consent for sharing data beyond the original research purpose, which does include data access for independent verification of the results, but prohibits public sharing. These limitations are imposed by European law and verified by the data office of the faculty of Behavioral and Social Sciences.

Data access can be requested via the data office of the faculty of Behavioral and Social Sciences of the University of Groningen: research-data-bss@rug.nl. Please quote study code: #2017-03_02 ECFEB in the request.”

Additional Editor Comments:

line 111-220

methods/statistics: statistical terms (M, SD, F, p, etc.) should be written in italics

We would like to thank the editor for pointing this out. We changed this throughout the manuscript.

line 251-262

a distinctive feature is that the typing task requires self-paced responding (using the inter-key interval to index reaction time). Under these conditions, individuals are highly vulnerable to occasional lapses (or blocks) of performance, and this tendency might also increase with time on task. This has important theoretical consequences, as the time on task effect is not due to a tonic slowdown of performance speed but originates from an increase in the number of these short-term depletions (or lapses) in performance (which add to the mean performance). Individuals during a mental blockade might be entirely unable to process any information until the mental blockade dissipates, which can only be demonstrated with advanced performance measurement methods, using distributional analysis. My own work is relevant in this regard (Steinborn & Huestegge, 2016), as we have set a methodological benchmark of how to measure these effects accurately in performance settings [Steinborn, M. B., & Huestegge, L. (2016). A walk down the lane gives wings to your brain: Restorative benefits of rest breaks on cognition and self-control. Applied Cognitive Psychology, 30(5), 795-805. doi:10.1002/acp.3255]. While I would not demand further analysis of typing speed variability (I would welcome it, however), I would appreciate it if the authors could address this point in the revised version of the manuscript.

We enjoyed reading the article and agree that the manuscript would benefit from discussing these findings. In the present manuscript, we analyzed average interkey intervals and percentages of backspaces; it was therefore not possible to compare the distribution of the individual responses, as was described in the study of Steinborn and Huestegge (2016). However, we looked into the data of our previous study (de Jong et al., 2018), in which we investigated the effects of prolonged task performance on typewriting in an experimental setting. There, we observed neither a change in the distribution of the interkey intervals over time nor an increase in extremely large interkey intervals. Still, we agree that it would be interesting to highlight this part of the literature and encourage further research into lapses of attention in the working environment. Therefore, we added a paragraph in the discussion on page 23, lines 407-420:

“The present study focused on the dynamics in typewriting during prolonged task performance. Previous research repeatedly showed that, in addition to a tonic decline in speed and accuracy over time, fatigued participants also experience short-term lapses in performance during which they are unable to process any information (Bills, 1931; Bertelson & Joffe, 1963). These phasic lapses in performance, so-called mental blocks, are characterized by extremely long reaction times during experimental tasks and can be detected by studying the distribution of reaction times (Steinborn and Huestegge, 2016). In the present study, we excluded lapses of attention by excluding interkey intervals that were longer than 5 seconds. However, for future research it would be interesting to investigate whether the effects of prolonged task performance on length or number of short-term lapses in performance during prolonged typewriting follow a similar pattern as the tonic effects of prolonged task performance on typewriting.”
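For illustration, the two typewriting markers and the lapse-exclusion rule described above can be sketched as follows. This is an editorial sketch, not the authors' code: the event format (timestamp, key name) and the function name `typing_markers` are assumptions, and only the 5-second exclusion threshold and the two markers (mean interkey interval, percentage of backspaces) come from the manuscript.

```python
# Illustrative sketch (not the authors' code) of the two typewriting markers:
# mean interkey interval (with intervals > 5 s excluded as lapses) and the
# percentage of keystrokes that are backspaces.

def typing_markers(events, max_interval=5.0):
    """events: ordered list of (timestamp_in_seconds, key_name) tuples.
    Returns (mean interkey interval in seconds, percentage of backspaces)."""
    intervals = []
    for (t_prev, _), (t_cur, _) in zip(events, events[1:]):
        gap = t_cur - t_prev
        if gap <= max_interval:  # drop lapses longer than 5 s, per the paper
            intervals.append(gap)
    backspaces = sum(1 for _, key in events if key == "Backspace")
    mean_iki = sum(intervals) / len(intervals) if intervals else float("nan")
    pct_backspace = 100.0 * backspaces / len(events)
    return mean_iki, pct_backspace

# Example: five keystrokes with one correction; the 6.4 s pause is excluded,
# giving a mean interval of about 0.2 s and 20% backspaces.
events = [(0.0, "h"), (0.2, "e"), (0.4, "x"), (0.6, "Backspace"), (7.0, "l")]
mean_iki, pct = typing_markers(events)
```

A real keylogger would of course also need to handle session boundaries and non-character keys; the sketch only shows how the exclusion rule interacts with the two averaged markers.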

line 273-285

backspace key to index erroneous behavior. In my opinion, this is a highly interesting point that should be given somewhat more weight in the discussion. As most of the research is using reaction-time based paradigms where it is not possible to correct an error, the present study uses a natural setting where an error is completely relevant to the task and where correction is not only enabled but naturally invited by the nature of the task. I suggest elaborating this point a bit further, if possible, as it would increase the impact of the present study.

We agree with the editor that using correction behavior is an interesting index of erroneous behavior, especially in mental fatigue research. After re-reading the manuscript, we also agree that this measure deserves a more elaborate discussion. In the discussion of the manuscript we elaborated this point somewhat further in the discussion on page 23, lines 421-431:

“Error-corrections directly reflect accuracy on a given task, and using error-corrections during typewriting as a measure of mental fatigue might therefore provide information on the effects of mental fatigue on underlying cognitive processes. Monitoring performance requires higher-order mental functions, which are prone to the effects of mental fatigue. Experimental studies showed that erroneous responses are usually followed by specific brain activation patterns, called the error-related negativity (ERN; Falkenstein et al., 1991; Gehring et al., 1993), and result in decreased response speed on the next trial (i.e. post-error slowing; e.g., Rabbitt, 1966). Lorist and colleagues (2005) investigated these behavioral and brain activity patterns during prolonged performance on an Eriksen flanker task. They found that performance monitoring declined over time, which was reflected in a significant decrease of brain activity patterns related to error processing (ERN), and was accompanied by a decrease of post-error slowing. This decreased ability to monitor behavior and adapt performance concurrently might have resulted in a later detection of errors and therefore in an increase in error-corrections during the present study, as was also shown in the controlled lab study of de Jong and colleagues (2018).”

In addition, we added a sentence to the abstract (page 2, lines 29-30) to make the relevance of backspace use as an indicator of erroneous behavior in a natural environment clearer to the reader.

“allowing not only to examine performance speed, but also providing a natural setting to study error correction.”

Comments to the Author

Reviewer #1:

This paper examines the effect of time-of-week, time-of-day, and time-on-task on typewriting performance in a natural setting, using a sample of about 50 participants. Typewriting performance was measured using typing speed (inter-key interval) and error correction (backspace) as dependent measures. It was hypothesised that the development of task-related mental fatigue (time on task) is modulated by time of week (early weekdays better than late weekdays) and time of day (afternoon times better than morning times). It seems that the results support the hypothesis, although some aspects of the results are not completely clear to me at the moment. My evaluation is positive, and I have a few comments that might be considered in the revision.

We would like to thank the reviewer for their thoughtful comments. As a result, we were able to greatly improve the manuscript.

#1 Theory

It is clear to me that the authors are the "top experts" in that field of research on mental fatigue. I am therefore very much inclined to believe most of what is stated in the introduction. Apart from that, there are some points that could be explained in more detail or with more precision. For example, the underlying processes that are assumed to produce the performance variations at different time scales, over the week or with time on task, could be specified with a bit more precision. In other words, the rationale for expecting interactions between these variables on the relevant performance measures could be explained further. On the other hand, the manuscript is relatively concise at present and there might be little room to include too much detail, so I only ask (as an interested reader) whether the authors could elaborate a bit more on the underlying cognitive mechanisms of mental fatigue in the introduction and also in the discussion.

Based on these suggestions, we changed the manuscript in three ways, which are discussed below:

#1 We agree with the reviewer that it would benefit the manuscript if we elaborate a bit more on the underlying cognitive mechanisms of mental fatigue. We added a couple of sentences to the introduction on page 3, lines 58-64:

“Several theories regarding the cognitive mechanisms behind its manifestation have been proposed, eventually leading to the widely accepted theory suggesting that mental fatigue develops as a result of a cost-benefit evaluation of effort (Boksem and Tops, 2008; Hockey, 2011). According to this theory, if the costs of performing a task exceed the benefits of finishing the task, people will come to experience subjective feelings of mental fatigue (e.g., aversion against task performance, low vigilance) and performance deteriorates (i.e., people become slower and less accurate).”

#2 In line with the comments of reviewer 2, we also more explicitly discussed the previous research on the effects of prolonged task performance on different time-scales on page 5, lines 107-118.

“Although there is substantial evidence that prolonged task performance, manipulated by time-on-task, time-of-day, and day-of-week, separately influence performance, interestingly, it is not yet known how these factors interact and subsequently influence behavior dynamics.

Previous experimental studies on mental fatigue mainly focused on the effects of time-on-task on behavioral performance, investigating isolated effects of prolonged task performance on specific cognitive processes (e.g., error processing; Lorist et al., 2005). In addition, studies in real-life settings focused on specific professions, especially those involving shift-work (Brown et al., 2020; Kecklund & Axelsson, 2016), where the manifestation of fatigue was expected to be potent or even dangerous, given its relationship with serious accidents (Akerstedt & Haraldsson, 2001; Swaen et al., 2002; Chan, 2011). Little research seems to have been done on the manifestation of mental fatigue during regular 9-to-5 jobs.”

#3 The editor made a comment about more explicitly discussing the use of correction behavior as an index of erroneous behavior, especially in mental fatigue research. We took this opportunity to more elaborately discuss the cognitive processes underlying erroneous behavior, which have been found to be affected by mental fatigue. See our reaction to this comment of the editor.

#2 Hypotheses / Statistics

The hypotheses are so far well formulated; however, they could be more systematically presented with respect to all relevant main effects and interactive effects on performance. Regarding the statistics, I suggest including a table that contains all main and interaction effects on performance. At the moment, I did not fully understand all aspects of the results, and I would therefore appreciate it if this information were more systematically presented in the revised manuscript.

We would like to thank the reviewer for pointing this out. Both reviewer 1 and 2 asked for clarification of the hypotheses and the statistics. Based on these comments, we more systematically presented the hypotheses in the introduction on page 6, lines 128-138:

“In line with findings in an experimental setting (de Jong et al., 2018), we hypothesized that there would be a main effect of time-on-task on both the interkey interval and the percentage of backspaces, where we expected that both measures would increase with time-on-task. Secondly, we hypothesized that the magnitude of the effect of time-on-task on these performance measures would depend on time-of-day (main effect). That is, we expected a larger increase in both the interkey interval and the percentage of backspaces with time-on-task in the afternoon (interaction) than in the morning. Lastly, we hypothesized that typewriting patterns would change over the course of the week. We expected these changes to manifest in two ways. First, we hypothesized that employees would become slower and less accurate over the week (main effect), and second, we expected a larger decline of performance (interkey interval and backspace use) with time-on-task over the week (interaction).”

Additionally, we re-analyzed the data and rewrote the results section, including several tables that present the main and interaction effects. These were included to address the results more systematically. We also changed the design by including the interactions of time-on-task and time-of-day with day-of-week. Although this affected some of the results, these revisions did not change our main conclusions.

#3 Results

Although the authors are interested in some more specific effects, I would appreciate if they could provide a full model of all main and interaction effects on both performance measures (typing speed, and error correction). For example, there are expected main effects of the factor "time of week", of the factor time of day, and also on "time on task", then three two-way interactions and one three-way interaction, resulting in seven relevant statistical effects. A more integrative presentation of these results would certainly improve the current manuscript.

As we already mentioned in our response to the previous comment, we completely rewrote the results section, including multiple tables, to address the results more systematically. We hope our results are now clearer.
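The seven effects the reviewer asks for follow directly from the full factorial structure of the three time-scale factors. A generic sketch (not the authors' analysis code; the factor names are taken from the manuscript, the enumeration is illustrative):

```python
from itertools import combinations

# Enumerate every effect a full model of the three time-scale factors
# should report: each non-empty subset of factors is one effect.
factors = ["time_on_task", "time_of_day", "day_of_week"]

effects = [combo
           for k in range(1, len(factors) + 1)
           for combo in combinations(factors, k)]

# 3 main effects + 3 two-way interactions + 1 three-way interaction = 7
```

For each of the two outcome measures (interkey interval, percentage of backspaces), the requested tables would thus contain these seven rows.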

#4 self-report state

Performance is evidently influenced by momentary states within the individual. Given the evidence of relationships between subjective engagement to a task and objective performance, I wonder whether self-report measures are available for the present research, and if so, whether they could deliver additional information. For example, the Dundee Stress State Questionnaire (DSSQ, Matthews et al., 2002; Langner et al., 2010) seems to be the proper instrument to assess these aspects, but other instruments might also do well in this regard. I would appreciate it if the authors could give a short opinion or outlook on the possibilities of assessing engagement and elaborate somewhat more deeply on potential limitations in this regard in the present study [top references: Langner, R. et al. (2010). Mental fatigue and temporal preparation in simple reaction-time performance. Acta Psychologica, 133(1), 64-72. doi:10.1016/j.actpsy.2009.10.001; Matthews, G. et al. (2002). Fundamental dimensions of subjective state in performance settings: Task engagement, distress, and worry. Emotion, 2(4), 315-340. doi:10.1037//1528-3542.2.4.315].

We appreciate that the reviewer would like to know our opinion or outlook on the possibility of assessing subjective engagement. It would definitely have been interesting to measure and compare the participants’ subjective state with the continuous performance measures. In the present study, however, we chose not to measure the participants’ subjective state. The main reason not to include self-report measures is that this study was specifically conducted to provide insight into the effects of prolonged task performance during regular working activities without interfering with these activities. Asking participants to fill out questionnaires would have influenced the flow of work. Moreover, since we did not know the exact schedule of the participants beforehand, we were also unable to schedule subjective state measures on a micro level (time-on-task) without interrupting ongoing activities.

Additionally, our previous research on mental fatigue, including our findings in an experimental setting examining typewriting performance (de Jong et al., 2018), showed that subjective fatigue and performance do not necessarily change simultaneously.

Although we did not incorporate subjective engagement questions for any of the time-scales, we agree that adding these questionnaires (e.g., at the end of the different days of the week to prevent interruptions during a working day) might have provided additional insights into the effects we found on Mondays and Fridays, which slightly deviated from the current literature. However, this still would not have enabled us to examine the relationship between the subjective scores and the effects of time-on-task and time-of-day.

#5 minor points and typos

line 165-168, table 1

table according to APA rules, provide sufficient information as notes

We changed the table according to APA rules, and provided additional information in the notes below the table.

line 170-end of results

F values should be rounded up (2, 97924 to 3.0)

F-values need to be reported with their degrees of freedom. We noticed that we did not put a space after the comma separating the degrees of freedom from the F-value, which may have made them appear as one number. We changed this throughout the manuscript.

line 185-190, figure 1

if possible, the results should be presented not separately but in one figure so that the reader can evaluate the main and interactive effects of the results simultaneously. At present, the results are distributed across separated figures which makes it difficult to understand the whole picture

We would like to thank the reviewer for this suggestion. We would argue, however, that including all the effects in one figure would not benefit a systematic discussion of our results. We had specific hypotheses that correspond to the included figures. We hope that the more systematic discussion of these hypotheses and the rewritten results section helps the reader better understand our hypotheses and our findings regarding them.

line 349, references

check typos

We have now checked the reference list for typos more thoroughly.

Reviewer #2:

Background:

The study examines how mental fatigue, measured by markers of typewriting performance for speed and accuracy, changes over different time scales during regular office work.

To this end, the authors aimed to:

1. provide a proof-of-principle regarding two formerly identified markers of typewriting performance

2. investigate changes in typewriting performance over different time scales

As a result, differences in typewriting performance were found for the different time scales under investigation. These differences resemble those found in an earlier study, where a direct link between typewriting performance and mental fatigue was established.

Evaluation:

Overall, my evaluation is positive, especially if a few points are further clarified. Apart from a few minor exceptions, the manuscript is a pleasurable read. The document is formally well written, mostly to the point, and adequate in length. A few less strong points concern the elaboration of the effective mechanism under investigation and the statistical reporting. I believe there is potential for improvement in this regard. My detailed comments are outlined below. Please note that my comments are aimed at further improving the manuscript and are not meant to criticize the authors' work.

We would like to thank the reviewer for the positive and constructive feedback. Based on the reviewer’s comments we were able to clarify and better present our findings.

1. Theory

The predictions would benefit from further elaboration. While it is understood that the proof-of-principle character of the study somewhat mitigates this point, it is so far not clear what exactly is predicted for each of the different time-scales. Thus, while the reported results for each time-scale might point to an effect of mental fatigue, they might also be somewhat unrelated to mental fatigue and caused by additional factors. I would therefore expect the authors to provide further clarity regarding their expectations for each outcome measure and time-scale.

We would like to thank the reviewer for pointing this out. We agree with the reviewer that the expected changes in performance for the interkey interval and for backspace use were not clearly described. Therefore, we rewrote the hypotheses paragraph on page 6, lines 128-138 (see our response to reviewer 1).

2. Statistical reporting

The statistical reporting part lacks a systematic and comprehensive overview of the reported outcome measures. Given this lack, it is thus far not possible to readily understand either the main effects or the interactions found for typewriting performance. Statistics should thus be systematically described in tables, reporting all main and interaction effects. A detailed table of complete results, as well as some sort of reporting of the compiled data (means, standard deviations, ... for all days and time-scales separately as well as combined), would also greatly improve the readers' ability to comprehend the reported results.

We agree with the reviewers and the editor that the statistics lack a systematic and comprehensive overview. Therefore, we rewrote the result section, where we now included multiple tables thereby providing an overview of the interaction and main effects.

3. Methods

- potential trade-off between speed and accuracy

As it stands, typewriting markers for speed and accuracy are not measured independently of each other and might thus be confounded. A faster typing speed might result in more necessary backstrokes to correct a mistake. Similarly, more mistakes and thus more (potentially rather fast) backstrokes might lead to an enhanced typing speed. Given the design of the study, both these effects would not be correctly reflected by the current measures. While I understand that an independent measurement might be difficult to achieve, I suggest the authors should provide some further analysis of how this problem might affect the found results.

We would like to thank the reviewer for this interesting comment. We compared the effect of time-on-task on the interkey interval and the effect of time-on-task on the percentage of backspace keystrokes. In the morning, we did not find a correlation between these measures, suggesting that a larger interkey interval with time-on-task was not accompanied by a decrease in backspace keystrokes. In the afternoon, we observed a positive correlation between the effects of time-on-task on both performance measures. That is, participants who showed a larger interkey interval with time-on-task also used the backspace key more frequently, which means that a larger decline in speed is accompanied by a larger decline in accuracy.

These results contradict the scenario that is described by the reviewer, and suggest that changes in typewriting with time-on-task do not reflect a change in speed-accuracy trade-off. We rewrote the result section, and reworked the paragraph regarding the relationship between speed and accuracy on page 18 (lines 307-318) in order to make this more clear to the reader.

- sample bias

The comparatively large drop-out ratio appears problematic in regards to a potential sample bias, especially given the rather strict criteria of more than 45 minutes of uninterrupted work at least 30 times a week. Several potential solutions come to mind and are highly recommended to provide further clarity regarding this point:

- reconsider the strict criterion or give a more comprehensive explanation for its choice

- discuss potential effects from excluding a large portion of the sample, especially regarding the possibility that mental fatigue might be more pronounced in the dropped-out participants (thus leading to a smaller amount of uninterrupted work)

We agree with the reviewer that there is a large drop-out ratio, and that this needs further clarification in our revised manuscript. The participants were employees of the University of Groningen, and there was large variation across function profiles (see Table 1) and tasks (e.g., writing research articles, teaching). During this study, several participants were performing teaching activities or were collecting research data, which resulted in limited typewriting, at least in their normal office environment. In addition, some participants used multiple workstations (e.g., office computer and laptop), but did not install the keylogging tool on all of them. Due to these factors, we excluded a large portion of our participants. We clarified this in the method section on page 7, lines 128-138:

“There was variation in function profile across participants that were included in the study (i.e., scientific staff, support staff). Participants that were excluded from the analyses performed working activities during the measurement period that did not include the required amount of typing activities (e.g., teaching and collecting research data), which was specifically the case for Ph.D. students and (Postdoctoral) researchers. In addition, a number of participants worked on multiple workstations during the 6 weeks of data collection, which was reflected in a limited amount of typewriting data that was recorded from these participants at the workstation on which the recording software was installed. Data of these participants were excluded from the analyses, as well.”

Line comments:

line 107 (theory): "... expected that typing performance would ..." This hypothesis would especially benefit from further clarification, particularly since no overall measure of performance is given and there is therefore no way to judge whether the hypothesis should be accepted or rejected if speed and accuracy fail to change in the expected direction

In line with the comment of reviewer 1 and 2 regarding clarification of our predictions, we clarified, and more systematically presented our hypotheses on page 6, lines 128-138.

line 112-120 (methods - participants): please provide a more detailed description of the work the participants carry out and on what exactly is typed by them (predominantly email, research articles, ...)

We would like to thank the reviewer for pointing this out. Due to privacy concerns, we did not ask the participants about their work activities.

line 118 (methods - participants): please clarify what is meant by continuous typing

We clarified what was meant by continuous typing in the method section on page 9, lines 176-178:

“In the present study, continuous typewriting was defined as typewriting during a block of at least 45 minutes.”

line 134-136 (methods - typing performance): it is not easily understood how the series of average values are generated, please provide some more details on this.

We rewrote the paragraph about typing performance in the method section in order to improve the reader's understanding of how the series of average values is generated (page 8, lines 168-178).

line 146-147 (methods - procedure): please explain the nature of the feedback in more detail. It might also be beneficial to discuss the implications of the given feedback on performance and the given results, if any effect is expected

We included more information on the nature of the feedback on page 9, lines 185-190:

“During this week, participants also filled out a questionnaire with demographic and work-related questions (S1 Appendix). Each Monday, starting in the second week of the experiment, participants filled out a questionnaire with general questions about how they experienced the week before (S2 Appendix). Each working day, participants received real-time feedback on their performance provided via text messages on their mobile phones and via email. An overview was provided via email at the end of the day.“

Additionally, we analyzed the questionnaire data (5-point Likert scale, where 1 reflects completely agree and 5 reflects completely disagree) in order to investigate to what extent participants were aware of the feedback and whether they used the feedback to adapt their behavior. Although participants read the feedback that was provided to them (M = 1.54, SD = 0.99), they did not use the feedback to adapt their behavior (M = 3.56, SD = 1.43). This might be explained by the fact that they did not find the feedback reliable (M = 3.20, SD = 1.33).

line 205 (results - day-of-week): given that most people work fewer hours on Fridays, performance on Friday afternoons might lack data points. I suggest providing more comprehensive and complete results to understand how potential effects like these were treated.

There were two participants in our sample who did not perform typewriting activities on Fridays, and two additional participants who did not perform typewriting activities in the afternoon. Except for baseline differences due to different typewriting styles, excluding these participants from the analyses did not influence our results.

line 311 (discussion): Given that mental fatigue was measured only on weekdays, but might also occur on Saturdays and Sundays, such a general conclusion might be exaggerated to a certain degree. Thus, the "baseline" measurement on Mondays might not reflect a true baseline for mental fatigue.

We agree with the reviewer that the measurement on Monday might not have reflected a true baseline. However, backspace use remains stable on Wednesday, Thursday and Friday. Additionally, the interkey interval decreases from Wednesday to Friday, reflecting an increase in typing speed. Moreover, the effects of time-on-task and time-of-day on the interkey interval and backspace use do not become more pronounced over the working week. These results suggest that fatigue does not build up over the working week. We clarified this in the discussion on P23 lines 434-441:

“First, we found no evidence for a general decline in typewriting performance with day-of-week, given that backspace use remains stable on Wednesday, Thursday and Friday, and the interkey interval decreases, reflecting an increase in typing speed, from Wednesday to Friday. Second, we found that the effects of prolonged task performance on typing speed and accuracy followed a similar pattern over the different days of the week, suggesting that mental fatigue elicited on the previous day, as reflected in the effects of time-on-task and time-of-day, did not influence the course of performance during prolonged task performance on the next day. These results provide proof that mental fatigue does not accumulate across the days of the week.”

line 329 (discussion): typo "lead", should read "led"

line 333 (discussion): typo "subtitle", should read "subtle"

We corrected the typos in the revised manuscript.

line 336-337 (discussion): please explain in greater detail how the presented results point to information about the amount of hours employees can or should work

We agree with the reviewer that the discussion could use greater detail on the implications of the presented results. We added a paragraph in the discussion (page 25, lines 461-483):

“This study has implications for real-life working environments, given that a large part of the working population regularly performs computer work. In the Netherlands, for example, 40% of the employees perform computer work more than 6 h every day (Hooftman et al, 2019). There are several ways in which monitoring typewriting could support employees during their work. First, personalized real-time feedback based on changes in typing behavior could be provided to the users in order to help them detect when lapses in performance occur and a short break might be beneficial. However, real-time feedback might be biased due to dynamics in typewriting performance that are not related to lapses in performance. One of the characteristics of our working environment is the large variability in working conditions, due to changes in work-related tasks, noise in the working environment, and changes in a person's general state, among others. Our method also allows monitoring performance over a longer period of time, enabling us to detect regularities in working activities. Related to this, a second possibility of our method is to provide feedback on an individual level to help employees realize a more optimal work-break schedule that is complementary with their individual state and specific work-related demands. By comparing behavior dynamics over several weeks, typing behavior could help decide when, during the workday or -week, an employee should work on tasks that need high accuracy or when it is better to work on less demanding tasks. A third option is to use changes in typing behavior to evaluate interventions in the working environment. For instance, it might provide relevant information with regard to performance efficiency for evaluating the effectiveness of a 6-hour workday instead of our regular 8-hour workday.
Previously, researchers already used questionnaires to evaluate this specific intervention, however, measuring performance, and importantly, doing so without interrupting regular activities, could enhance our knowledge of its effects on performance and productivity more objectively.”

Figure 2: please note what is indicated by confidence intervals

The confidence intervals reflect the standard errors of the mean. We added this information to the figures.

Attachment

Submitted filename: Response to the reviewers.docx

Decision Letter 1

Michael B Steinborn

17 Sep 2020

Dynamics in typewriting performance reflect mental fatigue during real-life office work

PONE-D-20-10944R1

Dear Dr. Lorist,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Michael B. Steinborn, PhD

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors did a good job in the revision. The manuscript improved considerably. I can recommend it in the present form.

Reviewer #2: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Acceptance letter

Michael B Steinborn

21 Sep 2020

PONE-D-20-10944R1

Dynamics in typewriting performance reflect mental fatigue during real-life office work

Dear Dr. Lorist:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Michael B. Steinborn

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Appendix

    (DOCX)

    S2 Appendix

    (DOCX)

    Attachment

    Submitted filename: Response to the reviewers.docx

    Data Availability Statement

    Data cannot be shared publicly because participants did not give their consent for sharing the data with other parties than the researchers. Data are available from the Behavioral and Social Sciences Institutional Data Access for researchers who meet the criteria for access to confidential data (research-data-bss@rug.nl).

