PLOS ONE. 2021 Apr 19;16(4):e0249755. doi: 10.1371/journal.pone.0249755

Citizen science with colour blindness: A case study on the Forel-Ule scale

Olivier Burggraaff 1,2,*, Sanjana Panchagnula 1,3, Frans Snik 1
Editor: Adrian G Dyer
PMCID: PMC8055037  PMID: 33872327

Abstract

Many citizen science projects depend on colour vision. Examples include classification of soil or water types and biological monitoring. However, up to 1 in 11 participants are colour blind. We simulate the impact of various forms of colour blindness on measurements with the Forel-Ule scale, which is used to measure water colour by eye with a 21-colour scale. Colour blindness decreases the median discriminability between Forel-Ule colours by up to 33% and makes several colour pairs essentially indistinguishable. This reduces the precision and accuracy of citizen science data and the motivation of participants. These issues can be addressed by including uncertainty estimates in data entry forms and discussing colour blindness in training materials. These conclusions and recommendations apply to colour-based citizen science in general, including other classification and monitoring activities. Being inclusive of the colour blind increases both the social and scientific impact of citizen science.

1 Introduction

Colour measurements are common in citizen science. They are often done using red-green-blue (RGB) consumer cameras such as smartphones [1–3], but also with the human eye. Human colour measurements are used in such diverse fields as coral reef monitoring [4], snail evolution [5], soil surveying [6], climate adaptation [7], and water colour [8–10]. The data are expressed through a qualitative label [5] or by comparison with a colour chart [4, 6, 8–11]. Colour is a useful proxy for underlying properties such as chemical composition [11, 12], and the simplicity of measuring with the eye enables low-cost measurements over large areas and long time series [12, 13].

Accessibility and inclusivity are key to successful citizen science [14, 15]. A large and diverse group of participants increases the social and scientific impact of citizen science [5, 14, 16, 17]. However, recruiting and retaining participants is challenging [14, 15, 18, 19]. Important motivations to participate are a feeling of contributing to science and environmental protection [6, 16–19], learning [7, 17, 18], and simply having fun [9, 14, 16, 17, 19]. Common reasons to stop participating include misunderstanding or not understanding the project [17], perceiving the data as not valuable [4, 16, 18], and difficulty in performing the measurements [6, 9, 18, 20].

While colour vision is often assumed to be universal, many differences exist between individuals. Colour blindness, or colour deficiency, affects up to 9% of men and 2% of women, depending on ethnicity and other genetic factors [21]. It reduces or even eliminates one’s ability to distinguish certain colours, most commonly red and green [21]. Colour blindness is typically congenital [21–25], but can also be acquired through age or disease [25, 26].

Three forms of colour blindness exist, namely anomalous trichromacy, dichromacy, and monochromacy. Each affects the eye’s three pigments in a different way. These pigments are labelled LMS for long-, medium-, and short-wave, respectively, with peak sensitivity wavelengths of 560, 530, and 420 nm [22]. In anomalous trichromacy, a single pigment has an atypical spectral response, reducing one’s colour discrimination abilities [22, 23, 27]. This is called protanomaly, deuteranomaly, or tritanomaly, for the respective LMS pigments. Dichromacy is a complete lack of one pigment, similarly called protanopia, deuteranopia, or tritanopia [22]. Finally, monochromacy is the absence of all but at most one cone type, causing a complete loss of colour vision. Monochromacy is exceedingly rare [22, 24] and is not discussed further in this work.

Colour blindness is often treated as a continuous spectrum from regular colour vision (all pigments present and typical) through degrees of anomalous trichromacy (one pigment atypical) to dichromacy (one pigment wholly missing) [21, 22, 26]. For simplicity, the three LMS deficiencies are referred to as protan, deutan, and tritan, respectively [21]. Protan and deutan are the most common, affecting for example up to 9% of men and 0.6% of women in Europe, as well as 7% of men and 2% of women in China [21]. The prevalence of tritan in the West is on the order of 1:10 000 [25], though higher prevalences have been reported in other locations [28].

Colour blindness limits the accessibility of citizen science that involves colour measurements for up to 1 in 11 participants. However, to our knowledge, little research has gone into its potentially far-reaching consequences. Such work has been done for science communication, for example in designing inclusive colour maps [27, 29].

As a case study, we investigate the impact of colour blindness on water colour measurements with the Forel-Ule (FU) scale. This scale quantifies human water colour measurements [30] by assigning a numerical value from 1–21 to a predetermined set of colours, shown in Fig 1. These range from indigo blue (FU 1) through green (FU 11) to cola brown (FU 21). First used in the 1890s by Forel and Ule [31, 32], it provides the longest continuous record of ocean colour [13]. For instance, Wernand and Van der Woerd used 17 171 archival FU measurements from 1930 to 1999 to derive long-term biogeochemical trends in the Pacific Ocean [12]. Properties of a water body that can be derived from its colour include suspended particles, dissolved organic matter, and algal pigments such as chlorophyll-a [12, 13, 33].

Fig 1. The Forel-Ule scale.


The individual FU colours are shown on the right, with a comparison to the human gamut on the left. The gamut is plotted in (x, y) chromaticity, normalized from CIE XYZ, shown with a constant brightness, and converted to sRGB. The FU scale increases from 1 (bottom left) to 21 (far right). The shaded area represents the full gamut of regular colour vision, while the coloured triangle represents the sRGB colour space, which most computer monitors are limited to. The perceived colours may vary depending on monitor or printer settings and the reader’s own colour vision.

The FU scale is commonly used by professionals [12, 34] and by citizen scientists [8, 10]. Measurements are done by comparing a physical standard colour scale to a water body. For citizen science, the original scale made from 21 vials of pigment mixes [35] may be replaced with plastic filters [10] or a printed version [8], making it easier to use. Having this physical reference reduces the effects of variations in illumination, though in all cases it is difficult to guarantee colour consistency.

We use simulations to determine the effects of colour blindness on FU measurements. Such digital simulations accurately reproduce colour blind vision [27, 36]. The discriminability of the resulting shifted colours is assessed using the CIE ΔE00 colour difference measure [37]. This way, the impact of colour blindness on FU measurements is quantified.

Based on these results, we make general recommendations for dealing with colour blindness in citizen science. These include guidelines for data entry protocols and training materials, benefiting citizen motivation and data quality. Moreover, the methods applied in this work are easily generalized to other colour-based tools. This enables authors to account for colour blindness in the design stage of new citizen science projects. While some projects have opted for simplified colour scales [11], this significantly reduces the information content [33] of all data, including those from colour blind participants. Simplified colour scales are thus generally not an ideal solution to this problem.

Section 2 describes the methods used to simulate colour deficiency and assess colour discriminability. Results are presented in Section 3 and discussed in Section 4. Finally, conclusions and recommendations are drawn up in Section 5.

2 Methods

The colour blindness simulations and analysis were implemented in custom Python scripts available from https://github.com/burggraaff/cbfu.

2.1 Forel-Ule scale

Tristimulus (CIE XYZ) values for the FU scale were derived by Novoa et al. from transmission spectroscopy [35]. The corresponding (x, y) chromaticities are shown in Fig 1.

Four illuminants were considered, namely E (equal-energy) and D55, D65, and D75 (daylight). These illuminants quantify differences in lighting conditions and are used to express colour appearance in a standardised manner [38]. The FU scale is defined with an E illuminant [35] but measurements take place in daylight, making D-type illuminants more representative [10]. Conversion between illuminants was done in XYZ space using the Bradford chromatic adaptation matrices provided on Bruce Lindbloom’s website [39].
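The chromatic adaptation step can be sketched as follows. The Bradford cone-response matrix and the white points of illuminants E and D65 are standard published values; the function name is ours and this is an illustrative sketch, not the authors' released code.

```python
import numpy as np

# Bradford cone-response matrix (standard constants used by the Bradford method)
M_BFD = np.array([[ 0.8951,  0.2664, -0.1614],
                  [-0.7502,  1.7135,  0.0367],
                  [ 0.0389, -0.0685,  1.0296]])

# White points in XYZ (Y normalised to 1): illuminant E and D65
WHITE_E   = np.array([1.0, 1.0, 1.0])
WHITE_D65 = np.array([0.95047, 1.0, 1.08883])

def bradford_adaptation(white_src, white_dst):
    """Build the 3x3 matrix that adapts XYZ colours from one illuminant to another."""
    rho_src = M_BFD @ white_src   # source white in Bradford cone space
    rho_dst = M_BFD @ white_dst   # destination white in Bradford cone space
    return np.linalg.inv(M_BFD) @ np.diag(rho_dst / rho_src) @ M_BFD

M_E_to_D65 = bradford_adaptation(WHITE_E, WHITE_D65)
```

By construction, the source white maps exactly onto the destination white, and adapting back with the reverse matrix recovers the original colour.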

The tristimulus values were first converted to the LMS colour space, representing the relative excitations of the LMS cones [27, 40]. This was done through the Hunt-Pointer-Estevez matrix [40], as shown in Eq (1). Here [L M S]T and [X Y Z]T are the vector representations of a single colour in LMS and XYZ, respectively.

$$\begin{bmatrix} L \\ M \\ S \end{bmatrix} = \begin{bmatrix} 0.38971 & 0.68898 & -0.07868 \\ -0.22981 & 1.18340 & 0.04641 \\ 0 & 0 & 1.00000 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} \tag{1}$$
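In code, Eq (1) is a single matrix-vector product; a minimal NumPy sketch (function names ours, with the matrix inverse used for the reverse transform):

```python
import numpy as np

# Hunt-Pointer-Estevez matrix from Eq (1): XYZ -> LMS cone excitations
M_HPE = np.array([[ 0.38971, 0.68898, -0.07868],
                  [-0.22981, 1.18340,  0.04641],
                  [ 0.0,     0.0,      1.00000]])

def xyz_to_lms(xyz):
    """Convert a CIE XYZ tristimulus vector to LMS cone excitations."""
    return M_HPE @ np.asarray(xyz)

def lms_to_xyz(lms):
    """Convert LMS cone excitations back to CIE XYZ."""
    return np.linalg.inv(M_HPE) @ np.asarray(lms)
```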

2.2 Simulation of colour blindness

Colour blindness was simulated by mapping colours from the LMS colour space representing regular vision to a reduced colour space representing colour deficiency [27, 36, 41]. This is a mathematical representation of how colour appearances shift due to colour blindness, based on the observed colour perceptions of dichromats [36]. Since for dichromats and anomalous trichromats, two out of three cones are unaffected, the responses of those cones to a given colour are unchanged. The simulation replaces the response of the third, deficient cone with the value that makes a regular observer perceive the same colour as the colour blind person [36, 41]. This in turn allows us to apply discriminability metrics developed for regular colour vision to the simulated perceived colours.

The LMS-space vectors cL were modified using a cone-deficiency transfer matrix Tk. Tk is the identity matrix I3 with one diagonal element (Tk00, Tk11, Tk22 for protan, deutan, tritan, respectively) reduced to a relative cone contribution k. This is shown in Eqs (2) and (3) for protan with its respective matrix Tkp [27, 41]. k ranges continuously from 1 (regular vision) to 0 (dichromacy). It represents the relative contribution of a specific cone to colour vision but does not correspond directly to a physical property of the eye. The elements q1 and q2 of Tk shift the response from the deficient cone (L in the example) to the others.

$$\begin{bmatrix} L' \\ M' \\ S' \end{bmatrix} = \begin{bmatrix} k & q_1^p & q_2^p \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} L \\ M \\ S \end{bmatrix} \tag{2}$$
$$\vec{c}_L' = T_k^p \vec{c}_L \tag{3}$$

The cone transfer matrices for protan Tkp, deutan Tkd, and tritan Tkt are as follows:

$$T_k^p = \begin{bmatrix} k & q_1^p & q_2^p \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad T_k^d = \begin{bmatrix} 1 & 0 & 0 \\ q_1^d & k & q_2^d \\ 0 & 0 & 1 \end{bmatrix} \qquad T_k^t = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ q_1^t & q_2^t & k \end{bmatrix} \tag{4}$$

The elements q1, q2 were determined by noting that colour blind people retain regular vision for white and a complementary colour (blue for protan and deutan, red for tritan) [27, 36, 41]. In other words, Tk has eigenvectors wL = [1 1 1]T (white) and either bL (blue) or rL (red), each with eigenvalue 1. This is shown in Eqs (5) and (6).

$$T_k^p \vec{b}_L = \vec{b}_L \qquad T_k^d \vec{b}_L = \vec{b}_L \qquad T_k^t \vec{r}_L = \vec{r}_L \tag{5}$$
$$T_k^p \vec{w}_L = \vec{w}_L \qquad T_k^d \vec{w}_L = \vec{w}_L \qquad T_k^t \vec{w}_L = \vec{w}_L \tag{6}$$

For each case, a system of two equations with two unknowns q1, q2 and one variable k was derived, with Lb, Mb, Sb the LMS coordinates of the blue reference vector bL and Lr, Mr, Sr those of rL:

$$k L_b + q_1^p M_b + q_2^p S_b = L_b \qquad k M_b + q_1^d L_b + q_2^d S_b = M_b \qquad k S_r + q_1^t L_r + q_2^t M_r = S_r \tag{7}$$
$$k + q_1^p + q_2^p = 1 \qquad k + q_1^d + q_2^d = 1 \qquad k + q_1^t + q_2^t = 1 \tag{8}$$

Solving for q1, q2 gave the following expressions:

$$q_1^p = 1 - k - q_2^p \qquad q_1^d = 1 - k - q_2^d \qquad q_1^t = 1 - k - q_2^t \tag{9}$$
$$q_2^p = (1-k)\,\frac{M_b - L_b}{M_b - S_b} \qquad q_2^d = (1-k)\,\frac{L_b - M_b}{L_b - S_b} \qquad q_2^t = (1-k)\,\frac{L_r - S_r}{L_r - M_r} \tag{10}$$

The sRGB blue and red primaries are typically used for bL and rL, respectively, as this technique is used in the field of computer graphics [27, 41]. While other primaries could be used, such as monochromatic wavelengths [36], this makes little difference [27] so we followed the convention.
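Putting Eqs (4)–(10) together, a transfer matrix can be built from k and the reference primaries. The sketch below uses the standard XYZ coordinates of the linear sRGB blue and red primaries, converted to LMS via the Hunt-Pointer-Estevez matrix; function names are ours, and the authors' released code (Section 2) is the authoritative implementation.

```python
import numpy as np

# Hunt-Pointer-Estevez matrix from Eq (1): XYZ -> LMS
M_HPE = np.array([[ 0.38971, 0.68898, -0.07868],
                  [-0.22981, 1.18340,  0.04641],
                  [ 0.0,     0.0,      1.00000]])

# sRGB blue and red primaries in XYZ (columns of the standard sRGB->XYZ matrix),
# converted to LMS; these are the invariant colours of Eqs (5) and (6)
BLUE_LMS = M_HPE @ np.array([0.1805, 0.0722, 0.9505])
RED_LMS  = M_HPE @ np.array([0.4124, 0.2126, 0.0193])

def transfer_matrix(deficiency, k):
    """Cone-deficiency transfer matrix T_k of Eq (4), for 'protan', 'deutan' or 'tritan'."""
    Lb, Mb, Sb = BLUE_LMS
    Lr, Mr, Sr = RED_LMS
    T = np.eye(3)
    if deficiency == "protan":
        q2 = (1 - k) * (Mb - Lb) / (Mb - Sb)   # Eq (10)
        T[0] = [k, 1 - k - q2, q2]             # q1 from Eq (9)
    elif deficiency == "deutan":
        q2 = (1 - k) * (Lb - Mb) / (Lb - Sb)
        T[1] = [1 - k - q2, k, q2]
    elif deficiency == "tritan":
        q2 = (1 - k) * (Lr - Sr) / (Lr - Mr)
        T[2] = [1 - k - q2, q2, k]
    return T
```

The eigenvector conditions of Eqs (5) and (6) hold by construction: white and the complementary primary are left unchanged for any k, and k = 1 recovers the identity matrix (regular vision).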

We calculated Tk for protan, deutan, and tritan with 1 ≥ k ≥ 0 in intervals of 0.01, and transformed the 21 FU colours with each Tk. The modified vectors were then transformed back to XYZ and analyzed. This was implemented in Python through NumPy’s einsum method [42].
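Applying each 3 × 3 transfer matrix to all 21 colours at once is a one-line einsum; a toy sketch with stand-in data:

```python
import numpy as np

rng = np.random.default_rng(0)
fu_lms = rng.random((21, 3))   # stand-in for the 21 FU colours in LMS
Tk = np.eye(3)                 # stand-in transfer matrix (k = 1, regular vision)

# Apply the 3x3 matrix to all 21 colour vectors in one call:
shifted = np.einsum("ij,nj->ni", Tk, fu_lms)
```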

2.3 Colour discrimination

Discriminability of the transformed FU colours was assessed in the CIE Lab (1976) colour space. CIE Lab is approximately perceptually uniform, its components representing lightness (L*), green-red (a*), and blue-yellow (b*) [38]. While FU colour assignment is typically done in (x, y) chromaticity (normalized XYZ) through the hue angle [3, 33], this approach does not work for dichromacy, which reduces the chromaticity plane to a line [36]. The Euclidean distance in XYZ coordinates also could not be used, as XYZ is not perceptually uniform [43].
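The conversion from XYZ to CIE Lab uses the standard CIE formulas; a minimal sketch (D65 white point assumed, function name ours):

```python
import numpy as np

def xyz_to_lab(xyz, white=(0.95047, 1.0, 1.08883)):
    """Convert XYZ (Y normalised to 1) to CIE Lab relative to a reference white (D65 here)."""
    x, y, z = np.asarray(xyz, dtype=float) / np.asarray(white)

    def f(t):
        # Cube root above the CIE cut-off, linear segment below it
        delta = 6 / 29
        return np.cbrt(t) if t > delta**3 else t / (3 * delta**2) + 4 / 29

    fx, fy, fz = f(x), f(y), f(z)
    return np.array([116 * fy - 16,        # L* (lightness)
                     500 * (fx - fy),      # a* (green-red)
                     200 * (fy - fz)])     # b* (blue-yellow)
```

The reference white maps to L* = 100 with a* = b* = 0, and black maps to the origin.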

Discriminability was quantified through the ΔE00 metric [37], which expresses the difference between colour pairs. The full formula for ΔE00 is given in [37] and not reprinted here due to its length; our Python implementation passed all the example cases in said paper. A value of ΔE00 = 2.3 corresponds to a just-noticeable difference (JND), the smallest difference an average observer can distinguish [38, 44].

For each deficiency simulation, the ΔE00 difference between each pair of the 21 transformed FU colours was calculated, giving a 21 × 21 confusion matrix. In this matrix, any colour pairs with ΔE00 < 1 JND cannot be discriminated at all, while pairs with a difference between 1 and 3 JND are discriminable only with difficulty. Pairs with ΔE00 > 3 JND were considered discriminable.
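The confusion-matrix machinery can be sketched as below. For brevity this sketch uses the simple Euclidean ΔE (1976) in Lab space as a stand-in for the full ΔE00 formula used in the paper; the classification thresholds follow the scheme above, and the function names are ours.

```python
import numpy as np

JND = 2.3  # one just-noticeable difference in Delta-E units

def confusion_matrix(lab_colours):
    """Pairwise colour differences for an (n, 3) array of CIE Lab colours.

    Uses Euclidean Delta-E (1976) as a simplified stand-in for Delta-E 2000."""
    diff = lab_colours[:, None, :] - lab_colours[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def classify(delta_e):
    """Label pairs: 0 = indistinguishable (<1 JND), 1 = difficult (1-3 JND), 2 = distinct."""
    jnd = delta_e / JND
    return np.where(jnd < 1, 0, np.where(jnd <= 3, 1, 2))
```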

3 Results

3.1 Colour blindness simulation

The appearance of the FU scale with varying degrees of colour blindness, simulated as in Section 2.2, is shown in Fig 2. The observed changes qualitatively match those seen in previous work [27, 41] and were anecdotally confirmed by one of the authors (deuteranomalous) and a colleague (protanopic). The largest colour shifts are seen for tritan, as expected since it affects the perception of blue light and many FU colours are shades of blue.

Fig 2. Apparent Forel-Ule colours with regular and deficient colour vision.


The (modified) XYZ coordinates were adapted to a D65 illuminant, then converted to the sRGB colour space and gamma expanded [1] for visualization purposes. The perceived colours may vary depending on monitor or printer settings and the reader’s own colour vision. Readers who cannot distinguish between the colours shown here may benefit from taking a colour vision test; many variants are freely available online. The anomalous examples correspond to k = 0.50.

Colour blindness narrows the gamut of the FU scale, as shown in Fig 3. It has little effect on the lightness (L*) of the FU scale but affects its colour components. Protan and deutan (red-green blindness) reduce the range of a* (red-green) while tritan reduces the range of b* (blue-yellow). These shifts imply that colour blindness reduces the ability to discriminate FU colours based on hue, meaning the user will have to rely more on lightness.

Fig 3. Forel-Ule colours in CIE Lab space.


Both regular and deficient vision are included. Regular vision is hidden in the top and bottom panels behind protan and deutan. These affect the a* (green-red) coordinate the most while tritan affects b* (blue-yellow) the most. None of the deficiencies significantly affect L* (lightness).

3.2 Colour discrimination

The discriminability of FU colours is reduced by colour blindness. The confusion matrices for regular and deficient vision, calculated as in Section 2.3, are shown in Fig 4. They show that the reduced ranges in a* (red-green) for protan and deutan and in b* (blue-yellow) for tritan, observed in Section 3.1, reduce the discriminability at opposite ends of the FU scale. The former primarily affect FU 10–21 (green–brown) while tritan affects FU 1–9 (blue–green).

Fig 4. Confusion matrices for regular and deficient colour vision.


The top panels show the full range of ΔE00, while the bottom panels have a narrower colour bar, in units of just-noticeable difference (JND, ΔE00 = 2.3). Even with regular vision, some pairs of FU colours are difficult to distinguish (ΔE00 ≤ 3 JND). Protan and deutan primarily decrease the discriminability of the middle (green) and high (brown) colours, while tritan primarily affects the low (blue) colours, as expected.

Several pairs of FU colours become fully indistinguishable. Deuteranopia causes two colour pairs (FU 19-20 and 20-21) to fall within 1 JND and thus become indistinguishable. For tritanopia, six pairs become indistinguishable, namely 1-2, 1-3, 2-3, 3-4, 4-5, and 5-6. Protanopia does not cause indistinguishable pairs.

Additionally, many more pairs exhibit reduced discriminability. While most adjacent pairs are <3 JND apart even with regular colour vision, deficiency extends this further off the diagonal. In particular, protan and deutan cause confusion between the central colours (FU 9–13), which is also apparent from Fig 3 as they have similar L*, a*, and b*. On the other hand, tritan significantly reduces the discriminability of FU 1–9. As seen in Fig 5, the number of pairs within 3 JND increases from 17 (regular) to 24 (protanopia), 21/24 (deuteranopia/deuteranomaly), or 30 (tritanopia).

Fig 5. Discriminability of Forel-Ule colours.


The median and minimum (left) ΔE00 difference between FU colour pairs, and the number of pairs within 3 and 1 JND (right), are shown as a function of the relative cone contribution k. k ranges from 1 (full colour vision) to 0 (dichromacy), with intermediate values representing partial colour blindness (anomalous trichromacy). Pairs with ΔE00 < 1 JND are fully indistinguishable, pairs with <3 JND are difficult to distinguish (Section 3.2).

These trends also apply to partial colour blindness (anomalous trichromacy). Fig 5 shows the relation between k and the median and minimum ΔE00, as well as the number of indistinguishable pairs. Going from k = 1 to 0, the median ΔE00 decreases smoothly for protan, deutan, and tritan (from 33 to 27, 26, and 22, respectively). The minimum ΔE00 decreases smoothly for protan and deutan (from 3.3 to 2.5 and 2.2, respectively), while the tritan curve is only piecewise smooth. Fully indistinguishable pairs (ΔE00 < 2.3) appear at k ≤ 0.20 for deutan and tritan.

Chromatic adaptation with a daylight illuminant (Section 2.1) did not affect these results. While the ΔE00 between some pairs changed by up to 1 JND, the patterns seen in Figs 4 and 5 remained, as did the previously discussed pairs of non-discriminable colours.

3.3 Practical consequences

In practice, FU measurements always have an uncertainty of at least 1 FU unit. This is due to viewing conditions at the time of measurement, including waves, specular reflections, and uneven illumination. As seen in Section 3.2, adjacent pairs of FU colours are difficult to distinguish (ΔE00 < 3 JND) even with regular vision.

Colour blindness increases the uncertainty on FU measurements. Observers with protan or deutan experience increased difficulty in distinguishing adjacent pairs. Moreover, protans have difficulty distinguishing FU 9–13 while for deutans, FU 19-20 and 20-21 are completely indistinguishable. For a FU 20-type water body, a deutan cannot specify their observation more precisely than 19–21. Furthermore, ΔE00 = 2.33 for FU 18 and 20, further reducing this precision to 18–21 given imperfect viewing conditions. Similarly, since tritans cannot distinguish six pairs of colours in the FU 1–6 range, they can provide little precision on 99% of global surface waters [33].

This increased uncertainty affects data quality and user motivation. This is further discussed in Section 4 and recommended guidelines for considering these issues are given in Section 5.

4 Discussion

Simulating the effects of colour blindness on Forel-Ule (FU) measurements, we have found significant reductions in colour discriminability and hence precision (Sections 3.2 and 3.3). This matches the authors’ and colleagues’ experiences in the field, and the simulation methods are well-attested in other contexts [27, 29, 36]. However, wider validation specific to the FU scale, with participants representing different types of colour blindness, is desirable.

The reduction in precision due to colour blindness reduces the quality and value of citizen science data. The magnitude of this effect depends on the type and severity of colour blindness, as described in Section 3.3. Protans and deutans, the vast majority of colour blind people in the West [21, 25], experience a reduction in median discriminability (ΔE00) between FU colours of up to 21%; for tritans this is 33%. The uncertainty in FU data increases correspondingly, though not evenly. For example, tritans’ ability to identify green-brown waters (FU 10–21) changes little, but they cannot distinguish the blue water types (FU 1–6) that represent most global surface waters [33].

This reduction in data value can be addressed by modifying data entry protocols to include uncertainties. Currently, many citizen science projects require users to provide a single value, for example FU 9 or 10. An entry field for uncertainty, or allowing the user to enter multiple values, accounts for the decrease in selectivity. Participants can estimate this uncertainty themselves. Even FU measurements by participants with regular colour vision have a typical uncertainty of ±1 FU (Section 3.2), which should be accounted for when using them to validate remote sensing data [34]. Colour blindness, particularly dichromacy, increases this uncertainty to up to ±3 FU.

We propose three methods to include uncertainties in data entry forms. The first is simply to include two fields, one for the best estimate (for example FU 9) and one for the estimated uncertainty (e.g. ±2 FU). This method is commonly used in scientific publications but it may be difficult for citizens to understand and apply [45], especially for asymmetric uncertainties. The second method is to have participants estimate a sequential range of possible values (e.g. FU 8–11), optionally including a single best estimate (e.g. FU 9). This is intuitive, simple to apply, and easily translated into traditional uncertainty intervals. It is most applicable for sequential scales like FU where confusion occurs primarily between adjacent numbers (Fig 4). The third method is to have participants select any number of possible values (e.g. FU 8, 9, 11). This is the most general method for discrete colour scales but makes the uncertainties more difficult to process. It is best suited to colour scales with many non-adjacent indistinguishable pairs. Our Python code (Section 2) can be adapted to other colour scales to determine which method is most suitable. A more detailed discussion on handling uncertainty in citizen science data is provided in [46].
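All three entry methods can be normalised to the same internal representation, a set of possible FU values. A sketch, with hypothetical input formats of our own choosing:

```python
def parse_entry(entry):
    """Normalise three hypothetical entry formats to a set of possible FU values (1-21).

    Accepts a (best, uncertainty) tuple like (9, 2), a range string like "8-11",
    or an explicit list like [8, 9, 11]. The formats are illustrative only."""
    if isinstance(entry, tuple):                 # method 1: best estimate +/- uncertainty
        best, unc = entry
        return set(range(max(1, best - unc), min(21, best + unc) + 1))
    if isinstance(entry, str):                   # method 2: sequential range
        lo, hi = (int(v) for v in entry.split("-"))
        return set(range(lo, hi + 1))
    return set(entry)                            # method 3: explicit selection

# Examples:
# parse_entry((9, 2))     -> {7, 8, 9, 10, 11}
# parse_entry("8-11")     -> {8, 9, 10, 11}
# parse_entry([8, 9, 11]) -> {8, 9, 11}
```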

Colour blindness can also affect the motivation of citizen scientists. As discussed in Section 1, participants need to feel they are contributing to science with valuable data. A participant presented with a colour scale where multiple colours appear indistinguishable may dismiss the method as either too difficult or nonsensical, and stop participating [4, 6, 9, 1618, 20]. This is especially true for one unaware of their colour blindness. Since citizen science benefits from a large and diverse group of participants [5, 14, 16, 17], participant retention is important.

Demotivation can be prevented by modifying training materials. Explaining the choice of colour scale and how colour blindness affects its appearance helps participants understand the method. Particular care should be taken in emphasising the value of citizen data, even with colour blindness. For example, while tritans cannot distinguish the FU colours covering the open sea, their ability to distinguish FU 10–21 differs little from regular vision. These cover many inland waters [3, 47], which are commonly studied with the FU scale [30], so training materials should emphasise the value of tritans’ observations there.

Training participants to estimate and provide uncertainties would further help them understand the value of their data [45]. Moreover, since uncertainty estimation is an integral part of professional science, citizen scientists may even gain motivation from learning about it [7, 17, 18]. For existing applications, if modifying data entry forms is impossible, explaining why colours may appear similar and how to pick a single colour would reduce the perceived difficulty.

The severity of these motivational effects and the efficacy of these preventative measures should be tested in practice. Comparing the retention of participants with regular and deficient colour vision, with and without modified training materials and data entry forms, would serve this purpose. This is ideally done in the design stage, as part of a co-creation process [7, 17].

Additional future work includes investigating the effects of other variations in colour perception. Even among those with regular colour vision, variations in colour perception exist [22], including demographical trends [21, 28]. Moreover, monochromacy was not discussed in this work because of its rarity [24] but likely has an even more pronounced effect on colour discriminability than the deficiencies investigated here.

Finally, unrelated to human observations, Fig 3 highlights the importance of lightness in distinguishing FU colours. Many FU index algorithms, which apply the FU scale to remote sensing data, only account for chromaticity [3, 33, 34]. Introducing lightness to these algorithms may improve their precision and accuracy.

5 Conclusions & recommendations

Citizen science projects that depend on colour vision should account for colour blindness, which affects up to 1 in 11 participants. For Forel-Ule water colour measurements, colour blindness reduces the median discriminability between colours by up to 33% and makes multiple pairs of colours fully indistinguishable. This affects data quality and citizen motivation.

Modifying data entry forms to include uncertainty estimates would reduce the impact on data quality. This can be done by letting participants estimate the uncertainty in their measurement or choose multiple colours on the scale. Our provided Python code can be adapted to determine the best suited method for different colour scales. Learning how to estimate uncertainties may also increase participants’ motivation and understanding of science.

The impact on motivation is reduced by including colour blindness in training materials. This includes explaining the colour scale and the difficulties colour blind participants may face, but also emphasising the continued value of their data. Through improved retention, this increases the number and diversity of the participants, which in turn increases both the social and scientific impact of citizen science.

Acknowledgments

The authors wish to thank Mortimer Werther, Steele Farnsworth, Emmanuel Boss, and Akupara Panchagnula for valuable discussions relating to this work. Data analysis and visualization were done using the Matplotlib, NumPy, and colorio libraries for Python. Fig 4 uses the cividis colour map [29]; the line colours in Figs 3 and 5 were obtained from https://davidmathlogic.com/colorblind/.

Data Availability

The data underlying this study are from Novoa et al. 2013 (http://dx.doi.org/10.2971/jeos.2013.13057). All findings can be entirely replicated using the data from Novoa et al. 2013 and the protocol in our Methods section. The authors did not have any special access privileges that others would not have.

Funding Statement

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 776480. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Burggraaff O, Schmidt N, Zamorano J, Pauly K, Pascual S, Tapia C, et al. Standardized spectral and radiometric calibration of consumer cameras. Optics Express. 2019;27(14):19075–19101. doi:10.1364/OE.27.019075
2. Leeuw T, Boss E. The HydroColor app: Above water measurements of remote sensing reflectance and turbidity using a smartphone camera. Sensors. 2018;18(1):256. doi:10.3390/s18010256
3. Novoa S, Wernand MR, van der Woerd HJ. WACODI: A generic algorithm to derive the intrinsic color of natural waters from digital images. Limnology and Oceanography: Methods. 2015;13(12):697–711. doi:10.1002/lom3.10059
4. Marshall NJ, Kleine DA, Dean AJ. CoralWatch: Education, monitoring, and sustainability through citizen science. Frontiers in Ecology and the Environment. 2012;10(6):332–334. doi:10.1890/110266
5. Worthington JP, Silvertown J, Cook L, Cameron R, Dodd M, Greenwood RM, et al. Evolution MegaLab: A case study in citizen science methods. Methods in Ecology and Evolution. 2012;3(2):303–309. doi:10.1111/j.2041-210X.2011.00164.x
6. Bone J, Archer M, Barraclough D, Eggleton P, Flight D, Head M, et al. Public participation in soil surveys: Lessons from a pilot study in England. Environmental Science and Technology. 2012;46(7):3687–3696. doi:10.1021/es203880p
7. Bremer S, Haque MM, Aziz SB, Kvamme S. ‘My new routine’: Assessing the impact of citizen science on climate adaptation in Bangladesh. Environmental Science & Policy. 2019;94:245–257. doi:10.1016/j.envsci.2018.12.029
8. Brewin RJW, Brewin TG, Phillips J, Rose S, Abdulaziz A, Wimmer W, et al. A Printable Device for Measuring Clarity and Colour in Lake and Nearshore Waters. Sensors. 2019;19(4):936. doi:10.3390/s19040936
9. Scott AB, Frost PC. Monitoring water quality in Toronto’s urban stormwater ponds: Assessing participation rates and data quality of water sampling by citizen scientists in the FreshWater Watch. Science of the Total Environment. 2017;592:738–744. doi:10.1016/j.scitotenv.2017.01.201
10. Novoa S, Wernand MR, van der Woerd HJ. The modern Forel-Ule scale: A ‘do-it-yourself’ colour comparator for water monitoring. Journal of the European Optical Society: Rapid Publications. 2014;9:14025. doi:10.2971/jeos.2014.14025
11. Castilla EP, Cunha DGF, Lee FWF, Loiselle S, Ho KC, Hall C. Quantification of phytoplankton bloom dynamics by citizen scientists in urban and peri-urban environments. Environmental Monitoring and Assessment. 2015;187(11):690. doi:10.1007/s10661-015-4912-9
12. Wernand MR, van der Woerd HJ. Ocean colour changes in the North Pacific since 1930. Journal of the European Optical Society: Rapid Publications. 2010;5:10015s. doi:10.2971/jeos.2010.10015s
13. Wernand MR, van der Woerd HJ, Gieskes WWC. Trends in Ocean Colour and Chlorophyll Concentration from 1889 to 2000, Worldwide. PLOS ONE. 2013;8(6):e63766. doi:10.1371/journal.pone.0063766
14. Brouwer S, Hessels LK. Increasing research impact with citizen science: The influence of recruitment strategies on sample diversity. Public Understanding of Science. 2019;28(5):1–16. doi:10.1177/0963662519840934
15. Rambonnet L, Vink SC, Land-Zandstra AM, Bosker T. Making citizen science count: Best practices and challenges of citizen science projects on plastics in aquatic environments. Marine Pollution Bulletin. 2019;145:271–277. doi:10.1016/j.marpolbul.2019.05.056
16. Alender B. Understanding volunteer motivations to participate in citizen science projects: a deeper look at water quality monitoring. Journal of Science Communication. 2016;15(03):A04. doi:10.22323/2.15030204
17. Land-Zandstra AM, Devilee JLA, Snik F, Buurmeijer F, van den Broek JM. Citizen science on a smartphone: Participants’ motivations and learning. Public Understanding of Science. 2016;25(1):45–60. doi:10.1177/0963662515602406
18. Asingizwe D, Poortvliet PM, Koenraadt CJM, van Vliet AJH, Ingabire CM, Mutesa L, et al. Why (not) participate in citizen science? Motivational factors and barriers to participate in a citizen science program for malaria control in Rwanda. PLOS ONE. 2020;15(8):e0237396. doi:10.1371/journal.pone.0237396
19. Vohland K, Land-Zandstra A, Ceccaroni L, Lemmens R, Perelló J, Ponti M, et al. The Science of Citizen Science. 1st ed. Springer International Publishing; 2017. Available from: https://www.springer.com/gp/book/9783030582777.
20. Budde M, Schankin A, Hoffmann J, Danz M, Riedel T, Beigl M. Participatory Sensing or Participatory Nonsense?—Mitigating the Effect of Human Error on Data Quality in Citizen Science. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2017;1(3):1–23. doi:10.1145/3131900
21. Birch J. Worldwide prevalence of red-green color deficiency. Journal of the Optical Society of America A. 2012;29(3):313–320. doi:10.1364/JOSAA.29.000313
22. Neitz J, Neitz M. The genetics of normal and defective color vision. Vision Research. 2011;51(7):633–651. doi:10.1016/j.visres.2010.12.002
23. Jordan G, Deeb SS, Bosten JM, Mollon JD. The dimensionality of color vision in carriers of anomalous trichromacy. Journal of Vision. 2010;10(8):1–19. doi:10.1167/10.8.12
  • 24. Pentao L, Lewis RA, Ledbetter DH, Patel PI, Lupski JR. Maternal uniparental isodisomy of chromosome 14: Association with autosomal recessive rod monochromacy. American Journal of Human Genetics. 1992;50(4):690–699. [PMC free article] [PubMed] [Google Scholar]
  • 25. Alpern M, Kitahara K, Krantz DH. Classical tritanopia. The Journal of Physiology. 1983;335(1):655–681. 10.1113/jphysiol.1983.sp014557 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26. Verriest G. Further studies on acquired deficiency of color discrimination. Journal of the Optical Society of America. 1963;53(1):185–195. 10.1364/JOSA.53.000185 [DOI] [PubMed] [Google Scholar]
  • 27. Viénot F, Brettel H, Mollon JD. Digital Video Colourmaps for Checking the Legibility of Displays by Dichromats. Color Research & Application. 1999;24(4):243–252. [DOI] [Google Scholar]
  • 28. Hashemi H, Khabazkhoob M, Pakzad R, Yekta A, Heravian J, Nabovati P, et al. The prevalence of color vision deficiency in the northeast of Iran. Journal of Current Ophthalmology. 2019;31(1):80–85. 10.1016/j.joco.2017.05.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29. Crameri F, Shephard GE, Heron PJ. The misuse of colour in science communication. Nature Communications. 2020;11:5444. 10.1038/s41467-020-19160-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30. Wernand MR, van der Woerd HJ. Spectral analysis of the Forel-Ule ocean colour comparator scale. Journal of the European Optical Society: Rapid Publications. 2010;5(0):10014s. 10.2971/jeos.2010.10014s [DOI] [Google Scholar]
  • 31. Ule W. Die bestimmung der Wasserfarbe in den Seen. Kleinere Mittheilungen Dr A Petermanns Mittheilungen aus Justus Perthes geographischer Anstalt. 1892; p. 70–71. [Google Scholar]
  • 32. Forel FA. Une nouvelle forme de la gamme de couleur pour l’étude de l’eau des lacs. Archives des Sciences Physiques et Naturelles/Société de Physique et d’Histoire Naturelle de Genève. 1890;6(25). [Google Scholar]
  • 33. Pitarch J, van der Woerd HJ, Brewin RJW, Zielinski O. Optical properties of Forel-Ule water types deduced from 15 years of global satellite ocean color observations. Remote Sensing of Environment. 2019;231:111249. 10.1016/j.rse.2019.111249 [DOI] [Google Scholar]
  • 34. Nie Y, Guo J, Sun B, Lv X. An evaluation of apparent color of seawater based on the in-situ and satellite-derived Forel-Ule color scale. Estuarine, Coastal and Shelf Science. 2020;246:107032. 10.1016/j.ecss.2020.107032 [DOI] [Google Scholar]
  • 35. Novoa S, Wernand MR, van der Woerd HJ. The Forel-Ule scale revisited spectrally: preparation protocol, transmission measurements and chromaticity. Journal of the European Optical Society: Rapid Publications. 2013;8(0):13057. 10.2971/jeos.2013.13057 [DOI] [Google Scholar]
  • 36. Brettel H, Viénot F, Mollon JD. Computerized simulation of color appearance for dichromats. Journal of the Optical Society of America A. 1997;14(10):2647–2655. 10.1364/JOSAA.14.002647 [DOI] [PubMed] [Google Scholar]
  • 37. Sharma G, Wu W, Dalal EN. The CIEDE2000 color-difference formula: Implementation notes, supplementary test data, and mathematical observations. Color Research and Application. 2005;30(1):21–30. 10.1002/col.20070 [DOI] [Google Scholar]
  • 38. Sharma G. Digital Color Imaging Handbook. Taylor & Francis Ltd; 2003. [Google Scholar]
  • 39.Lindbloom B. Chromatic Adaptation; 2017. http://www.brucelindbloom.com/index.html?Eqn_ChromAdapt.html.
  • 40. Hunt RWG. Revised colour-appearance model for related and unrelated colours. Color Research & Application. 1991;16(3):146–165. 10.1002/col.5080160306 [DOI] [Google Scholar]
  • 41.Schmitz J. Color Blindness Simulation Research; 2020. https://ixora.io/projects/colorblindness/color-blindness-simulation-research/.
  • 42. Harris CR, Millman KJ, van der Walt SJ, Gommers R, Virtanen P, Cournapeau D, et al. Array programming with NumPy. Nature. 2020;585(7825):357–362. 10.1038/s41586-020-2649-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43. Chickering KD. Perceptual Significance of the Differences between CIE Tristimulus Values. Journal of the Optical Society of America. 1969;59(8):986–990. 10.1364/JOSA.59.000986 [DOI] [PubMed] [Google Scholar]
  • 44. Mahy M, Van Eycken L, Oosterlinck A. Evaluation of Uniform Color Spaces Developed after the Adoption of CIELAB and CIELUV. Color Research & Application. 1994;19(2):105–121. 10.1111/j.1520-6378.1994.tb00070.x [DOI] [Google Scholar]
  • 45. Gustafson A, Rice RE. A review of the effects of uncertainty in public science communication. Public Understanding of Science. 2020;29(6):614–633. 10.1177/0963662520942122 [DOI] [PubMed] [Google Scholar]
  • 46. Jiménez M, Triguero I, John R. Handling uncertainty in citizen science data: Towards an improved amateur-based large-scale classification. Information Sciences. 2019;479:301–320. 10.1016/j.ins.2018.12.011 [DOI] [Google Scholar]
  • 47. Wang S, Li J, Zhang B, Spyrakos E, Tyler AN, Shen Q, et al. Trophic state assessment of global inland waters using a MODIS-derived Forel-Ule index. Remote Sensing of Environment. 2018;217:444–460. 10.1016/j.rse.2018.08.026 [DOI] [Google Scholar]

Decision Letter 0

Adrian G Dyer

9 Mar 2021

PONE-D-21-04836

Citizen science with colour blindness: A case study on the Forel-Ule scale

PLOS ONE

Dear Dr. Burggraaff,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

I have so far had the manuscript reviewed by one expert in the field, and I have also carefully read the manuscript, which is close to my research area. Reviewer 1 sees a lot of value in the study, and has provided a couple of points that would likely enhance understanding for the broad readership of PLOS ONE. If you can make these revisions, I will look at the paper again in detail and it should be possible to proceed.

Please submit your revised manuscript by Apr 23 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Adrian G Dyer, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Burggraaff et al. present an interesting study evaluating the effect of various known colour vision anomalies on the colour evaluation of RGB images, such as those used for characterising water quality with the Forel-Ule scale.

In my opinion the authors present the results of a well-designed experiment and base their conclusions on the observed data. I think this manuscript is suitable for publication in PLOS ONE after the authors have addressed the two minor points detailed below.

## SPECIFIC COMMENTS ##

1. Subsection 2.2 presents details of the calculations necessary to simulate the effect of the various colour deficiency conditions on the discriminability of the samples present in the FU scale. However, readers would benefit from an early introduction to the rationale of this mapping, perhaps in the Introduction section or early in Subsection 2.2. This could be done by discussing the methods of Brettel et al. (1997), on which the simulations are based.
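[Editorial note: for readers unfamiliar with the mapping this comment refers to, the simplified dichromat-simulation pipeline of Viénot, Brettel & Mollon (1999, ref. 27) can be sketched as below. This is a minimal illustration using the published matrices for linear RGB, not a reproduction of the authors' exact analysis code; the function name is ours.]

```python
import numpy as np

# Linear-RGB -> LMS matrix from the simplified pipeline of
# Vienot, Brettel & Mollon (1999). In LMS space, the dichromat's
# missing cone response is replaced by a linear combination of the
# two remaining cone responses, chosen so that white maps to itself.
RGB2LMS = np.array([[17.8824,    43.5161,   4.11935],
                    [ 3.45565,   27.1554,   3.86714],
                    [ 0.0299566,  0.184309, 1.46709]])
LMS2RGB = np.linalg.inv(RGB2LMS)

REDUCTION = {
    # protanope: L response rebuilt from M and S
    "protanopia":   np.array([[0.0, 2.02344, -2.52581],
                              [0.0, 1.0,      0.0],
                              [0.0, 0.0,      1.0]]),
    # deuteranope: M response rebuilt from L and S
    "deuteranopia": np.array([[1.0,      0.0, 0.0],
                              [0.494207, 0.0, 1.24827],
                              [0.0,      0.0, 1.0]]),
}

def simulate_dichromat(rgb_linear, kind="protanopia"):
    """Return the simulated dichromat appearance of a linear-RGB colour."""
    lms = RGB2LMS @ np.asarray(rgb_linear, dtype=float)
    return LMS2RGB @ (REDUCTION[kind] @ lms)
```

Greys are preserved by construction, while a saturated red collapses towards a dark yellowish grey for a protanope — the kind of confusion that can make neighbouring Forel-Ule colours indistinguishable.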

2. The authors propose modifying current data entry protocols to include measurements of uncertainty associated with reference similarity for normal and deficient colour vision (lines 217-220). This is a very interesting idea and a useful conclusion of the paper. I recommend expanding on this idea based on the presented results. By doing so, this manuscript will become a very useful tool for designing new protocols that improve not only the data collected but also the efficiency of collection.
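[Editorial note: one hypothetical shape such an uncertainty-aware entry form could take is sketched below. The function and field names are illustrative only and are not part of any existing Forel-Ule protocol.]

```python
from statistics import mean

def record_fu_observation(best_match, also_plausible=()):
    """Record a Forel-Ule reading with an explicit uncertainty estimate.

    best_match: the FU colour (1-21) the observer judged closest.
    also_plausible: any other FU colours the observer could not rule
    out, e.g. neighbouring scale colours that look alike to them.
    """
    candidates = sorted({best_match, *also_plausible})
    return {
        "fu_best": best_match,
        "fu_candidates": candidates,
        "fu_estimate": mean(candidates),                         # central value
        "fu_uncertainty": (candidates[-1] - candidates[0]) / 2,  # half-range
    }
```

An observer who cannot separate FU 12 from its neighbours would then report `record_fu_observation(12, [11, 13])`, yielding an estimate of 12 ± 1 rather than a spuriously exact reading.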

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Decision Letter 1

Adrian G Dyer

25 Mar 2021

Citizen science with colour blindness: A case study on the Forel-Ule scale

PONE-D-21-04836R1

Dear Dr. Burggraaff,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Adrian G Dyer, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Adrian G Dyer

8 Apr 2021

PONE-D-21-04836R1

Citizen science with colour blindness: A case study on the Forel-Ule scale

Dear Dr. Burggraaff:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Adrian G Dyer

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    Attachment

    Submitted filename: Response to Reviewers.pdf

    Data Availability Statement

    The data underlying this study are from Novoa et al. 2013 (http://dx.doi.org/10.2971/jeos.2013.13057). All findings can be entirely replicated using the data from Novoa et al. 2013 and the protocol in our Methods section. The authors did not have any special access privileges that others would not have.

