Table 7. Privacy and security top policy options
Issue statement: Trust in the security of a commons is difficult to build, given that privacy breaches can never be completely eliminated, protections vary by jurisdiction, and the laws, regulations, and norms protecting privacy change over time. Further, links among data sets are needed to interpret cancer risk, yet linked data are more identifiable, so privacy risks increase.

| Top three policy options (label)* | Points to consider | Illustrative quotes (study ID) |
|---|---|---|
| Funders, clinical labs, individual researchers, institutions, and end users of data could adopt federated models of data sharing, in which data are uploaded and downloaded locally rather than held in a centralized database, minimizing the risks of re-identification and reducing harms from security breaches (PS10; see the illustrative sketch following this table) | Fewer copies = greater security | ‘The best improvement on data protection is to have fewer copies of the data requiring protection. Federation is a great step forward for this’ (014) |
| | Need to evaluate tradeoffs (utility) | ‘Federated models do reduce the risk - but at what cost to utility (hence why I said, “don’t know” from a feasibility standpoint)’ (022) |
| | Need evidence that federated models work in the hereditary cancer context | ‘Federated models have a part to play (privacy by design), but risk being interpreted too narrowly (data never moves) and have still not been shown to be feasible outside some health surveillance networks’ (024) |
| Funders, institutions, and end users of data could invest in the development and use of novel technologies geared toward protecting privacy and enhancing data security (eg leveraging synthetic data to reduce re-identification risk, and secure computational methods to allow analysis of data without moving them) (PS9) | Prevention is best | ‘Technological measures to prevent data breaches/misuse are the best method of preventing these occurrences’ (001) |
| | Technology is still immature | ‘Privacy-preserving technologies have a part to play but remain immature …’ (024) |
| | Need to recognize ‘arms race’ dynamic with bad actors | ‘While I think that investing in the development and use of novel technologies geared toward protecting privacy and enhancing data security is a good idea, it is an arms race with no end in sight unless there is lawmaking that removes the incentives to hacking these data’ (012) |
| | Need to weigh tradeoffs (utility) | ‘Technical safeguards (eg differential privacy, homomorphic encryption, secure multiparty computation, federated learning) can offer measurable, effective privacy protection but they have a cost in terms of computational efficiency, accuracy of results, and administrative overhead’ (021) |
| | Faux ideology of effective technology | ‘[E]veryone wants magic tech to fix this, but I haven’t seen any of that tech work at anything approaching meaningful scale’ (015) |
| Data resources, institutions, and individual researchers could be more transparent about security risks and potential harms (PS8) | Transparency may lead to public pressure on lawmakers | ‘Research on privacy risks and costs will help spur lawmaking as will greater transparency (and the public awareness that comes with it)’ (012) |
| | Need balanced, non-alarmist messaging | ‘Along with the idea of being more transparent about risks and harms, it is important also to mention benefits and gains’ (010) |
| | Transparency alone is not a solution | ‘Just being more transparent/clear about the risks seems to do little to solve the problem’ (022) |
*See supplementary tables for all policy options, labeled by domain (number), eg PS1 for the first listed policy option in the privacy and security domain.
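The mechanisms behind PS10 and PS9 can be made concrete with a minimal sketch. This is an illustration, not the study’s proposal: each site computes an aggregate summary behind its own firewall, and only counts (optionally perturbed with Laplace noise, the basic mechanism of the differential privacy mentioned in quote 021) travel to a coordinator. All names here (`SITES`, `local_summary`, `federated_carrier_rate`) are hypothetical, and production systems add authentication, auditing, and governance layers omitted from this toy version.

```python
# Toy sketch of federated analysis (PS10) with an optional differential-privacy
# step (PS9). All identifiers are hypothetical illustrations, not study assets.
import math
import random

# Each site holds its own records; raw rows never leave the site.
SITES = {
    "site_a": [{"carrier": True}, {"carrier": False}, {"carrier": True}],
    "site_b": [{"carrier": False}, {"carrier": False}, {"carrier": True}],
    "site_c": [{"carrier": True}, {"carrier": True}, {"carrier": False}],
}


def local_summary(records):
    """Runs inside the site's firewall: only aggregate counts are returned."""
    return {"n": len(records), "carriers": sum(r["carrier"] for r in records)}


def laplace_noise(scale):
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def federated_carrier_rate(sites, epsilon=None):
    """Pool per-site summaries; the coordinator never sees individual records.

    If epsilon is given, Laplace noise (scale 1/epsilon, count sensitivity 1)
    is added to each released count -- a toy differential-privacy step.
    """
    total_n = total_carriers = 0.0
    for records in sites.values():
        summary = local_summary(records)
        n, carriers = float(summary["n"]), float(summary["carriers"])
        if epsilon is not None:
            n += laplace_noise(1.0 / epsilon)
            carriers += laplace_noise(1.0 / epsilon)
        total_n += n
        total_carriers += carriers
    return total_carriers / total_n


if __name__ == "__main__":
    print(f"exact pooled carrier rate:  {federated_carrier_rate(SITES):.3f}")
    print(f"noised pooled carrier rate: {federated_carrier_rate(SITES, epsilon=1.0):.3f}")
```

The utility tradeoff raised in quotes (021) and (022) is visible directly in the sketch: the same noise that protects individual records also perturbs the pooled estimate.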