Abstract
Drawing an analogy to past debates over biotechnology, some stakeholders fear that synthetic biology (SB) could raise public concerns. Accordingly, ‘lessons from the past’ should be applied to avoid controversies. However, biotechnology in the 1990s is not the only possible comparator. The potential to become contested has been attributed to a number of other novel technologies. Looking at nanotechnology, for example, controversies have not materialised to the extent predicted. The article discusses factors relevant to controversies over technologies, as well as differences from the situation when modern biotechnology began to proliferate. Certain properties attributed to SB in the discussion so far indeed suggest a potential for controversies of its own, but perceptions may follow those of other aspects of biotechnology, subject to local contingencies. Finally, it is questioned whether ELSI research should see its task as applying lessons from the past to ease technology introduction. Today, rather than seeing themselves as embedded in a linear model of technology development, social scientists take an interest in developments ‘upstream’, where technologies take shape.
Keywords: Synthetic biology, Public perception, Public debate, Converging technology, Biotechnology, Nanotechnology, Embedded scientist
Whenever a novel technology is introduced, the stakeholders involved promise huge benefits for the future, but sometimes they get nervous. Will the public see it the same way? While many technologies have become an appreciated part of our daily lives, others such as agricultural biotechnology have met with reluctance or rejection among the public. With a new technology1 such as synthetic biology,2 the question many ask themselves is whether history will repeat itself, i.e. whether there will be a public controversy. Can we learn from past experiences in order to avoid a controversy in the future? Rather than assessing whether the comparison with past debates over biotechnology is substantiated, in this discussion paper I will argue that while comparisons may provide insights, the instrumental focus on ‘learning’ in order to ease technology introduction is misplaced and points to a skewed perception of the role of social scientists. To this end, I will briefly address (i) new converging technologies and their possible public perception; (ii) how nanotechnology has fared in comparison; (iii) some elements influencing public debate; (iv) the case of synthetic biology and (v) some possible topics of a future controversy. In the last part (vi), the role of social scientists will be addressed.
Converging technology perceptions
Over the last 50 years, a series of so-called key technologies such as nuclear power, information technology or biotechnology have been the focus of policy makers. Gaining a competitive advantage in these fields was said to be a precondition for every industrialised nation to stay on top. Today, a number of new ones such as nano- and cognitive technology have been added. Rather than replacing each other, they are said to converge and give rise to unforeseen novel technologies that may enable developments in various fields and deeply influence the way we live (Roco and Bainbridge 2002; Nordmann 2004). Synthetic biology has been considered to be such a ‘converging’ technology (de Vriend 2006).3 It is part of modern biology, but other disciplines such as chemistry, computer science and engineering have added to its genesis and development. Apart from interdisciplinary research, the term convergence emphasises unprecedented progress in creating the next wave of key technologies. It is often associated with the idea of a race for competitive advantages involving several technologies at the same time.
Such a technological race does not always go undisputed. In the past, several key technologies such as nuclear power and some aspects of biotechnology have met with criticism. The question for many stakeholders is whether new buzzwords such as ‘nano’ today and, possibly, ‘synbio’ in the future will be perceived as indicating something new or as denoting an extension of previous technologies (IRGC 2008), and which of the ‘mother’ technologies will determine their public perception. In fact, ‘convergence’ may have an additional meaning: European technology developers seem to converge in their fear that the public might react negatively. Concern over public acceptance is one of the few common features of these highly diverse fields. Since technology developers have a fundamental interest in preventing non-acceptance, and since obviously there is ample experience to learn from, social scientists have been asked (mostly under the umbrella of ELSI research) to investigate the societal consequences of and discourses over technologies, and thus to find out what went wrong with biotechnology in the past and what should be done in the future to avoid similar developments.
Predictions of consequences from technologies are social constructs by their very nature and thus subject to debate. The history of such technology debates shows that there is no universal trigger for discontent (Bauer 1995); rather, some issues might render a technology more prone to criticism. Various types of risk carry different potentials to influence public perceptions (Slovic 1987). A particularly important source of concern is a potential health risk. It is most frightening, for example, if the source of a risk is both difficult to contain and invisible, such as with radiation or ‘genes’, and if people cannot avoid it because the cause cannot be smelled, seen or heard. Particularly disturbing are differing expert opinions on the magnitude, impact or comparator of a risk, and on whether or not it is entirely new. These differing accounts are often linked to the alleged interests of the experts involved in the assessments, or of those on whose behalf they speak. Another factor is the distribution of benefits: if it is perceived as skewed, the technology comes under scrutiny. With agricultural biotechnology, for example, consumer risks were attributed to modes of production that only benefited the producers, while economic arguments emphasising increases in competitiveness turned out not to be persuasive (Torgersen et al. 2002). Where the prospects were portrayed as extremely promising, any suspicion of a hidden risk for human health and the environment was taken up with particular scrutiny (Bauer and Gaskell 2002).
Despite providing some insights into their mechanisms, experience so far has shown that controversies and their political consequences arise from local contingencies (Bernauer and Meins 2003) and thus remain hardly predictable. As a consequence, they can be considered unavoidable, which means that any attempt at preventing them pro-actively may be futile.
Nanotechnology, for example
Assessing the possibility of a future conflict over a novel technology is nevertheless tempting. One of the first questions is what to compare synthetic biology with. Agricultural biotechnology suggests itself as the proverbial bone of contention, but its single-issue character and its close link to food render it quite different. Nanotechnology, in contrast, is even broader in its technological basis and range of applications than synthetic biology. In fact, the term only provides a rhetorical umbrella over a bundle of technologies that deliberately handle matter on a very small scale (Schmid 2008). Potential applications are so variegated that any generalised statement on risks or benefits seems beside the point. Despite technical links, comparing nanotechnology to synthetic biology on the basis of their intrinsic properties is therefore not very sensible. However, both belong to the set of converging technologies in the above understanding: they are novel, assumed to become key enabling technologies and expected to provoke concerns regarding public acceptance.
Nanotechnology as a term is more common than synthetic biology, without having acquired a clear status yet. Grunwald and Fleischer (2007) identified four areas of possible discourses: apart from ‘classical’ risks to human health and the environment from materials (e.g. nano-particles), there are more speculative debates over the potential for ‘disruptive’ innovations (e.g. nanobots), a number of generic issues arising from enabling applications in different fields (e.g. privacy and RFID), and broader governance issues (e.g. trust and accountability) because nanotechnology might be considered a ‘risky’ technology. In public debate4 so far, nano-particles have rhetorically been taken to stand for the entire technology. Similar to biotechnology, health risks and governance issues gained most prominence here.
Several of the ‘contentious’ characteristics identified above can also be attributed to nano-particles. Experts assert that there may be risks not yet investigated, but their significance remains unclear. Apart from uncertainty over risks to human health, there is even more uncertainty over environmental impacts in the long run (Colvin 2003). As a consequence, insurance companies initially denied coverage. Part of their problem was that it was unclear what to compare nano-particles with, and which measures would be adequate to contain potential risks (Swiss Re 2004). Although some progress has been made, there is still no conclusive assessment. With respect to the distribution of benefits, consumers may take advantage of some materials, while others mainly offer opportunities for streamlining production processes without the consumer benefiting. In addition, future benefits were oversold (Schmid 2008).
After 2000, some CSOs5 began to address nano-particles. The Canadian ETC Group (mostly dealing with agricultural biotechnology issues) started a campaign on uncertain environmental and health effects. Considering the experience with biotechnology, technology developers imagined public opposition, particularly if ‘something happened’, i.e. if a major incident occurred that could be attributed to artificial nano-particles. Consequently, nanotechnology became a playground for attempts to address future public opposition. Under the header of ‘what can we learn’, a main conclusion was to advocate research on the health risks of nano-particles (European Commission 2005; Maynard 2006) and their environmental properties. This should contribute to a credible risk assessment and management, not only to prevent harm but also to contain outrage in case ‘something happened’. Developers and authorities would then be able to claim that they had acted responsibly. Apart from the protection against harm, this responsibility argument was a main reason for research into risks from nano-particles (DEFRA 2005).
Irrespective of the (ir)reality of a health risk,6 the feared public hostility towards nanotechnology does not seem imminent, though. Technology developers have been using ‘nano’ as a marketing asset even for products that contain no nanomaterials, which shows that the term conveys a positive image indicating the latest technological achievements in very different products. This image is not subject to a rational debate over the pros and cons; rather, it emerges from, and addresses, fragmented perceptions among the public. The positive image is quite robust: in spring 2006, a German company ran into trouble with a household cleaning spray baptised ‘Magic Nano’ (which did not contain nano-particles). Consumers who accidentally inhaled the spray had to be hospitalised (Giftinformationszentrum Nord 2006). This was the sort of incident technology developers feared, regardless of the cause. However, the German media were less interested than those in the US and UK, and even before it was clear that no nano-particles were involved, CSOs did not take up the issue. If genetically modified organisms had been (said to be) involved, the outcome might have been quite different. Obviously, Germans did not easily take fright at nanotechnology, but this was not the result of a particularly precautionary way of introducing it. Consumer products containing nano-particles had been put on the market without any precautionary measures. The technology had been deployed through the back door, as in many other cases, and nobody had cared.
Factors influencing the debate
This puzzled some observers, but upon closer inspection a number of reasons emerge why nanotechnology, or nano-particles in this case, might have fared better in the public’s mind than agricultural biotechnology. Compared to the 1990s, a shift in problem attention could have led to a general decline in the salience of environmental and technology issues over recent years (Eurobarometer 2005). One explanation frequently given is that pressure on the individual towards higher performance made people worry about other things. Another, more convincing, argument is that interest in environmental issues has been redirected to the more pressing issue of climate change. Although general attitudes towards contested technologies such as genetically modified food have not substantially changed over the years (Gaskell et al. 2006), extending these attitudes to a new item would require re-igniting past discourses on technological risk while other issues were to the fore.
Secondly, the technology sector might be more careful in marketing novel food products when it feels that acceptance is uncertain. For non-food products from nanotechnology already on the market, a possible lack of acceptance has obviously not been a consideration, given the then positive image of ‘nano’. It is indicative that in the meantime companies, upon request, are very reluctant to say whether some of their products contain nano-particles (A. Gazsó, pers. comm.). Apart from commercial secrecy over formulations, this can be interpreted as an indication that they have become nervous.
Thirdly, decision makers in many European countries might have reacted to the experiences with food controversies. They adopted new ways of reconciling demands from different actors in the presence of uncertainty over risks. Under the header of ‘governance’, they devised measures (rhetorically) incorporating stakeholders in the decision-making process and rendering them co-responsible for the outcome. The EU strategy on science and society (European Commission 2001) showed that at least talking about governance is considered important. In the same vein, an increasing number of scientists seem to embrace the need to consider ethical, legal and social issues linked to the subject of their research.
Fourthly, since top-down PR approaches or ‘rational’ exercises in the public understanding of science and technology (PUS) have had little effect in terms of acceptance of contested technologies (Dierkes and von Grote 2000), more open, two-way public debates have officially been recommended as a prerequisite for enhancing the social embedding of a technology (European Commission 2004). Consequently, a frequently heard proposition was to enhance public debate over novel technologies such as nanotechnology (Meili 2006).
A public debate, however, is not easily elicited over something that is hard to understand and has yielded few products on the market. Experiments have shown that in debates, people are interested in risks and benefits, even potential ones, if these appear salient to them (Wagner and Kronberger 2006). To induce a fruitful discussion, a debate must therefore be free to address whatever the participants think is relevant, including risks but also the interests or responsibilities of actors. This may have little to do with a risk being scientifically plausible or not. Triggering a ‘rational’ public debate on scientifically implausible risks is an oxymoron: what is salient and worth debating from a public point of view is often held to be implausible, hence irrelevant, from a scientific standpoint. In addition, if any negative aspects came to light, a public debate could stain an initially positive image of a technology. With nanotechnology, there are more concerns about nano-particles among scientists and technology developers than among the public (Scheufele et al. 2007), and they realistically fear that a public debate would put the blame on the technology even if ‘nothing happens’.
Synthetic biology: the next wave?
According to the Synthetic Biology Community homepage, synthetic biology aims at “the design and construction of new biological parts, devices, and systems, and the re-design of existing, natural biological systems for useful purposes”.7 This leaves traditional biotechnology far behind in scope; genetic engineering appears as mere handicraft in comparison. Synthetic biology promises to lay the foundation of a new industry, not unlike microelectronics decades ago (Endy 2005), or at least to form a significant part of the bio-economy to come (OECD 2009). Hence, the promises are no less sweeping than those made for nanotechnology.
Although few lay people have heard about it (Hart 2008), aims such as the construction of entire new genomes, new types of organisms or artificial forms of life with new genetic elements could trigger a lay public’s suspicion that scientists have gone mad. Early on, the ETC Group took up the issue. Its first report on synthetic biology called the new approach ‘extreme genetic engineering’ or ‘GMOs on steroids’ (ETC Group 2007). The slogan alluded to old controversies over GM food and hormone (mis)use.
The scientific community dealt with this challenge by emulating the approaches taken to mitigate risks from genetic engineering decades ago. The allusion to the Asilomar conferences and the NIH guidelines of the 1970s was no coincidence; the motto was self-governance by scientists rather than state action. This was a foreseeable trigger for critics. In 2006, 38 CSOs signed the ETC Group’s open letter demanding a societal debate on the socioeconomic, security, health, environmental and human rights implications. The second annual conference on synthetic biology in 2006 in California addressed possible societal implications of synthetic biology more prominently, issuing a resolution on biosecurity and biosafety. Scientists called for more prudence and for anticipating potential risks and public unease (Maurer et al. 2006), but they abstained from addressing broader political and socioeconomic issues. In the following years, CSOs repeatedly attempted to broaden the view, while scientists successfully kept the focus on a restricted range of issues around biohazards.
Unlike at Asilomar, and much in line with contemporary issues in US mainstream discourses, most concerns related to biosecurity. Participants focussed on measures to prevent the potential intentional misuse of research results for sinister aims and, especially, terrorism. Adequate measures, accordingly, were seen in the self-control of the scientists and engineers involved as well as in the surveillance of research laboratories and of companies supplying DNA building blocks. Apart from screening for ‘dangerous’ DNA sequences and watch-lists for companies and individuals, the recommendations included a professional obligation to confidentially report ‘dangerous behaviour’ of colleagues, a clearinghouse and more security research. The move towards self-regulation to prevent terrorist attacks was intended to pre-empt US Government intervention (Check 2006) and inevitably entailed secrecy and suspicion among colleagues, extending practices from biological warfare research to civilian issues. In a way, the resolution appeared to be a brainchild of mid-decade US preoccupations.
Initially, synthetic biology and its implications elicited rather little interest at the national level in most European member states, while EU research policy took up the issue (NEST 2005) and launched several projects not only on scientific but also on ethical, legal and social issues.8 In contrast to the US view, many scientists considered the prevention of risks from unanticipated consequences to be equally relevant (Schmidt 2006). With notable exceptions (Church 2005), leading US scientists had attributed such concerns to the ‘usual European scare-mongering’ (Schmidt, pers. comm.), while some Europeans had diagnosed ‘terrorism paranoia’ in the US.
At the third annual conference in Zurich in 2007,9 societal aspects of synthetic biology, including intellectual property rights and ethics, gained more prominence and provided a (limited) stage for CSO views. The next conference, held in 2008 in Hong Kong,10 followed along these lines, with the ETC Group organising a session on global societal impacts and inviting speakers from outside the scientific community to voice their concerns. Despite their primary dedication to scientific and technical issues, the SB 3.0 and 4.0 conferences thus provided some opportunities to address issues broader than safety and security, such as distributional equity and differing views of a desirable future.
In the meantime, a number of institutions dealing with policy analysis and ELSI research, such as the Woodrow Wilson Institute11 in the US or the Rathenau Instituut12 in the Netherlands, had taken up the issue. Over time, national (Balmer and Martin 2008) and international research organisations (NEST 2005) and other scientific bodies (IRGC 2008) joined. The ‘Human Practices’ Thrust of SynBERC in the US tried to integrate research on societal aspects into a scientific-technical project in a novel way.13 The Synthetic Society Working Group considers itself “a group of individuals who are working to directly address societal issues embedded and surrounding the emerging field of synthetic biology”.14 By 2006, synthetic biology had arrived on the radar screen of technology assessment and the social studies of science and technology as a proverbial example of converging technologies, alluding to the implications of a new technology race. Immediately, the task was set to assess its potential for raising concerns among the general public.
Possible topics of debate
For those reminded of the biotechnology controversy, synthetic biology offered certain aspects for public concern. The Rathenau Instituut (de Vriend 2006) highlighted a number of arguments in an effort to identify future issues of debate at an early stage. Most of them refer to problems to be dealt with at an expert level, such as biosafety, biosecurity, intellectual property rights or particular ethical aspects. A pertinent question, depending on the definition, is whether synthetic biology is something new or a mere extension of genetic engineering with more powerful tools (IRGC 2008), implying that existing regulation and methods of risk assessment with conventional criteria (properties of ‘donor’ and ‘acceptor’ organisms) are sufficient. Some voices warned that this might fail to properly establish safety because of the greater possibilities of synthetic biology (Rodemeyer 2009; Schmidt 2009). Currently, most members of the scientific community seem to consider existing rules still adequate and assessment criteria applicable (M. Schmidt, based on a series of interviews).15 However, as with any rapidly evolving technology, the question is how long the current regulatory toolbox will prove applicable and sufficient. Regulatory amendments will probably become necessary, but when this will be the case, in five, ten or more years, remains a matter of dispute.16 While the technical problems of criteria and methodology will have to be discussed at an expert level, the implications of uncertainty over risks (alleged or not) may have repercussions with a critical public.
Looking upon synthetic biology as a mere extension of genetic engineering could provide a hackneyed but easy anchor point for public attitudes. Preliminary results from media analysis (Seiringer and Cserer, this issue) and focus group research (Kronberger et al. 2009) in Austria, where the public has been, and still is, rather hostile to agricultural biotechnology, show that both journalists and lay people tend to perceive synthetic biology as fulfilling promises they had already ascribed to conventional genetic engineering. In other words, the exciting possibilities researchers in synthetic biology keep stressing are somehow already in the public’s mind, and the new technology only sets out to fill in existing beliefs. This may indicate a prolongation of the old debate on biotechnology; however, it could also open up another dimension: if the public considered, falsely or not, the achievements of synthetic biology not to be new, novel risks and points of criticism would go largely unnoticed because they would be subsumed under the old paradigm; synthetic biology would appear to be old wine in new bottles. Ironically, this may be a reason why a new controversy is less likely to arise: everything has already been said about genetically modified organisms, and there would be little interest in a new debate. For CSOs, campaigning on it would not raise additional interest beyond general biotech issues. And if attitudes in Europe turned out to grow just slightly more positive, as the last Eurobarometer survey provides some indication (Gaskell et al. 2006), then this would probably also pertain to synthetic biology.
If, in contrast, synthetic biology comes to be viewed as novel, two sets of problem framings enter the picture (Schmidt et al. 2008). On the one hand, supported by work such as the successful re-construction of an ancient flu virus (Sharp 2005), the potential to cause harm might be considered much higher than with ‘old’ biotechnology. The consequence would not only be a need for more surveillance of, and awareness among, scientists in order to ensure biosecurity (Kelle 2007). It could also trigger a novel framing of synthetic biology as an issue of future warfare and terrorism and, hence, as an inherently evil technology. Whether such an image could be counterbalanced by the advantages of beneficial applications in medicine and energy production remains questionable. On the other hand, the opportunity to ‘create artificial life’ or a ‘second genesis’ (as a newspaper interview with leading scientists in synthetic biology put it)17 may trigger ethical objections. The example of stem cell research has shown that ethical objections are by no means only an academic issue; rather, if they tap into strong religious convictions, they can generate societal dynamics that can halt a technology.
In addition, differences between North American and Continental European understandings of the role of science in society may affect attitudes towards synthetic biology.18 Since US scientists dominate the field, practices and attitudes as they emerged, for example, from the 2006 conference in California might sound alarming to European ears. The deliberate restriction to self-regulation as the acceptable way of dealing with potential problems may be normal in the US. In Europe, it may be taken as a concretisation of a ‘keep-it-secret-and-leave-it-to-the-experts’ approach. In previous technology debates, secretiveness and expert dominance have been suspected of enhancing existing public suspicion (Wynne 2001). Furthermore, the propensity of some US scientists to neglect possible unintended consequences may puzzle those who hold the precautionary principle dear. The argument that no risks could be demonstrated with genetic engineering has turned out to be less convincing for a European public than for its North American counterpart. Finally, while funding for (bio)defense research is normal in the US, it is highly contentious in many European countries. Discussions over nanotechnology have shown that military or ‘dual’ use is proverbially a minefield in Europe (Norwegian National Research Council 2005). The problem of basic science being ‘embedded’ in military research has since been critically addressed in the context of the NSF report on converging technologies (Nordmann 2004).
Taken together, there is potential for a broader public controversy over synthetic biology than what we have seen so far. However, this does not mean that a controversy is really pending. Apart from the reasons outlined above, synthetic biology may go largely unnoticed as an extension of genetic engineering, not entailing a particular debate of its own. Compared with future perceptions of biotechnology in general, synthetic biology might not fare very differently.
This does not leave the scientific community without responsibility. Many of its members have acknowledged that dealing with societal issues, anticipating potential problems and reacting to CSO activity is necessary.19 Especially among younger researchers, the societal implications of science and technology are part of what they have to deal with, not unlike performing administrative work, engaging in business activities and relating to the media; doing science has developed into a multi-task endeavour (Jasanoff 2004). With the development of novel converging technologies, today’s researchers, on average, might be more aware of possible problems than their older peers were when the biotechnology controversy set off, irrespective of different opinions on concrete issues.20 This could be seen in a recent e-conference set up by the Synbiosafe project, which revealed that many scientists share a similar view regarding the set of problems, while putting forward rather different proposals on how to deal with them (Schmidt et al. 2008).21
The role of social scientists
Over recent years, social scientists experienced a boost in opportunities for investigating the societal consequences of science and technology, accompanying major scientific endeavours such as the Human Genome Project under the header of ELSI research. The social sciences, often said to be on the verge of marginalisation, could regain importance and funding. In the beginning, programs mostly conceptualised results from science as a black-boxed input and the impact on society as the subject of investigation. Apart from addressing societal impacts, the rationale was often seen in identifying possible obstacles to the practical implementation of scientific results. Consequently, applicants had to make clear that the utility of their presumed results for technology development warranted the effort and the money. Over recent years, funding applications often contained the magic phrase of ‘learning from past experiences’. This is not so different, after all, from the problem biomedical research is confronted with, where, in order to acquire funds, applications frequently have to emphasise, substantiated or not, utility in terms of possible new therapies.
In the case of past ELSI programs, this mission orientation had some side effects. When called upon to help deliver practical solutions that would mitigate social controversies pre-emptively, social scientists were confronted with the implicit claim of helping engineers ‘to make biotechnology happen’ (Jasanoff 1995). For some of them, this entailed being ‘embedded’ in technology development with a clear role in the fabric of innovation. At worst, they met naïve demands from some stakeholders to render technologies accepted that other stakeholders would not deem acceptable. In other words, they were expected to take sides with those whose interest it was to introduce a technology smoothly and to overcome obstacles traced back to negative public perceptions.
‘Being embedded’ also meant applying participatory methods for more sophisticated PR purposes. Such methods had been developed to provide an opportunity to convey the opinions of informed lay people on technological issues to the political system (Joss 1995). In some instances, however, participatory events tended to end up serving more sophisticated, two-way public relations purposes designed to replace ineffective advertising activities. Often the distinction was blurred, and even those in charge of such events might not have been fully clear about what the purpose was (Bogner and Menz 2005). The methodological set-up was similar; in the end, however, it was the aim of promoting the technology that determined the activity.
Attempts at instrumentalising social science met with criticism, and some more recent reports on societal aspects of synthetic biology, such as the paper for the BBSRC, seemed to propose turning around the relation between attitudes and scientific developments. Accordingly, “scientific research must not get too far ahead of public attitudes” and public consultation should help in “negotiating the boundaries of what is socially acceptable science” (Balmer and Martin 2008, p. 5). Scientific research appeared as an endeavour independent from society, producing a stream of bitter pills that society might be expected to swallow up to the point of non-acceptance. This left science and society as detached as ever.
In more recent ELSI programs it has been acknowledged that science and society are interdependent. ‘Learning’ no longer means avoiding conflicts, nor does the new understanding amount to acknowledging past mistakes and devising measures to counter negative attitudes. The emphasis has moved ‘upstream’: the results of scientific investigations and technology development are no longer taken as an invariant input; rather, it is their generation that is the focus of interest. Thus, the interaction of natural and social scientists as well as stakeholders in identifying topics that go beyond scientific problems has become a mainstream activity. Tackling issues at a very early stage in the evolution of a technology, in collaboration between technology developers, presumptive users, stakeholders and social scientists, takes advantage of Constructive Technology Assessment (Rip et al. 1995) and related approaches. Such endeavours have to be built upon better insights into the mutual relation between science and the rest of society. In the US, for example, a renewed interest in investigating science-society interfaces focuses on trajectories of research in their institutional contexts. Rather than trying to establish the ‘consequences’ of scientific research results in different sectors of society, the contingent inputs into various streams of research are being analysed under the header of ‘human practices’ (Rabinow and Bennett 2008).
The linear model of technology development has often proved to be at odds with reality. Being involved in the process of shaping a technology entails a different role for social scientists compared with past claims of making technology happen. They are no longer ‘embedded’ in the linear trajectory of implementing a technology as given; rather, they take an active role in defining it. Thus, they are neither just providing helping hands nor confined to the role of passive observers. In becoming active players, they have to take on responsibility of their own for the emerging technology.
Acknowledgement
This article was supported by the project ‘Communicating Synthetic Biology (COSY)’ under the GEN-AU/ELSA program funded by the Austrian Research Promotion Agency (FFG).
Open Access This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
Footnotes
1. The distinction between a field of scientific research and an upcoming technology is blurred. Hybrid fields where basic science and technology development can no longer be separated have been called technosciences (Nordmann 2006).
2. Whether synthetic biology is a uniform technology of its own remains contested. IRGC (2008) identified at least three current streams that may share a common perspective but are technologically different.
3. Gregor Wolbring described synthetic biology as a converging technology early on, in a 2006 blog contribution (Wolbring 2006).
4. Departing from a Habermasian view we can say that a public debate brings together several societal actors in an open discourse on a contested issue in the public sphere. In addition, the issue is reflected in the media as being contested, potentially influencing the opinion of a larger number of non-involved individuals. Hence, being brought up by a party or CSO alone does not render a topic subject to public debate unless there are several rounds of resonance.
5. Civil Society Organisations, formerly often denoted NGOs.
6. There is an argument that public criticism is always linked to the presence of risk. Historically, however, risk and risk perception have often been detached (Slovic 1987).
7. Available via Synthetic Biology Community. http://syntheticbiology.org/. Accessed 16 June 2009.
8. Other European Synthetic Biology Projects. Available via SYNBIOSAFE. http://www.synbiosafe.eu/index.php?page=other-sb-projects. Accessed 16 June 2009.
9. Synthetic Biology 3.0 Conference. Available via ETH Zurich. http://www.syntheticbiology.ethz.ch/conf_2007. Accessed 16 June 2009.
10. Synthetic Biology 4.0 Conference. Available via BioBricks. http://sb4.biobricks.org/. Accessed 16 June 2009.
11. Synthetic Biology Project homepage. Available via http://www.synbioproject.org/. Accessed 16 June 2009.
12. Synthetische Biologie homepage. Available via Rathenau Instituut. http://www.rathenau.nl/showpageproject.asp?steID=1&ID=2892. Accessed 16 June 2009.
13. SynBERC homepage. Available via http://www.synberc.org. Accessed 16 June 2009.
14. Synthetic Society homepage. Available via http://openwetware.org/wiki/Synthetic_Society. Accessed 16 June 2009.
15. Watch our Expert Interviews. Available via SYNBIOSAFE. http://www.synbiosafe.eu/index.php?page=expert-interviews. Accessed 16 June 2009.
16. In their recent volume on the ‘Bioeconomy to 2030’, the OECD predicted for 2015: “Current regulation will render it less likely that applications in health or primary production will become available.” (OECD 2009, p. 102).
17. Daily Mail online from 12 March 2009. Available via Daily Mail Online. http://www.dailymail.co.uk/sciencetech/article-1161434/Artificial-life-created-FIVE-years-experts-claim.html#. Accessed 16 June 2009.
18. This is also mirrored in the press coverage in Europe and the US (Pauwels and Ifrim 2008).
19. A tentative list is available via SYNBIOSAFE. http://www.synbiosafe.eu/index.php?page=resources. Accessed 16 June 2009.
20. Societal issues have even found their way into iGEM, the international student competition on synthetic biology. Available via IGEM 2008. http://2008.igem.org/Team:Calgary_Ethics. Accessed 16 June 2009.
21. The resulting ‘priority paper’ lists several areas of concern: biosecurity, biosafety, ethical questions, intellectual property rights and the public-science interface. Other reports (e.g. IRGC 2008) identified similar topics, which points to a mainstreaming having taken place.
References
- Balmer A, Martin P (2008) Synthetic biology—social and ethical challenges. Institute for Science and Society, University of Nottingham
- Bauer MW (ed) (1995) Resistance to new technology, nuclear power, information technology and biotechnology. Cambridge University Press, Cambridge
- Bauer MW, Gaskell G (2002) The biotechnology movement. In: Bauer MW, Gaskell G (eds) Biotechnology, the making of a global controversy. Cambridge University Press, Cambridge
- Bernauer T, Meins E (2003) Technological revolution meets policy and the market: explaining cross-national differences in agricultural biotechnology regulation. Eur J Polit Res 42:643–683
- Bogner A, Menz W (2005) Alternative Rationalitäten? Technikbewertung durch Laien und Experten am Beispiel der Biomedizin. In: Bora A, Decker M, Grunwald A, Renn O (eds) Technik in einer fragilen Welt. Die Rolle der Technikfolgenabschätzung, edition sigma, Berlin, pp 383–391
- Check E (2006) Synthetic biologists try to calm fears. Nature 441:388–389
- Church G (2005) Let us go forth and safely multiply. Nature 438:423
- Colvin VL (2003) The potential environmental impact of engineered nanomaterials. Nat Biotechnol 21:1166–1170
- de Vriend H (2006) Constructing life. Early social reflections on the emerging field of synthetic biology. Working document on converging technologies, Rathenau Instituut, Den Haag. Available via Rathenau Instituut. http://www.rathenau.nl/showpage.asp?item=2106. Accessed 16 June 2009
- DEFRA (2005) Characterising the potential risks posed by engineered nanoparticles. A first UK Government research report, HM Government, London
- Dierkes M, von Grote C (eds) (2000) Between understanding and trust: the public, science and technology. Routledge, London/New York
- Endy D (2005) Foundations for engineering biology. Nature 438:449–453
- ETC Group (2007) Extreme genetic engineering: an introduction to synthetic biology. ETC Group, Ottawa
- Eurobarometer (2005) Europeans, science and technology. Nr. 224/Wave 63.1, Directorate General Research, Brussels
- European Commission (2001) Science and society action plan. Brussels
- European Commission (2004) Towards a European strategy for nanotechnology. Brussels
- European Commission (2005) Opinion on the appropriateness of existing methodologies to assess the potential risks associated with engineered and adventitious products of nanotechnologies. SCENHIR 002/05
- Gaskell G, Stares S, Allansdottir A et al (2006) Europeans and biotechnology in 2005: patterns and trends. Final report on eurobarometer 64.3, Brussels
- Giftinformationszentrum Nord (2006) Vergiftungsfälle durch Versiegelungsspray “Magic Nano”, Göttingen. Available via GIZ. http://www.giz-nord.de/php/index.php?option=com_content&task=view&id=122&Itemid=85. Accessed 16 June 2009
- Grunwald A, Fleischer T (2007) Nanotechnologie – wissenschaftliche Basis und gesellschaftliche Folgen. In: Gazsó A, Greßler S, Schiemer F (eds) Nano – Chancen und Risiken aktueller Technologien. Springer, Wien/New York, pp 1–20
- Hart PD (2008) Awareness of and attitudes towards nanotechnology and synthetic biology. The Woodrow Wilson International Center For Scholars, Peter D. Hart Associates Inc., Washington, DC. Available via Synthetic Biology Project. http://www.synbioproject.org/library/publications/archive/6019/. Accessed 18 Sep 2008
- IRGC (2008) Synthetic biology—risks and opportunities of an emerging field. International Risk Governance Council, Geneva. Available via IRGC. http://www.irgc.org/IMG/pdf/IRGC_ConceptNote_SyntheticBiology_Final_30April.pdf. Accessed 16 June 2009
- Jasanoff S (1995) Product, process or programme: three cultures and the regulation of biotechnology. In: Bauer MW (ed) Resistance to new technology. Nuclear power, information technology and biotechnology. Cambridge University Press, Cambridge, pp 311–331
- Jasanoff S (ed) (2004) States of knowledge: the co-production of science and social order. Routledge, London/New York
- Joss S (1995) Consensus conferences and their contribution to science policy. Sci Technol Innov 8(3):14–19
- Kelle A (2007) Synthetic biology & biosecurity awareness in Europe. SYNBIOSAFE, IDC/University of Bath/University of Bradford, Vienna/Bath/Bradford
- Kronberger N, Holtz P, Kerbe W, Strasser E, Wagner W (2009) Communicating Synthetic Biology: from the lab via the media to the broader public. Syst Synth Biol. doi:10.1007/s11693-009-9031-x
- Maurer SM, Lucas KV, Terrell S (2006) From understanding to action: community-based options for improving safety and security in synthetic biology. University of California, Berkeley CA. Available via Synthetic Biology Community. http://syntheticbiology.org/SB2.0/Biosecurity_resolutions.html. Accessed 16 June 2009
- Maynard AD (2006) Nanotechnology: a strategy for addressing risk. The Pew Charitable Trust, Philadelphia
- Meili C (2006) Nanoregulation: a multi-stakeholder-dialogue-approach towards a sustainable regulatory framework for nanotechnologies and nanosciences. The Innovation Society, St. Gallen
- NEST (2005) Synthetic biology. Applying engineering to biology. Report of a high-level expert group. European Commission Directorate-General for Research, Brussels
- Nordmann A (2004) Converging technologies—shaping the future of European societies. High level expert group on foresighting the new technology wave. European Commission, Brussels
- Nordmann A (2006) Collapse of distance: epistemic strategies of science and technoscience. Dan Yearb Philos 41:7–34
- Norwegian National Research Council (2005) Nanoteknologier og nye materialer: Helse, miljoe, etikk og samfunn, Oslo
- OECD (2009) The bioeconomy to 2030—designing a policy agenda. Organisation for Economic Cooperation and Development, Paris
- Pauwels E, Ifrim I (2008) Trends in American and European press coverage of synthetic biology—tracking the last five years of coverage. SYNBIO, The Woodrow Wilson International Center for Scholars, Washington. Available via Synthetic Biology homepage. http://www.synbioproject.org/library/publications/archive/why_scientists_should_care. Accessed 1 Nov 2008
- Rabinow P, Bennett G (2008) Ars synthetica: designs for human practice. Available via Connexions. http://cnx.org/content/col10612/1.2. Accessed 19 Dec 2008
- Rip A, Misa TJ, Schot J (1995) Constructive technology assessment. A new paradigm for managing technology in society. In: Rip A, Misa TJ, Schot J (eds) Managing technology in society—the approach of constructive technology assessment. Pinter, London
- Roco MC, Bainbridge WS (2002) Converging technologies for improving human performance. National Science Foundation/Department of Commerce, Washington, DC
- Rodemeyer M (2009) New life, old bottles: regulating first-generation products of synthetic biology. Woodrow Wilson International Center for Scholars, Washington, DC
- Scheufele DA, Corley EA, Dunwoody S, Shih TJ, Hillback E, Guston DH (2007) Scientists worry about some risks more than the public. Nat Nanotechnol 2:732–734
- Schmid G (2008) The nature of nanotechnology. In: Schmid G (ed) Nanotechnology vol. 1: principles and fundamentals. Wiley-VCH, Weinheim
- Schmidt M (2006) Public will fear biological accidents, not just attacks. Nature 441:1048
- Schmidt M (2009) Do I understand what I can create? Biosafety issues in synthetic biology. In: Schmidt M, Kelle A, Ganguli-Mitra A, de Vriend H (eds) Synthetic biology: the technoscience and its societal consequences. Springer Academic Publishing, Berlin/New York
- Schmidt M, Ganguli-Mitra A, Torgersen H, Kelle A, Deplazes A, Biller-Andorno N (2008) SYNBIOSAFE e-conference: online community discussion on the societal aspects of synthetic biology. Syst Synth Biol 2:7–17. doi:10.1007/s11693-008-9019-y
- Sharp PA (2005) 1918 Flu and responsible science. Science 310(5745):17
- Slovic P (1987) Perceptions of risk. Science 236:280–285
- Swiss Re (2004) Nanotechnology: small matter, many unknowns. Schweizerische Rückversicherungs-Gesellschaft, Zürich
- Torgersen H, Hampel JV, Bergmann-Wimberg ML (2002) Promise, problems and proxies: twenty-five years of debate and regulation in Europe. In: Bauer MW, Gaskell G et al (eds) Biotechnology. The making of a global controversy. Cambridge University Press, Cambridge, pp 21–94
- Wagner W, Kronberger N (2006) Redesigning nature: the natural and the artefactual in the new world of genetic engineering. In: Weiss K, Marchand D (eds) Psychologie sociale de l’environnement. Presses Universitaires de Rennes, Rennes
- Wolbring G (2006) Synthetic biology 2.0. Available at Innovation Watch. http://www.innovationwatch.com/choiceisyours/choiceisyours.2006.05.30.htm. Accessed 30 May 2006
- Wynne B (2001) Expert discourses of risk and ethics on genetically manipulated organisms—the weaving of public alienation. Politeia 62(17):51–76