Violence Against Women. 2022 Oct 13;29(9):1687–1718. doi: 10.1177/10778012221125495

Technology-Facilitated Gender-Based Violence, Hate Speech, and Terrorism: A Risk Assessment on the Rise of the Incel Rebellion in Canada

Esli Chan
PMCID: PMC10248308  PMID: 36226437

Abstract

With the proliferation of the internet, emerging groups such as the involuntary celibate (incel) community within the men's rights movement have new ways to reproduce real-world harm and gender-based violence (GBV) against women. This study conducts a critical discourse and semantic analysis of the incels.co webpage and the Alek Minassian van attack using the Violent Extremism Risk Assessment and the Cyber Extremism Risk Assessment tools. It reveals that Canadian violent extremism frameworks minimize online GBV as a form of extremism. GBV, which extends from online to offline realities, is not captured in theoretical frameworks for terrorism and hate speech.

Keywords: incel, gender-based violence, online violence, extremism, risk assessment

Introduction

Increasing internet access has enabled greater community connectivity; however, it has also increased online hate and violence that promotes online gender-based violence (GBV) against women (Hinson et al., 2018). In response to the rise of the feminist movement, several misogynist and anti-feminist groups have emerged (Jaki et al., 2019; Marwick & Caplan, 2018). One such group is involuntary celibates, also known as incels, who believe that a lack of sexual intimacy granted by women is a form of oppression against men (Lilly, 2016). Incels rally around shared misogynistic sentiments that project ideas of violence against women (Baele et al., 2019). While not all incels are necessarily violent, incels have enacted mass violence against women in Canada. In 2018, Alek Minassian, a self-declared incel, used a van to target women pedestrians in downtown Toronto, ultimately killing 10 individuals (Cain, 2018).

Recent research on incels has focused on their online interactions (Byerly, 2020; Dynel, 2020; Jaki et al., 2019; Lilly, 2016). Physical violence conducted by incels has not been adequately connected to their online influences. This is in line with a general trend, whereby theoretical understandings of GBV against women are separated into online versus physical spheres (Henry & Powell, 2018; United Nations Broadband Commission For Digital Development Working Group on Broadband and Gender, 2015). Furthermore, only recently has incel violence been evaluated as a form of terrorism (Hoffman et al., 2020). While international security experts have risk assessment tools to evaluate the threat of extremism (Baruch et al., 2018; Pressman & Ivan, 2019), these tools are not inclusive of gendered dimensions of violence.

This study evaluates how online motivations of GBV are handled in the theoretical frameworks of hate speech and terrorism in Canada and identifies if Canadian risk assessment tools, such as the Violent Extremism Risk Assessment (VERA-2) and the Cyber Extremism Risk Assessment (CYBERA), sufficiently capture gendered dimensions of offline and online violence in Canada. Through critical discourse and semantic analysis, I evaluate the activities of an online incel forum as well as a Canadian case of incel-inspired physical violence. I expand on previous research which has looked solely at online discourse (Jaki et al., 2019; O’Donnell & Shor, 2022; O’Malley et al., 2020) or physical violence (Allely & Faccini, 2017) to inform how incel actors can use the online sphere to perpetuate harm (Regehr, 2020). I challenge how existing theoretical conceptions of terrorism and hate speech understand online gendered threats.

Literature Review

GBV and Online Hate Speech

GBV against women is generally defined as a threat of or act resulting in physical, sexual, or psychological harm to women (Van Der Wilk, 2018). Traditional definitions of GBV against women underestimate the role of technology in the perpetuation of violence, particularly given that technology reproduces offline socio-cultural inequalities (Henry & Powell, 2015; Tanczer et al., 2018). Although there is currently no uniformly accepted definition of online GBV (Van Der Wilk, 2018), the United Nations 2018 Report of the Special Rapporteur on Violence Against Women describes online GBV against women as:

Any act of gender-based violence against women that is committed, assisted, or aggravated in part or fully by the use of [internet communication technology], such as mobile phones and smartphones, the Internet, social media platforms or email, against a woman because she is a woman, or affects women disproportionately (United Nations General Assembly, 2018, p. 7).

This and other definitions of online GBV are limited in scope and do not highlight the influence of ideological underpinnings emboldened by technology. Canada's Strategy to Prevent and Address Gender-Based Violence addresses online GBV in the context of cyberviolence against youth (Status of Women Canada, 2019). The UN 2015 Cyber Violence Against Women and Girls Report focuses on “hacking… impersonation… surveillance/tracking… harassment/spamming…recruitment…[and] malicious distribution” (United Nations Broadband Commission for Digital Development Working Group on Broadband and Gender, 2015, p. 22). This does not account for incel-inspired violence aimed to attack any woman in affirmation of their ideological rebellion (British Broadcasting Corporation [BBC], 2018; Zimonjic, 2018).

Henry and Powell (2015, 2018) connect technological influence to gendered violence. Technology-facilitated sexual violence (TFSV) is defined as the use of technology that emboldens sexual violence in both online and offline realms (Henry & Powell, 2018), highlighting technology as a mechanism that connects online realities to real physical impact (Henry & Powell, 2015). A focus on facilitation emphasizes the ability for abusers to organize, disguise, and manipulate the online sphere to create new realities for women's participation (Henry & Powell, 2018). This definition broadens frameworks for GBV as it adopts new perspectives on mediums and methods of harm by emerging ideological groups (Henry & Powell, 2018). Their definition of TFSV includes distribution of sexual images, cyberstalking, gender-based hate speech, and online facilitation of sexual assault (Henry & Powell, 2015).

Technologization has also enabled like-minded individuals who might otherwise be isolated from one another to gather (Winter, 2019, p. 5). This also contributes to the potential production of an echo chamber of beliefs that can enable a violent narrative, as has been argued for incels (Janis, 1991). Furthermore, the networked nature of online connectivity can “diffuse moral or legal responsibility for group members” (Henry & Powell, 2015, p. 769), which can promote gendered violence without individual accountability and instead reinforce an opposition, women, to whom blame is ascribed (Bratich & Banet-Weiser, 2019).

Online communication provides new perspectives on hate speech. Hate speech is defined as communication that seeks to degrade another based on one's identity or stereotypical characteristics (Nockelby, 2000, as cited in Warner & Hirschberg, 2012). Canada has explicitly prohibited hate speech since the 1960s (Mahoney, 2009), and gender identity has been recognized as a factor of hate speech in the Criminal Code since 2017 (Walker, 2018). Current Canadian legislation, however, does not directly address the online dimension of hate speech (Platt, 2018). Section 13 of the Canadian Human Rights Act, which “made it a discriminatory practice to convey messages over the phone or internet that contain any matter that is likely to expose a person or persons to hatred or contempt” (Platt, 2018, p. 1), was repealed in 2013.

Regulation of online speech remains contentious and complex (Mahoney, 2009). The transcendence of national boundaries is a challenge in the prosecution of hate speech—“adding the internet to the scope of the hate speech laws creates challenges of jurisdictional enforcement and technological feasibility” (Mahoney, 2009, p. 344). Furthermore, narrow legal and institutional interpretations of hate speech allow groups, such as incels, to “modify their language in such a way that ensures they evade being captured by legislation” (Sorial, 2015, p. 301). Groups tactfully advance their interests and are “becoming more sophisticated, polite, and civil” (Sorial, 2015, p. 301), to maneuver around legal boundaries of hate speech. This allows incels to continually employ gendered hate speech against women.

Terrorism and Gendered Physical Violence

The United Nations General Assembly defines terrorism as “[A]cts intended or calculated to provoke a state of terror in the general public, a group of persons or particular persons for political purposes” (United Nations Office on Drugs and Crime, 2018, para. 15). Within Canada, terrorism is defined under Section 83 of the Criminal Code as:

an act or omission, inside or outside of Canada, committed for a political, religious, or ideological purpose that is intended to intimidate the public, or a subset of the public, with respect to its security, including its economic security, or to compel a person, government or organization…from doing or refraining from doing any act, and that intentionally causes one of a number of specific forms of serious harm, such as causing death or serious bodily harm. (Public Safety Canada, 2018, p. 6)

Ideological motive is an important component of terrorism definitions, but what is considered “ideology” is often racialized and tinted with religious bias, where white actors are labeled as individual aggravated actors, or “lone wolves,” and held to a lesser offense (O’Malley et al., 2020). Likewise, gendered motives, such as hegemonic masculinity, are often neglected as a possible ideological source of terrorism (Stoleru & Costescu, 2014). Hegemonic masculinity is the performance of masculine hierarchies, entangled in societal norms and institutions, which highlight the dynamics of power within gender relations (Connell & Messerschmidt, 2005).

This is particularly important for the incel community, where gendered ideology underpins support for mass violence (O’Donnell & Shor, 2022). Lone-wolf and incel actors that act in violence against women are largely supported and emboldened by their online communities (Byerly, 2020; Malthaner & Kindekilde, 2017; O’Donnell & Shor, 2022; Regehr, 2020). The dismissal of online networks in connection to physical violence by lone-wolf actors minimizes the patterns of systemic violence against women.

Risk Assessment Frameworks

In response to emerging extremist threats, risk assessment frameworks consider the probability and impact of events (Lupton, 1999) and provide insight into threat and vulnerability. While risk assessment frameworks cannot prove the absolute certainty of violence (Bouder et al., 2007), their structured nature allows assessors to categorically evaluate behaviors indicative of increased threat. Risk assessment tools, which are often used by governments, are differentiated by their ability to assess different characteristics of harm. The Violent Extremism Evaluation Measurement (VEEM) framework evaluates the process of extremist radicalization, drawing reference to various extremism risk assessment frameworks, and broadly acknowledges the existence of gendered dynamics to extremism (Baruch et al., 2018). The Extremism Risk Guidelines (ERG 22+) is used in Canada to assess “pathway influences” (Lloyd & Dean, 2016, as cited in Hart et al., 2017, p. 15) for convicted terrorists to identify the possibility of future harm. The Terrorist Radicalization Assessment Protocol (TRAP-18) assesses the probability of lone-wolf acts of terrorism (Logan & Lloyd, 2019). Risk assessment frameworks capturing gendered violence include the Domestic Abuse, Stalking, and Honor-Based Violence (DASH) framework, used by law enforcement frontline workers in the United Kingdom to respond to domestic violence cases (McCulloh et al., 2016). Other GBV risk assessment frameworks include the Domestic Violence Risk Appraisal Guide (DVRAG), employed in Canada to assess the risk of interpersonal violence committed by men (McCulloh et al., 2016).

Specific to this study, the Violent Extremist Risk Assessment (VERA) tool, created by Canadian intelligence expert Elaine Pressman, evaluates beliefs, intentions, environment, and the capability of specific individuals toward violent extremism in national security settings (Pressman, 2009). It is used by intelligence and law enforcement bodies across the Asia-Pacific and Europe (Pressman & Ivan, 2019). The tool employs a “structured professional judgment” (SPJ) technique (Pressman & Ivan, 2019, p. 46) in which experts are guided by a systematic framework to inform their analysis of distinct behaviors (Logan & Lloyd, 2019). VERA-2 was subsequently developed in response to feedback gained from its first iteration (Pressman & Flockton, 2012). The Cyber-Violent Extremist Risk Assessment (CYBERA) is used in conjunction with VERA-2 to provide additional analysis of risk indicators that manifest in an online context (Pressman & Ivan, 2019). Risk indicators within CYBERA focus on linguistic analysis of online activity (Pressman & Ivan, 2019).

Case Study

Rise of the Incel Movement

The men's rights movement (MRM) is a backlash movement initiated by men in response to the rise of feminism and the changing role of women (Marwick & Caplan, 2018). It focuses on the perceived inequalities posed by feminist ideologies and gender equality that diminish traditional forms of masculinity (Lilly, 2016). The movement moved online during the 1980s and 1990s, creating an online network, or manosphere, that sought to reinstate masculine hegemony (Jaki et al., 2019). The MRM is divided into sub-movements with distinct characteristics, including men's rights activists (MRAs), men going their own way (MGTOW), pickup artists (PUAs), and incels (Lilly, 2016).

The term, incel, was first coined by a woman in Toronto who sought to share her romantic disappointment on social forums (Byerly, 2020); this sentiment was co-opted by members of the men's rights movement, who felt that being an incel was a uniquely male experience that occurred because of women's denial of sex. Incels combine an entitlement to sex with self-deprecation and perceptions of victimization by women (Bratich & Banet-Weiser, 2019; Jaki et al., 2019). Incels perceive masculinity as superior, difficult to earn, and often challenged (O’Malley et al., 2020); as such, incels believe that women's refusal to grant sexual intimacy to certain men is a threat and form of oppression against men (Lilly, 2016). Previous research demonstrates that incels subscribe to the idea of the red-pill philosophy, where they seek to reveal the truth of how modern dating and left-wing movements have led to women controlling gender relations and that incels must rebel against this idea (Bratich & Banet-Weiser, 2019; O’Malley et al., 2020). As a result, one of the incel subculture's primary goals is that masculine hegemony be re-established (Jaki et al., 2019; O’Malley et al., 2020).

Previous research demonstrates that the incel community has grown exponentially from the use of online communal social platforms and technological connectivity (Byerly, 2020). Popular forums on Reddit include “/r/inceltears” and “/r/TheRedPill”—the latter of which boasted 145,000 members (Lilly, 2016). Incel communities also host their own websites, such as “incels.co,” which has over 60,000 users (Incel Inside Wikipedia, 2020c). Online platforms allow incel members to use various modes of interaction, such as humor, critique, and discussion, to forge closer in-group bonds (Dynel, 2020). As such, these forums are central to the solidification of group ideologies and the promotion of violence (Byerly, 2020; Dynel, 2020; O’Malley et al., 2020). Inceldom emerges where incel actors coalesce and reaffirm shared attitudes based on shared perspectives on the demotion of women (Bratich & Banet-Weiser, 2019; Rational Wikipedia, 2020).

The Case of Alek Minassian: 2018 Toronto Van Attack

On April 23, 2018, Alek Minassian aimed his van at women pedestrians on a busy intersection in downtown Toronto, murdering 10 people, eight of whom were women. Before his attack, Minassian posted to Facebook stating his allegiance to the incel rebellion and Elliot Rodger, a notorious 2014 incel killer. Minassian affirmed his belief in restoring masculine hegemony and praised previous perpetrators of violence against women (Cain, 2018). Minassian also acknowledged his affiliation to the incel community after the attack (Thomas, 2018). Although this event is one of the largest incidents of mass violence in Canada, it has not been prosecuted as terrorism (Zimonjic, 2018) and has been labeled as “an isolated incident” (Zimonjic, 2018, para. 3).

While a separate March 2020 incel-related murder of a woman in Toronto has been prosecuted as an act of terrorism for the first time (Bell et al., 2020), the incel community remains under-characterized as a rising force of violent extremism in Canada. The Toronto Van Attack provides insight into how GBV and violence connected to the incel community is situated between offline and online realities—while Minassian's act of violence is committed offline, the influence of radicalization originates online.

Research Design

This study employs a critical discourse and semantic analysis method using qualitative data analysis software, NVivo, to conduct an analysis of language used within the online “incels.co” webpage and the transcript of Minassian's police interview. While incels.co reveals the breadth and depth of incel community discussions, analysis of the transcript from Minassian's police interview provides a realistic snapshot of how online incel communities can manifest into physical violence. In this way, this study moves beyond research, which has looked only at online discourse (Jaki et al., 2019; O’Donnell & Shor, 2022; O’Malley et al., 2020) or only at specific incidents of major physical violence (Allely & Faccini, 2017). Examination of incel language through the use of designated lexicons will highlight underlying connotations and intentions within the incel community and will inform risk rankings used in the VERA-2 and CYBERA risk frameworks. This will reveal how the case of Minassian and the larger incel community are interpreted within the perspective of violent extremism.

Data

Data is derived from the incels.co webpage and Minassian's police video interview and transcript. The “Incels.co” webpage (Incels.co, 2020) currently hosts the largest online forum of the incel community; users on this website create online personas and engage on conversation forums. This webpage was created in response to Reddit's removal of the popularized online incel community “/r/incels” in November 2017 (Incel Inside Wikipedia, 2020a) and the removal of the “incels.me” platform that Minassian frequented (Wilson, 2018). Data is gathered from incels.co from 23 April 2018, the date of Minassian's attack, until 23 May 2018.

Minassian's police interview details his affiliations to the incel community, ideological motivations, and plan of attack (Thomas, 2018). I reviewed the video interview and transcript—the full police interview is publicly available on The Mob Reporter - Youtube (2019), and the transcript of the video is available on Scribd (2018), an online publishing platform. The interview serves as a snapshot of Toronto Police Detective Thomas’ response to the attack and Minassian's disposition. This interview is used to understand how online GBV can manifest into offline harm.

Discourse Analysis

Lexicons

I selected eight lexicons to specifically identify vocabulary related to incel and online communities and to uncover language used to conceal and convey in-group communications or culture. These include: Farrell's Aggregated Lexicon (2019), Rezvan's Harassment Corpus (Farrell et al., 2019; Rezvan, 2018), Squirrell's Incel Dictionary (Squirrell, 2018), Geen's Violence Verbs (Farrell et al., 2019; Geen & Stonner, 1975), two Hatebase repositories (Davidson, 2019; Hatebase, 2020), and two incel- and manosphere-specific Wikipedia glossaries (Incel Inside Wikipedia, 2020b; Rational Wikipedia, 2020). Additional terms were manually added to contribute to the understanding of incel-specific language and the assessment of risk. I chose a select number of terms from the lexicons, as not all terms produced results in the analysis or were relevant within the incel community. Overall, I selected 386 terms from the collection of lexicons (Table 1).

Table 1.

Description of Incel-Related Lexicons and Number of Terms Derived.

Lexicon | Description | # of terms in the lexicon | # of terms used
Farrell's Aggregated Lexicon | Misogynistic online terms (Farrell et al., 2019, 87): “Physical Violence, Sexual Violence, Hostility, Patriarchy, Stoicism, Racism, Homophobia, Belittling, and Flipped Narrative.” | 2,454 | 119
Rezvan's Harassment Corpus | Harassment-related terms (Farrell et al., 2019): “1) Sexual 2) appearance-related 3) Intellectual 4) Political 5) Racial 6) Combined” (Rezvan, 2018, 1). | 713 | 101
Squirrell's Incel Dictionary | Incel-specific terms (Squirrell, 2018). | 71 | 31
Incel Inside Wikipedia Glossary | Collaborative incel-related glossary. | 380 | 40
Manosphere Wikipedia Glossary | Collaborative men's rights movement and alt-right glossary.1 | 136 | 14
Geen and Stonner's Violence Verbs | Words associated with five dominant violent verbs: “torture, stabbing, murder, massacre, and choke” (Farrell et al., 2019, 91; Geen et al., 1975). | 322 | 33
Hatebase—Female and Gender | Hatebase gathers the largest database of hateful words (Hatebase, 2020). The database is filtered to identify female and gender-related words. | 83 | 10
Davidson's Revised Hatebase | The Hatebase lexicon is revised to remove terms that indicate “false positives” (Davidson, 2019). | 1,034 | 8
Manual Additions | Additional terms are manually added to capture the context of incel language in reference to risk factors. | N/A | 30
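The aggregation of these lexicons into a single working term set was performed manually in this study. Purely as an illustration of that step, the minimal sketch below shows how several term lists could be merged, lower-cased, and de-duplicated into a candidate set before manual filtering; the file names and example manual additions are placeholders, not the study's actual materials.

```python
# Illustrative sketch only: merge several term lists into one candidate set.
# File names are placeholders; the study's 386-term selection was curated by hand.
from pathlib import Path

LEXICON_FILES = [
    "farrell_aggregated.txt", "rezvan_harassment.txt", "squirrell_incel.txt",
    "geen_violence_verbs.txt", "hatebase_gender.txt", "davidson_hatebase.txt",
    "incel_wiki_glossary.txt", "manosphere_wiki_glossary.txt",
]
# Example manual additions drawn from the paper's appendix, for illustration.
MANUAL_ADDITIONS = {"beta uprising", "going er", "supreme gentleman"}


def load_terms(paths):
    """Read one term per line from each lexicon file, normalized to lower case."""
    terms = set()
    for path in paths:
        for line in Path(path).read_text(encoding="utf-8").splitlines():
            term = line.strip().lower()
            if term:
                terms.add(term)
    return terms


if __name__ == "__main__":
    candidates = load_terms(LEXICON_FILES) | MANUAL_ADDITIONS
    print(len(candidates), "candidate terms before manual filtering")
```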

Critical Discourse and Semantic Analysis

This study employs a critical discourse and semantic analysis method to highlight how language is connected to gendered violence. Critical discourse analysis (CDA) is used for “describing, interpreting, and explaining how discourse construct, maintain, and legitimize social inequalities” (Mullet, 2018, p. 116). CDA is “critical” in its attempt to uncover structures of power and inequality rooted in language and norms (Mullet, 2018). Semantic analysis uncovers emotions or motivations conveyed through language by identifying the underlying circumstances in which dialogue is delivered (Hassanpour, 2013). Critical discourse and semantic analysis reveal gendered power structures in language used within online incel forums and by incel actors.

CDA is employed through word frequency and word tree queries on lexicon-identified terms using the NVivo tool. Word frequency searches reveal the prevalence of a term within online forums and Minassian's interview. Word tree queries display the most used phrases in relation to a word and are presented visually to highlight common in-group sentiments.2 These assessments reveal common in-group language in the incel community.
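The queries above were run in NVivo. As a rough, tool-agnostic illustration of the same two operations, the sketch below counts term frequencies and extracts keyword-in-context phrases (an approximation of a word tree); the corpus file name and example terms are placeholders rather than the study's data.

```python
# Illustrative approximation of the two NVivo query types described above:
# a word-frequency count and a keyword-in-context extraction.
import re
from collections import Counter


def word_frequency(text, terms):
    """Count case-insensitive, whole-word occurrences of each lexicon term."""
    counts = Counter()
    lowered = text.lower()
    for term in terms:
        pattern = r"\b" + re.escape(term.lower()) + r"\b"
        counts[term] = len(re.findall(pattern, lowered))
    return counts


def keyword_in_context(text, term, window=5):
    """Return the phrases surrounding each occurrence of a term,
    roughly analogous to the branches of a word tree."""
    tokens = text.split()
    contexts = []
    for i, token in enumerate(tokens):
        if term.lower() in token.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            contexts.append(f"{left} [{token}] {right}")
    return contexts


if __name__ == "__main__":
    with open("incels_corpus.txt", encoding="utf-8") as f:  # placeholder corpus file
        corpus = f.read()
    print(word_frequency(corpus, ["beta uprising", "roastie", "femoid"]))
    for line in keyword_in_context(corpus, "uprising")[:10]:
        print(line)
```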

Risk Assessment Evaluation

This paper uses VERA-2 and CYBERA to analyze the Minassian case and the incel forum, as together they provide a holistic perspective of physical and technological harms linked to ideological extremist motivations. VERA-2 and CYBERA are designed to complement one another (Pressman & Ivan, 2019); when used together, these tools enable:

(1) a robust and cyber-focused risk assessment intended to provide early warning indicators of violent extremist action, (2) provides consistency and reliability in risk and threat assessments, (3) determines risk trajectories of individuals, and (4) assists intelligence and law enforcement analysts in their national security investigations. (Pressman & Ivan, 2019, p. 231)

First, using semantic analysis, I assigned each term derived from the lexicons to one or multiple risk factors within VERA-2 and CYBERA, depending on three factors: (a) the existing and commonly understood definition and usage of the term; (b) underlying in-group connotations of the term; (c) the context in which the term is used to convey an emotion or emulate a behavior.3

For example, Saint Elliot is a description of Rodger as a martyr due to his violent actions (Squirrell, 2018). A word frequency search of Saint Elliot reveals 25 occurrences within incels.co; a word tree query on Saint Elliot reveals that conversations featured the adoration and justification of his actions. Incels.co members also celebrated a “Happy Saint Elliot's Day” on May 23, the date of his shooting, in approval of Rodger's actions. As a result, the term “Saint Elliot” is assigned to VERA-2's Glorification of Violent Action risk factor, as it is a term of adoration of incel-inspired violence against women.
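To make this assignment step concrete, the hypothetical mapping below pairs a few lexicon terms with the VERA-2 risk factors they are assigned to in Appendix 1 and rolls term frequencies up to the factor level. It illustrates the bookkeeping only and is not the study's full mapping.

```python
# Illustrative excerpt of the term-to-risk-factor assignment (see Appendix 1).
TERM_TO_RISK_FACTORS = {
    "saint elliot": [
        "Glorification of violent action",
        "Personal contact with violent extremists",
    ],
    "going er": ["Anger and expressed intent to act violently"],
    "beta uprising": ["Driven by moral imperative, moral superiority"],
    "femoid": ["Dehumanization/demonization of identified targets of injustice"],
}


def aggregate_by_risk_factor(term_counts):
    """Roll word-frequency counts up to the risk factors each term is assigned to."""
    totals = {}
    for term, count in term_counts.items():
        for factor in TERM_TO_RISK_FACTORS.get(term.lower(), []):
            totals[factor] = totals.get(factor, 0) + count
    return totals


# e.g., the 25 occurrences of "Saint Elliot" feed the glorification factor,
# and the 55 occurrences of "beta uprising" feed the moral-imperative factor.
print(aggregate_by_risk_factor({"Saint Elliot": 25, "beta uprising": 55}))
```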

Second, in analysis of the Minassian case and incels.co forum through VERA-2 and CYBERA, I prescribed each risk factor with a rating from zero to two according to the existing framework—with zero as the lowest and two as the highest level of risk, respectively (Beardsley & Beech, 2013). Through the use of word frequency and word tree queries, I ranked risk factors based on the following four aspects: the diversity and number of terms assigned to each risk factor; the numeric frequency of each term in each risk factor; terms that indicate a commitment to harm; and terms used to depict gendered degradation or violence against women. Each VERA-2 and CYBERA risk factor belongs to a larger risk section. Risk factors may also overlap between CYBERA and VERA-2, while others are unique to each framework.4 While risk factors, as depicted in Tables 2 and 3, belong to their respective VERA-2 and CYBERA frameworks (Pressman & Flockton, 2012; Pressman & Ivan, 2019), resultant risk rankings are original to this work.

Table 2.

VERA-2 Risk Rating for the Incels.co Forum and Minassian's Interview.

VERA-2 indicator items | Incels.co forum | Minassian police interview transcript
(Each indicator is rated from 0 = low to 2 = high risk.)
Beliefs & attitudes
 Commitment to ideology justifying violence | 2 | 2
 Perceived victim of injustice | 2 | 2
 Dehumanization/demonization of identified targets of injustice | 2 | 2
 Rejection of democratic society and values | 2 | 2
 Feelings of hate, frustration, persecution, alienation | 2 | 2
 Hostility to national collective identity | 2 | 1
 Lack of understanding, tolerance outside own group | 2 | 2
Section A total | 14/14 | 13/14
Context & intent
 Seeker, consumer, developer of violent extremist materials | 1 | 1
 Identification of target (person, place, group) for attack | 1 | 2
 Personal contact with violent extremists | 1 | 2
 Anger and expressed intent to act violently | 2 | 2
 Willingness to die for cause | 2 | 2
 Expressed intent to plan, prepare violent action | 2 | 2
 Susceptible to influence, authority, indoctrination | 1 | 2
Section B total | 10/14 | 13/14
History & capability
 Early exposure to pro-violence militant ideology | 1 | 1
 Network (family, friends) involved in violent action | 1 | 1
 Prior criminal history of violence | 0 | 0
 Tactical, paramilitary, explosives training | 0 | 2
 Extremist ideological training | 1 | 1
 Access to funds, resources, organizational skills | 0 | 2
Section C total | 3/12 | 7/12
Commitment & motivation
 Glorification of violent action | 2 | 2
 Driven by criminal opportunism | 1 | 2
 Commitment to group, group ideology | 2 | 2
 Driven by moral imperative, moral superiority | 2 | 2
 Driven by excitement, adventure | 2 | 2
Section D total | 9/10 | 10/10
Protective items—a higher rating = less risk
 Re-interpretation of ideology less rigid, absolute | 1 | 0
 Rejection of violence to obtain goals | 1 | 0
 Change of vision of enemy | 0 | 0
 Involvement with non-violent, de-radicalization, offense-related programs | 0 | 0
 Community support for non-violence | 0 | 0
 Family support for non-violence | 0 | 1
Section E total (subtracted from overall total as it is protective) | 2/12 | 1/12
Overall total (Sections A + B + C + D − E) | 34/50 | 42/50

Note. Low = 1–16; medium = 17–33; high = 34–50.

Source. Pressman and Flockton (2012); self-identified.

Table 3.

CYBERA Risk Rating for the Incels.co Forum and Minassian's Interview.

CYBERA indicator items | Incels.co forum | Minassian police interview transcript
(Each indicator is rated from 0 = low to 2 = high risk.)
Imagery
 Uses graphic representations, symbols or logos of terrorist groups, steganography | 1 | 0
 Organizes site or page, rubrics to glorify violence, promote aggression, rigid worldview | 2 | 0
 Uses musical background to incite hatred, recruitment to violent action | 0 | 0
 Frequently changes profile and cover photos, inconsistent identity | 2 | 0
Section A total | 5/8 | 0/8
Semantic content
 Personal, terrorist narrative to promote extremist views, aggression, grievances | 2 | 2
 Uses multiple or alternative narratives related to identity | 2 | 2
 Uses idealized content about own community | 1 | 2
 Uses language to display extremist views, allegiance to terrorist group, narrative | 2 | 2
Section B total | 7/8 | 8/8
Beliefs, attitudes, intention
 Commitment to ideology justifying violence | 2 | 2
 Perceived victim of injustice, grievances for collective group or individual | 2 | 2
 Rejection of democratic society and values | 2 | 2
 Adherence to conspiracy theories about the affiliated ethnic/religious group | 1 | 1
 Moral emotions: hate, anger, frustration, persecution, and/or alienation | 2 | 2
 Identity conflict, rejection of national collective identity | 2 | 1
 Lack of understanding, tolerance outside own group | 2 | 2
 Expresses intention to act violently, incites violence | 2 | 2
 Expresses willingness to die for cause, achieve martyrdom | 2 | 2
 Evidence of planning, preparing for violent action | 2 | 2
Section C total | 19/20 | 18/20
Virtual social network context
 Is affiliated with an online group or social media promoting terrorist violence | 2 | 2
 Repetitively accesses posts, blogs, extremist forums, terrorist information, know-how | 2 | 2
 Establishes friendship bonds, networks with other violent extremists online | 2 | 2
 Susceptible to influence, authority, indoctrination | 1 | 2
Section D total | 7/8 | 8/8
Individual online activity related to capacity
 Has paramilitary, explosives training or information on explosives-bomb making | 0 | 2
 Uses multiple e-mail addresses, names to obscure identity | 2 | 2
Section E total | 2/4 | 4/4
Leadership, organizational skills
 Subject has large following, network, significant role as leader online | 2 | 0
Section F total | 2/2 | 0/2
Overall total (Sections A + B + C + D + E + F) | 42/50 | 38/50

Note. Low = 1–16; medium = 17–33; high = 34–50; CYBERA and VERA average of Incel forums = 38; CYBERA and VERA average of Minassian's transcript = 40.

Source. Pressman and Ivan (2019); self-identified.

In addition to analysis of language and terms, I conducted a manual review of incel forums and Minassian's interview to reveal non-textual details that contribute to risk rankings. This includes imagery in user-profiles and discussions, polls and surveys, and statistics on readership. I assess these findings according to the same standard of textual data.

As a result, I tallied risk rankings, composed of the assessed risk factors, to a total of 50 marks;5 a low-risk rating falls between 1 and 16 marks, a medium rating between 17 and 33 marks, and a high rating between 34 and 50 marks. I used CYBERA and VERA-2 to analyze both the incels.co forum and Minassian's interview individually, creating four total scores.
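As a minimal sketch of this tallying scheme (not the official VERA-2 instrument), the snippet below sums the section scores reported in Table 2 for the incels.co forum, subtracts the protective items, and applies the banding described above.

```python
# Illustrative tally of VERA-2 section scores, following the scheme described above.
def vera2_total(section_scores, protective_key="E_protective"):
    """Sum the risk sections and subtract the protective-items section."""
    risk_sum = sum(score for key, score in section_scores.items() if key != protective_key)
    return risk_sum - section_scores[protective_key]


def risk_band(total):
    """Band the overall total: low = 1-16, medium = 17-33, high = 34-50."""
    if total <= 16:
        return "low"
    if total <= 33:
        return "medium"
    return "high"


# Section scores for the incels.co forum, as reported in Table 2.
incels_forum_sections = {
    "A_beliefs_attitudes": 14,
    "B_context_intent": 10,
    "C_history_capability": 3,
    "D_commitment_motivation": 9,
    "E_protective": 2,
}
total = vera2_total(incels_forum_sections)
print(total, risk_band(total))  # 34 "high" (34/50, cf. Table 2)
```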

Limitations

Although incels.co is the largest incel forum, my analysis provides only one perspective of online participants’ activity while members are often involved in multiple social media groups. The format of these online forums also inhibits insight into highly personal or consistent relations between members, as it is designed for larger community discussion. I also only review public comments, not private messages. Importantly, Minassian refers to his private messaging with Rodger in discussing his grievances and plan of attack (Thomas, 2018). A wider analysis of online activity beyond public forums could enhance the understanding of the risk environment and the capability of extremist actors.

One of the key reasons for performing this research is also one of the main limitations: online discussion does not consistently translate to real-world violence. In particular, the format of online forums does not allow for concrete evidence to support claims of access to resources that could cause imminent danger.

Risk ratings are influenced by available information. Public information regarding his case derived from the Toronto Police Service continues to be limited as Minassian's trial is ongoing. Analysis of Minassian's case is limited to his police interview transcript and the use of media and imagery, as considered in CYBERA, is not captured in the method of analysis for Minassian's interview. This impacts the risk ranking of Minassian's case within CYBERA. Further investigation into his online lifestyle would provide a more fulsome risk assessment.

Findings and Discussion

Analysis of the incels.co forum and the Minassian case through VERA-2 and CYBERA suggests that traditional assessments of terrorism dismiss gendered harm and hegemonic masculinity ideology.

Findings

I ranked the incels.co forum with a rating of 34 out of 50 on VERA-2, indicating a moderately high level of risk toward violent extremism. I assigned the incels.co forum with a rating of 42 out of 50 on CYBERA, which indicates a high level of risk. In assessment of the Toronto Van Attack, I ranked the Minassian case with a 42 out of 50 on VERA-2, indicating a high level of risk. I assigned the Minassian case with a 38 out of 50 on CYBERA, which also indicates a high level of risk. Tables 2 and 3 provide a breakdown of risk factor ratings.

Risk Assessment Findings

In the assessment of incels.co through VERA-2, the highest marks are in the Beliefs and Attitudes risk section, equating to 14 out of 14, due to a high level of hostility toward society and rationalization for violent behavior. In the Context and Intent and Commitment and Motivation risk sections, I ascribed high ratings of 10 out of 14 and nine out of 10, demonstrating increased and violent aggravation tied to ideological cause. In contrast, I rated the History and Capability risk section with a three out of 12, due to minimal discussion on personal criminal history. The Protective Items risk section has a two out of 12, suggesting limited discourse on interpersonal support and indicating a higher likelihood toward radicalization.

In analysis of incels.co through CYBERA, the Semantic Content risk section and the Virtual Social Network Context section both have a ranking of seven out of eight, demonstrating aggression toward society and ideological loyalty. The Beliefs, Attitudes, Intention risk section has a 19 out of 20, due to aggravated discourse and indications of rejection of society. I prescribed the Leadership, Organizational Skills risk section with a two out of two due to the large following of the platform. The Imagery and Individual Online Activity Related to Capacity risk sections have a rating of five out of eight and two out of four, respectively, as there is moderate use of graphics to reinforce in-group culture and limited information on personal history.

Assessment of the Minassian case through VERA-2 revealed notably high scores of 10 out of 10 in the Commitment and Motivation section and 13 out of 14 in both the Beliefs and Attitudes and the Context and Intent sections. This is due to Minassian's justifications for violence, attitudes of moral supremacy, and intent to harm. I ascribed a score of seven out of 12 for the History and Capability section, informed by the influence of other incel actors and his acquisition of military training, while considering his lack of criminal history. The Protective Items section has a score of one out of 12, indicating a high level of risk as there is little evidence of community support or de-radicalization, though there is mention of emotional attachment to family.

In the assessment of the Minassian case through CYBERA, Semantic Content and Virtual Social Network Context sections have a score of eight out of eight, demonstrating aggressive language and social affiliations with radical individuals. I gave high rankings to the Beliefs, Attitudes, Intentions, and Individual Online Activity Related to Capacity sections amounting to 18 out of 20 and four out of four, respectively. This is due to Minassian's commitment to radical ideology, acquisition of weapons training for the purpose of violence, and development of online profiles to contact violent actors. I attributed low scores to the Imagery section resulting in zero out of eight due to limited access to available information. I gave the Leadership, Organizational Skills a low score of zero out of two due to his limited capacity as a leader within the movement.

Thematic Findings

Offline and Online Violence

Pressman's risk assessment frameworks effectively highlight the centrality of online networks but dismiss the connectivity of online and offline violence. Online connectivity is central to the distribution of violent ideas (Megarry, 2014). The usage of online communications for violent discussion is apparent in high ratings of the CYBERA Virtual Social Network Context risk section—I prescribed the incels.co forum with a score of seven out of eight, while Minassian has an eight out of eight. Within the incels.co forum itself, there are also 204 references to shooting that discuss the effectiveness of mass shootings and violence. The vast network of the incel community is apparent through the incels.co forum; within one month, there were 5,302 new conversation threads and 117,016 individual comments on incels.co. Collectively, there are 5,298,240 views across all threads.

The online sphere and network is central in the spread of violent ideology—Minassian discusses his use of Reddit and 4chan forums as a form of social support (Thomas, 2018). In particular, he discusses his friendship with Rodger as a point of radicalization (Thomas, 2018) and praises Rodger in his final Facebook post before his act of violence (Cain, 2018). Furthermore, Minassian states that the nature of online anonymity is attractive as “it makes it very easy and you can ah hide behind a computer screen” (Thomas, 2018, p. 92). Online concealment enables the continuation of cyberviolence (Van Der Wilk, 2018) and emboldens individuals to act in violence (Jaki et al., 2019).

I prescribed the incels.co forum with higher marks in CYBERA than in VERA-2, with overall scores of 42 versus 34 out of 50, respectively. While taking into consideration the limitations of available information, discrepant risk ratings, with a difference of eight marks, point to a fractured understanding of motivations toward violence and of how technology influences gendered harm. While CYBERA captures online dynamics and influences, VERA-2 focuses heavily on physical preparedness. The History and Capability risk section of VERA-2 describes prior physical training experience—the incels.co forum has a score of three out of 12, and Minassian has seven out of 12, respectively. The Virtual Social Network Context and Individual Online Activity Related to Capacity risk sections of CYBERA point to online training, resources, and social influences—I prescribed the incels.co forum with a combined score of nine out of 12 and Minassian with a 12 out of 12, respectively. Although these similar VERA-2 and CYBERA risk sections are both scored out of 12, the resultant assessments for both the incels.co forum and the Minassian case differ—the CYBERA risk ratings are overall higher than those of VERA-2. This demonstrates how technology is a key component for understanding gendered and extremist harm. There is a need to synthesize perspectives of online and offline inspirations for violence. While CYBERA and VERA-2 are complementary tools, divergence in risk assessments on similar characteristics can fracture a holistic perspective in capturing gendered violence; the separation of assessment of online and offline activities of the same violent behavior can cloud the reality of the threat and underestimate the impact of online gendered violence.

Lone-Wolf Actors

Lone-wolf and incel-related violence is often assumed to consist of isolated cases of aggression rather than patterns of GBV. In analyzing incels.co and the Minassian case, the glorification of violence is connected to the moral superiority of incel ideology and collectivity within group loyalty. In VERA-2, I prescribed both the incels.co forum and Minassian with a score of two out of two on the Commitment to Ideology and Driven by Moral Imperative, Moral Superiority risk factors. Minassian affirms the morality of violence by arguing that “they [should] acknowledge uhm the incel…as the more superior ones” (Thomas, 2018, p. 109). Moral superiority is also exemplified in the term supreme gentleman, used in the incel community to honor incel members that inflict violence against women (Incel Inside Wikipedia, 2020b). Through analyzing references to Alek Minassian in the incels.co forum, I identify 113 mentions of his name that glorify his actions. This includes phrases such as “Alek Minassian heroic sacrifice”; “Alek Minassian was doing God's work”; and “All hail Alek Minassian” (Incels.co, 2020). Violence is reaffirmed by collective reification of the moral superiority of incel ideology. Social influences toward extremism require the collective support of the incel community.

Acts of violence are assured by group loyalty and collectivity. Minassian has a 10 out of 10 risk rating in the VERA Commitment and Motivation risk section, indicating a willingness to act in violence out of group loyalty (Pressman & Ivan, 2019). Minassian identifies online incel forums as a form of social support (Thomas, 2018), discussing his planned attack online which was well-received by other members (Thomas, 2018). He stresses the importance of the online community, where members provide “encouraging support so that he would have the courage to ah start his rebellion” (Thomas, 2018, p. 115). His actions have also inspired others within the community to follow suit in discussing further plans of violence (Thomas, 2018). Lone-wolf actors reinforce their beliefs and justify gendered violence through online communities (Figure 1).

Figure 1. Word tree query of “Alek Minassian” within the incels.co forum. Source. NVivo; Incels.co (2020).

Social Identities and Terrorism

Risk frameworks align with traditional conceptions of terrorism based on race and religion, underemphasizing gendered ideology (Stampnitzky, 2017). In CYBERA, risk factors such as Adherence to Conspiracy Theories about the Affiliated Ethnic/Religious Group categorically dismiss gendered theories of violence, such as hegemonic masculinity. Gendered motivations remain dismissed as a factor for ideological violence. Furthermore, Detective Thomas attempts to ascribe assumptions of religious and racialized motivation in interviewing Minassian, a man of Iranian descent. This is evident in his series of questions, asking “Okay ah can I ask what religion you are…You don’t follow any denominations…Can I ask what religion you were raised…so you don’t have any religious ideology or…” (Thomas, 2018, p. 170). This demonstrates how Thomas attempts to prescribe religious and racial assumptions of terrorism upon Minassian while dismissing gendered ideologies as a motivation for violence.

Detective Thomas indulged gendered stereotypes in interviewing Minassian. When discussing how women, known as Stacys, pursue alpha men, known as Chads, Thomas states that “Exactly, the Stacey's are…the [ditzy] dumb girls dating ah the goofy…jocks” (Thomas, 2018, p. 87). Thomas draws on gendered stereotypes while simultaneously downplaying gendered ideology, neglecting to question why Minassian's attack targeted women pedestrians.

Gender and Terrorism

Analysis of gendered terminology reveals indications of aggravated violence against women, where gendered harm is a frequent theme and topic within the incel community; gendered violence must be incorporated in the assessment of extremism and terrorism. The incels.co forum has a seven out of eight ranking in CYBERA's Semantic Content risk section, highlighting grave hostility toward others and allegiance to an extremist narrative (Pressman & Ivan, 2019). While blatantly aggressive terminology, such as 204 counts of shooting and 49 counts of assault in incels.co, is readily understood as violent, a gendered analysis of the forum reveals the underlying nuances of violence against women (Hammer, 2002, as cited in Pain, 2014). Phrases such as going ER refer to the actions of Elliot Rodger, shortened to ER, and glorify Rodger's violence against women. This phrase is mentioned 119 times on incels.co. Forum participants argue that mass violence against women is the only solution for the retribution of perceived harm against men. Additionally, frequently used gendered phrases of sexual violence include rape fuck, going to rape, and get raped.

The beta uprising is mentioned 55 times in the incels.co forum. The beta uprising is a proposed incel revolution that would violently rectify perceived injustices committed by women who neglect incels in favor of more attractive men (Thomas, 2018). References to this phrase draw on dominance and superiority, demonstrating urgency and revolutionary undertones. In Figure 2, notable phrases associated with the beta uprising include “beta uprising is happening BOYOS, BRACE YOURSELVES”; “Only way is beta uprising”; and “beta uprising WHEN BOYS?” (Incels.co, 2020). The term BOYOS is a colloquial reference to boys, referring in this case to the incel community at large. Furthermore, Minassian repeatedly promotes the beta uprising ideology (Thomas, 2018). Minassian expresses no remorse for his actions but refers to the attack as a beta uprising in the completion of his incel operation (Thomas, 2018). Analysis of the term beta uprising demonstrates that gendered motivation in targeting mass violence against women is central to the incel ideology.

Figure 2. Word tree query of “Beta uprising” within the incels.co forum. Source. NVivo; Incels.co (2020).

Gendered Hegemonic Language

Critical discourse and semantic analysis of the incels.co forum and Minassian's interview reveal dominating hate speech and gendered hegemonic language, which seeks to re-establish a gendered hierarchy (Connell & Messerschmidt, 2005). I assigned the incels.co forum with a score of 14 out of 14 in the VERA-2 Beliefs and Attitudes risk section, demonstrating frequent usage of deprecating language used to subjugate women to their sexual performance. This includes 994 counts of roastie to describe a woman's labia after sexual intercourse and 880 counts of femoid to describe women as non-human and inferior objects (Squirrell, 2018). Other terms used to reaffirm male domination include cock carousel, cum dumpster, and cunt rag to denote women as inferior sexual objects. These descriptions are used in conjunction with depictions of sexual violence such as smash and dash, pump and dump, or fuck and chuck, to demonstrate the sexual domination and discard of women. These terms, which are not common to everyday language, reaffirm a sense of group identity and allow incel actors to adapt their language to evade mainstream attention (Sorial, 2015).

Furthermore, gendered language focuses on the degradation of women rather than commenting on men. While discourse surrounding men ranges from self-deprecation and victimization to empowerment through the beta uprising, discourse related to women focuses on denigration and deprecation. This allows incel members to reclaim masculine identity and reposition themselves as the dominant figure in the gender hierarchy. For example, incels adopt aggressive and lethal language against women, indicative of hate speech. The incels.co forum and Minassian both receive a ranking of two out of two in the VERA-2 Dehumanization and Demonization of Identified Targets of Injustice risk factor, demonstrating hateful language directed toward women. Pejorative language toward women is used in forums, such as cocksucker and fuckface. Other violent verbs used include 358 counts of beat, 99 counts of burn, 25 counts of choke, and 18 counts of slaughter. The frequent and graphic use of violent vocabulary directed toward women also affirms that hate speech targeted toward the female population often takes a more lethal and violent approach (El Sherief et al., 2018). The frequent use of hate speech, in conjunction with an online environment where group identity forms around mutual resentment toward women and normalization of violence, creates an increasingly volatile environment that contributes to GBV.

Recommendations and Conclusion

This paper has examined how the incel community is interpreted using Canadian violent extremism risk assessment tools and has evaluated how GBV is considered within frameworks for hate speech and terrorism. As a result of my findings, I suggest that existing literature on online GBV must expand to explore the implications of technology on women's experience online and for gendered harm. In addition, risk assessment frameworks must place importance on gendered violence, the continuum of online and offline harm, and the use of technology.

Existing literature and institutional definitions of online GBV focus on the sexualization of harm (Henry & Powell, 2015) or cyberbullying of young women (Status of Women Canada, 2019); however, this study demonstrates that emerging forms of online GBV are not always sexual and are rooted in fundamental beliefs that seek to establish gendered dominance. Expansion of Henry & Powell's (2018) term “technology-facilitated sexual violence” to “technology-facilitated gender-based violence (TFGBV)” extends definitions of online instances of harm against women to incorporate wider intent and experiences, rooted in conflicts surrounding gender roles and ideologies, such as hegemonic masculinity (Hinson et al., 2018). As demonstrated in this study, TFGBV, in the case of incels, can manifest in physical and psychological ways that draw on stereotypical tropes or dominating and lethal language to cause gendered harm that is not necessarily sexual.

Extremist risk assessment frameworks must be inclusive of GBV. Methods for countering violent extremism posed by the incel community require intercepting hateful speech online, bridging the online and offline divide, and providing additional support for young men (Hoffman et al., 2020). Within VERA-2, the Seeker, Consumer, Developer of Violent Extremist Material risk factor must include the consumption of incel manifestos and online motivations of violence against women. This study reveals that technology simultaneously enables the facilitation of GBV against women while protecting perpetrators of harm; technology is leveraged by gender-based ideological groups to expand their goals. Explicitly gender-aware risk assessment must effectively identify how technology contributes to the continuation of TFGBV. As forms of terrorism continue to evolve toward online and gendered violence, categorization of risk must be adaptable over time and place, without assumptions of ethnic and religious bias. This includes broadening the CYBERA Adhere to Conspiracy Theories about the Affiliated Ethnic/Religious Group risk factor to encompass fluid ideological theories and gendered dimensions. VERA-2 and CYBERA categories must reflect the evolving nature of terrorism.

As demonstrated, VERA-2 and CYBERA produce separate results when analyzing the same case study. While VERA-2 and CYBERA are complementary frameworks, a dichotomy between offline and online assessment prevents a unified analysis of extremism and does not acknowledge how violence operates on a continuum across online and offline. Risk assessment tools must connect how offline capabilities, such as access to physical resources, are tied to online activity such as information gathering and network-building.

Beyond CYBERA and VERA-2, risk assessment tools must simultaneously consider gendered and technological dynamics. Violent extremism tools must meaningfully integrate gender dynamics, while GBV risk assessment tools must analyze technological influence in violence. Violent extremism and domestic violence cannot be assessed as mutually exclusive forms of violence, as this allows online groups such as incels to continue to be characterized as lone-wolf actors. This study demonstrates a need to revise risk assessment tools to be inclusive of gendered ideological motivations and the technologization of harm. Emerging threat actors can capitalize on the online sphere to mobilize violence. Technology is mobilized by the incel community to further extremist ideological goals and perpetuate online and offline GBV against women.

Acknowledgments

I thank Dr. Cathy McIlwaine for her expertise and endless support throughout the completion of this study. Catharina O’Donnell provided valuable discussion about research direction as well as comments on drafts, and Dr. Kelly Gordon provided guidance on editorial direction. Jason Masih and Christopher Balian provided advice on data acquisition. I also thank Cherry Wu and Vivian Qiang for their encouragement throughout the research process.

Author Biography

Esli Chan's research focuses on the intersection of gender, technology, and extremism. She is currently a PhD candidate in Political Science at McGill University. She holds a Master of Science in Risk Analysis (Distinction) from King's College London and a Bachelor of Arts in Political Science (Honours) from McGill University.

Appendix 1.

Lexicon-defined terms manually assigned to associated CYBERA and VERA-2 frameworks.

Framework | Risk section | Risk factors | Associated terms from lexicons
CYBERA | Beliefs & Attitudes | Adherence to conspiracy theories about the affiliated ethnic/religious group | BBC theory, theory
CYBERA, VERA-2 | Beliefs & Attitudes | Commitment to ideology justifying violence | Assault, attack, blood, death, massacre, shoot, shootings, assail, bang, batter, blast, block, bring down, bruise, brutalize, butcher, carry off, clobber, constrain, crush, decimate, demolish, destroy, dispose of, drown, enslave, exterminate, finish off, flagellate, gag, hit, kick, maul, obliterate, pelt, plunk, pummel, punch, raid, ram, shove, slam, slap, slaughter, slog, smack, smash, smother, stab, strangle, strike, thrash, thwack, trample, whip
VERA-2 | Beliefs & Attitudes | Dehumanization/demonization of identified targets of injustice | Bulldyke, cock carousel, cocksucker, demons, fag, false rape accusation, femoid, fuckface, lardass, mangle, negress, roastie, SMV, stacey, target, Western Women, WW, Ameriskanks, skank, cunt, twat, whores, whore, dyke, slut, sluts, pussy, baleia, ass whore, bitch ass, bunga, carpet muncher, chode, cocksucker, cocksucking, cumdumpster, cunt rag, gangbang, lolita, milf, cherry popper, clitfuck, cock block, cock tease, conquer, gangbang, incest, lolita, molest, molestation, pound, sodomize, spank, unclefucker, virginbreaker
CYBERA, VERA-2 | Beliefs & Attitudes | Feelings of hate, frustration, persecution, alienation | Beta, cope, cuck, dumb fuck, harem, LDAR, rope, unfuckable, dumbfuck, fuckhead, fucktard, scumbag, bite me, chav, eat pussy, gangbanger, slag
CYBERA, VERA-2 | Beliefs & Attitudes | Hostility to national collective identity | Canadian, inbred, indians, arab, assnigger, beaner, Chinaman, chink, darky, drake, gipsy, gook, goy, gringo, guido, gypsy, half-breed, half-caste, hapa, heeb, injun, jap, jerry, kaffir, kafir, kike, kraut, mick, mulatto, native, negro, negroes, negroid, nig, niger, nigga, niglet, nignog, nonwhite, oriental, paddy, polack, paki, porch monkey, primitive, pygmy, raghead, sandnigger, snownigger, spade, spic, spook, uncle tom, WASP, waspy, wetback, white trash, whitey, top, Yank, yankee, albino, anglo, can eater, ching chong, cracker, dego, dot head, fob, Greaser, heeb, hodgie, mick, mulatos, mutt, pancake face, slant, towel head
CYBERA, VERA-2 | Beliefs & Attitudes | Lack of understanding, tolerance outside own group | AWALT, chad, creep shaming, divorce rape, gaming
CYBERA, VERA-2 | Beliefs & Attitudes | Perceived victim of injustice, grievances for collective group or individual | Brutal, chang, tyrone, chadpreet, cruel, equalism, fairy, it's over, it never began, pain, race pill, scream, suffer, TFL, torture
CYBERA, VERA-2 | Beliefs & Attitudes | Rejection of democratic society and values | Beastiality, coon, crime, democrats, felony, JBW, just be white, law, numale, oneitis, prison, racist, SJW, snowflake, western women, white knight
VERA-2 | Commitment & Motivation | Commitment to group, group ideology | Androphile, apex fallacy, black pill, Blackops2cel, BTFO, currycel, dark triad, ethnicel, incel brothers, mangina, red pill, ricecel, whitepill, womb to tomb, wizard
VERA-2 | Commitment & Motivation | Driven by criminal opportunism | Molest, molestation, molester, molestor, opportunity
VERA-2 | Commitment & Motivation | Driven by excitement, adventure | War, fight, prevail, army
VERA-2 | Commitment & Motivation | Driven by moral imperative, moral superiority | Beta uprising, nofap, shitcunt, smash and dash, pump and dump, fuck and chuck, smear, sub human
VERA-2 | Commitment & Motivation | Glorification of violent action | Alek Minassian, beat, chambers, Elliot Rodger, Saint Elliot, ER, stomp, supreme gentleman, throat, cough, neck, breath, Andreas Lubitz, Cho, Colez, Misclegend
VERA-2 | Context & Intent | Anger and expressed intent to act violently | Annihilate, assassinate, assassination, burn, choke, cut, going ER, knife, loaded gun, mog, murder, punish, rape, rebellion, rifles, slaughter, subjugate, suppress, suey, visit Gandy, visit Grier, visit Orton, visit Kinney, suicide fuel, suifuel, terrorize, threaten, weapon
VERA-2 | Context & Intent | Expressed intent to plan, prepare violent action | Clitoris, scrotum, get raped, gun, kill, kill all, kill yourself, menace, overthrow, uprising
VERA-2 | Context & Intent | Identification of target (person, place, group) for attack | Target
VERA-2 | Context & Intent | Personal contact with violent extremists | Alek Minassian, Elliot Rodger, Saint Elliot, ER
VERA-2 | Context & Intent | Seeker, consumer, developer of violent extremist materials | Manifesto
VERA-2 | Context & Intent | Susceptible to influence, authority, indoctrination | All hail, HH, 88
VERA-2 | Context & Intent | Willingness to die for cause | I would die for, suey, visit Gandy, visit Grier, visit Orton, visit Kinney, suicide fuel, suifuel
VERA-2 | History & Capability | Access to funds, resources, organizational skills | Money
VERA-2 | History & Capability | Early exposure to pro-violence militant ideology | How old are
VERA-2 | History & Capability | Extremist ideological training | Radicalized
VERA-2 | History & Capability | Network (family, friends) involved in violent action | My brother, my dad, my family
VERA-2 | History & Capability | Prior criminal history of violence | Was in jail
VERA-2 | History & Capability | Tactical, paramilitary, explosives training | Military
CYBERA | Imagery | Frequently changes profile and cover photos, inconsistent identity | N/A
CYBERA | Imagery | Organizes site or page, rubrics to glorify violence, promote aggression, rigid worldview |
CYBERA | Imagery | Uses graphic representations, symbols or logos of terrorist groups, steganography | Pepe the frog
CYBERA | Imagery | Uses musical background to incite hatred, recruitment to violent action | N/A
CYBERA | Individual Online Activity Related to Capacity | Has paramilitary, explosives training or information on explosives-bomb making | Military, weapon, war
CYBERA | Individual Online Activity Related to Capacity | Uses multiple e-mail addresses, names to obscure identity | N/A
CYBERA | Leadership, Organizational Skills | Subject has large following, network, significant role as leader online | Alpha, AMOG
VERA-2 | Protective Items | Change of vision of enemy | White knight
VERA-2 | Protective Items | Community support for non-violence | Community, kid
VERA-2 | Protective Items | Family support for non-violence | Friendless, handholdless, kissless, hugless, my family
VERA-2 | Protective Items | Involvement with non-violent, de-radicalization, offense related programs | Therapy
VERA-2 | Protective Items | Re-interpretation of ideology less rigid, absolute | Purplepill
VERA-2 | Protective Items | Rejection of violence to obtain goals | Violence is not
CYBERA | Semantic Content | Personal, terrorist narrative to promote extremist views, aggression, grievances | Attack, assault, death, shoot, rope, fight, my brothers, blackpill, going ER, suey, suifuel, terrorize, murder, kill, kill all, kill yourself, uprising
CYBERA | Semantic Content | Uses idealized content about own community | Beta uprising, rebellion, all hail, my brothers, overthrow, prevail, fight, sub human, supreme gentleman
CYBERA | Semantic Content | Uses language to display extremist views, allegiance to terrorist group, narrative | All hail, HH, 88, beta uprising, rebellion, beta, it's over, it never began, black pill, red pill, whitepill, womb to tomb, wizard, supreme gentleman
CYBERA | Semantic Content | Uses multiple or alternative narratives related to identity | N/A
CYBERA | Virtual Social Network Context | Establishes friendship bonds, networks other violent extremists online | Community, inceldom, my brother
CYBERA | Virtual Social Network Context | Is affiliated with an online group or social media promoting terrorist violence | 4chan, cucktears, foreveralone, inceltears, pol, r9k, reddit
CYBERA | Virtual Social Network Context | Repetitively access blogs, extremist forums, terrorist information, know-how | N/A
CYBERA | Virtual Social Network Context | Susceptible to influence, authority, indoctrination | All hail, HH, 88

Source. Pressman and Flockton (2012), Pressman and Ivan (2019), Incels.co (2020); self-identified.
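
The terms above were assigned manually to the CYBERA and VERA-2 risk factors and examined in the corpus through word tree queries (see Note 2). Purely as an illustration of how such a matrix could be operationalized, the sketch below matches a small, hypothetical excerpt of the Appendix 1 mapping against a sample post and counts how many of each risk factor's lexicon terms appear. The function name, the three-factor excerpt, and the sample text are illustrative assumptions; they are not part of the VERA-2 or CYBERA instruments or of the study's NVivo workflow.

```python
import re
from collections import Counter

# Hypothetical excerpt of the Appendix 1 matrix: each risk factor is paired with
# a handful of its associated lexicon terms (the full matrix is far larger).
RISK_FACTOR_TERMS = {
    "Commitment to ideology justifying violence": ["assault", "attack", "massacre", "shoot"],
    "Glorification of violent action": ["supreme gentleman", "saint elliot", "going er"],
    "Anger and expressed intent to act violently": ["rope", "uprising", "rebellion"],
}

def flag_risk_factors(post):
    """Count how many of each risk factor's lexicon terms appear in a post."""
    text = post.lower()
    counts = Counter()
    for factor, terms in RISK_FACTOR_TERMS.items():
        for term in terms:
            # Whole-word/phrase matching avoids counting substrings (e.g. "rope" in "Europe").
            if re.search(r"\b" + re.escape(term) + r"\b", text):
                counts[factor] += 1
    return counts

sample = "The beta uprising is coming; our supreme gentleman showed the way."
for factor, n in flag_risk_factors(sample).most_common():
    print(f"{factor}: {n} matching term(s)")
```

A count of this kind only indicates which risk factors a text could be coded against; in the study itself, assignment and interpretation remained manual and context-sensitive.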

1. The Incel Inside Wikipedia glossary contained 380 terms and the Manosphere Wikipedia glossary contained 136 terms as of 18 July 2020; however, both glossaries are publicly editable, so terms may be added or deleted.

2. Refer to Figure 1 for an example of a word tree query.

3. Refer to Appendix 1 for a matrix of terms assigned to VERA-2 and CYBERA risk factors.

4. Refer to Tables 2 and 3 for an overview of the VERA-2 and CYBERA frameworks.

5. Each risk factor is weighted at 1 mark, for a total of 50 marks. Protective items are excluded from this 50-mark total but are subtracted from it, as they indicate mitigation of risk. The 50-mark scale is divided into thirds to provide a relative scale from low to high risk (a minimal sketch of this scoring follows).
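
Note 5 describes a simple additive scoring scheme. The following is a minimal sketch of that arithmetic, assuming each risk factor is coded as present (1) or absent (0); the function name, variable names, and the low/moderate/high band labels are illustrative choices, not terminology from the published VERA-2 or CYBERA protocols.

```python
# A minimal sketch of the scoring described in Note 5, assuming binary coding of
# each factor as present (1) or absent (0). Names and band labels are
# illustrative, not taken from the VERA-2/CYBERA instruments.

def score_risk(risk_factors, protective_items):
    """Return a total score out of 50 and a relative low/moderate/high band."""
    total = sum(risk_factors.values())       # each of the 50 risk factors carries 1 mark
    total -= sum(protective_items.values())  # protective items are subtracted (risk mitigation)
    # The 50-mark range is divided into thirds to give a relative scale.
    if total <= 50 / 3:
        band = "low"
    elif total <= 2 * 50 / 3:
        band = "moderate"
    else:
        band = "high"
    return total, band

# Example: 30 of 50 risk factors present, 2 protective items present.
risk = {f"factor_{i}": int(i < 30) for i in range(50)}
protective = {"community_support": 1, "involvement_in_therapy": 1}
print(score_risk(risk, protective))  # -> (28, 'moderate')
```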

Footnotes

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.

References

  1. Allely C. S., Faccini L. (2017). ‘Path to intended violence’ model to understand mass violence in the case of Elliot Rodger. Aggression and Violent Behavior, 37, 201–209. 10.1016/j.avb.2017.09.005 [DOI] [Google Scholar]
  2. Baele S., Brace L., Coan T. (2019). From “Incel” to “Saint”: Analyzing the violent worldview behind the 2018 Toronto attack. Terrorism and Political Violence, 33(8), 1–25. 10.1080/09546553.2019.1638256 [DOI] [Google Scholar]
  3. Baruch B., Ling T., Warnes R., Hofman J. (2018). Evaluation in an emerging field: Developing a measurement framework for the field of counter-violent-extremism. Evaluation, 24(4), 475–495. 10.1177/1356389018803218 [DOI] [Google Scholar]
  4. BBC News. (2018). Elliot Rodger: How misogynist killer became ‘incel hero’. BBC. https://www.bbc.com/news/world-us-canada-43892189
  5. Beardsley N., Beech A. (2013). Applying the violent extremist risk assessment (VERA) to a sample of terrorist case studies. Journal of Aggression, Conflict and Peace Research, 5(1), 4–15. 10.1108/17596591311290713 [DOI] [Google Scholar]
  6. Bell S., Russell A., McDonald C. (2020, May 19). Deadly attack at Toronto erotic spa was incel terrorism, police allege. Global News. https://globalnews.ca/news/6910670/toronto-spa-terrorism-incel/
  7. Bouder F., Slavin D., Lofstedt R. (2007). The tolerability of risk: A new framework for risk management. Earthscan. [Google Scholar]
  8. Bratich J., Banet-Weiser S. (2019). From pick-up artists to incels: Con(fidence) games, networked misogyny, and the failure of neoliberalism. International Journal of Communication, 13, 5003–5027. https://ijoc.org/index.php/ijoc/article/view/13216 [Google Scholar]
  9. Byerly C. (2020). Incels online reframing sexual violence. The Communication Review, 23(4), 290–308. 10.1080/10714421.2020.1829305 [DOI] [Google Scholar]
  10. Cain P. (2018). What we learned from Alek Minassian’s incel-linked Facebook page – and what we’d like to know. Global News. https://globalnews.ca/news/4164340/alek-minassian-facebook-page/
  11. Connell R. W., Messerschmidt J. (2005). Hegemonic masculinity: Rethinking the concept. Gender & Society, 19(6), 829–859. 10.1177/0891243205278639 [DOI] [Google Scholar]
  12. Davidson T. (2019). Hate-speech-and-offensive-language. Github. https://github.com/t-davidson/hate-speech-and-offensive-language/tree/master/lexicons
  13. Dynel M. (2020). Vigilante disparaging humor at r/IncelTears: Humor as critique of incel ideology. Language & Communication, 74, 1–14. 10.1016/j.langcom.2020.05.001 [DOI] [Google Scholar]
  14. El Sherief M., Kulkarni V., Nguyen D., Wang W., Belding E. (2018). Hate lingo: A target-based linguistic analysis of hate speech in social media. arXiv, 42–51. 10.48550/arXiv.1804.04257 [DOI] [Google Scholar]
  15. Farrell T., Fernandez M., Novotny J., Harith A. (2019). Exploring misogyny across the manosphere in Reddit. WebSci’19 Proceedings of the 10th ACM Conference on Web Science, 87–96. 10.1145/3292522.3326045 [DOI] [Google Scholar]
  16. Geen R., Stonner D. (1975). Primary associates to 20 verbs connoting violence. Behavior Research Methods & Instrumentation, 7(4), 391–392. 10.3758/BF03201552 [DOI] [Google Scholar]
  17. Hammer R. (2002). Antifeminism and family terrorism: A critical feminist perspective. Rowman & Littlefield Publishers. [Google Scholar]
  18. Hart S., Cook A., Pressman E., Strang S., Lim Y. (2017). A concurrent evaluation of threat assessment tools for the individual assessment of terrorism. Canadian Network for Research on Terrorism, Security, and Society, 17(1), 1–59. [Google Scholar]
  19. Hassanpour N. (2013). Tracking the semantics of politics: A case for online data research in political science. Political Science and Politics, 46(2), 299–306. 10.1017/S1049096513000280 [DOI] [Google Scholar]
  20. Hatebase. (2020). The world’s largest structured repository of regionalized, multilingual hate speech. https://hatebase.org/
  21. Henry N., Powell A. (2015). Embodied harms: Gender, shame, and technology-facilitated sexual violence. Violence Against Women, 21(6), 758–779. 10.1177/1077801215576581 [DOI] [PubMed] [Google Scholar]
  22. Henry N., Powell A. (2018). Technology-facilitated sexual violence: A literature review of empirical research. Trauma, Violence, & Abuse, 19(2), 195–208. 10.1177/1524838016650189 [DOI] [PubMed] [Google Scholar]
  23. Hinson L., Mueller J., O’Brien-Milne L., Wandera N. (2018). Technology-facilitated gender-based violence: What is it, and how do we measure it?. Washington, DC: International Center for Research on Women. [Google Scholar]
  24. Hoffman B., Ware J., Shapiro E. (2020). Assessing the threat of incel violence. Studies in Conflict and Terrorism, 43(7), 565–587. 10.1080/1057610X.2020.1751459 [DOI] [Google Scholar]
  25. Incel Inside Wikipedia. (2020a). /r/inceltears. https://incels.wiki/w//r/inceltears
  26. Incel Inside Wikipedia. (2020b). Incel term glossary. https://incels.wiki/w/Incel_Term_Glossary
  27. Incel Inside Wikipedia. (2020c). Incels.co. https://incels.wiki/w/Incels.co
  28. Incels.co. (2020). Incels.co involuntary celibate forums. https://incels.co
  29. Jaki S., De Smedt T., Gwóźdź M., Panchal R., Rossa A., De Pauw G. (2019). Online hatred of women in the incels.me forum: Linguistic analysis and automatic detection. Journal of Language Aggression and Conflict, 7(2), 240–268. 10.1075/jlac.00026.jak [DOI] [Google Scholar]
  30. Janis I. (1991). Groupthink. In A first look at communication theory (pp. 235–246).
  31. Lilly M. (2016). ‘The world is not a safe place for men’: The representational politics of the manosphere. University of Ottawa. [Google Scholar]
  32. Lloyd M., Dean C. (2016). The development of structure guidelines for assessing risk in extremist offenders. Journal of Threat Assessment and Management, 2(1), 40–52. 10.1037/tam0000035 [DOI] [Google Scholar]
  33. Logan C., Lloyd M. (2019). Violent extremism: A comparison of approaches to assessing and managing risk. Legal and Criminological Psychology, 24(1), 141–161. 10.1111/lcrp.12140 [DOI] [Google Scholar]
  34. Lupton D. (1999). Risk. Routledge. [Google Scholar]
  35. Mahoney K. (2009). Hate speech, equality, and the state of Canadian law. Wake Forest Law Review, 44(1), 321–352. [Google Scholar]
  36. Malthaner S., Kindekilde L. (2017). Analyzing pathways of lone-actor radicalization: A relational approach. In Stohl M., Burchill R., Englund S. (Eds.), Constructions of terrorism: An interdisciplinary approach to research and policy (pp. 163–180). University of California Press. [Google Scholar]
  37. Marwick A. E., Caplan R. (2018). Drinking male tears: Language, the manosphere, and networked harassment. Feminist Media Studies, 18(4), 543–559. 10.1080/14680777.2018.1450568 [DOI] [Google Scholar]
  38. McCulloh J., Maher J. M., Fitz-Gibbon K., Segrave M., Roffee J. (2016). Review of the family violence risk assessment and risk management framework (CRAF). Monash University. [Google Scholar]
  39. Megarry J. (2014). ‘Online incivility or sexual harassment? Conceptualising women’s experiences in the digital age’. Women’s Studies International Forum, 47(A), 46–55. 10.1016/j.wsif.2014.07.012 [DOI] [Google Scholar]
  40. Mullet D. (2018). A general critical discourse analysis framework for educational research. Journal of Advanced Academics, 29(2), 116–142. 10.1177/1932202X18758260 [DOI] [Google Scholar]
  41. Nockelby J. (2000). Hate speech. Encyclopedia of the American Constitution, 3, 1277–1279. [Google Scholar]
  42. O’Donnell C., Shor E. (2022). ‘This is a political movement, friend’: Why “incels” support violence. British Journal of Sociology, 73(2), 336–351. 10.1111/1468-4446.12923 [DOI] [PubMed] [Google Scholar]
  43. O’Malley R., Holt K., Holt T. (2020). An exploration of the involuntary celibate (incel) subculture online. Journal of Interpersonal Violence, 37(7–8), 1–28. 10.1177/0886260520959625 [DOI] [PubMed] [Google Scholar]
  44. Pain R. (2014). Everyday terrorism: Connecting domestic violence and global terrorism. Progress in Human Geography, 38(4), 531–550. 10.1177/0309132513512231 [DOI] [Google Scholar]
  45. Platt B. (2018, January 22). Liberals reviewing option to revive controversial internet hate speech law repealed in 2013. National Post. https://nationalpost.com/news/politics/liberals-reviewing-option-to-revive-controversial-hate-speech-law-repealed-in-2013
  46. Pressman E. (2009). Risk assessment decisions for violent political extremism. Public Safety Canada. https://www.publicsafety.gc.ca/cnt/rsrcs/pblctns/2009-02-rdv/index-en.aspx
  47. Pressman E., Flockton J. (2012). Calibrating risk for violent political extremists and terrorists: The VERA 2 structured assessment. The British Journal of Forensic Practice, 14(4), 237–251. 10.1108/14636641211283057 [DOI] [Google Scholar]
  48. Pressman E., Ivan C. (2019). Internet use and violent extremism: A cyber-VERA risk assessment protocol. In Khosrow-Pour M. (Ed.), Multigenerational online behaviour and media use: Concepts, methodologies, tools, and applications (pp. 43–61). IG Global. [Google Scholar]
  49. Public Safety Canada. (2018). 2018 public report on the terrorism threat to Canada. https://www.publicsafety.gc.ca/cnt/rsrcs/pblctns/pblc-rprt-trrrsm-thrt-cnd-2018/index-en.aspx
  50. Rational Wikipedia. (2020). Manosphere glossary. https://rationalwiki.org/wiki/Manosphere_glossary
  51. Regehr K. (2020). How technology facilitated misogyny moves violence off screens and on to streets. New Media & Society, 24(1), 138–155. 10.1177/1461444820959019 [DOI] [Google Scholar]
  52. Rezvan M. (2018). Harassment-corpus. Github. https://github.com/Mrezvan94/Harassment-Corpus
  53. Scribd. (2018). Alek Minassian interview. https://www.scribd.com/document/427612854/Alek-Minassian-Interview
  54. Sorial S. (2015). Hate speech and distorted communications: Rethinking the limits of incitement. Law and Philosophy, 34, 299–324. [Google Scholar]
  55. Squirrell T. (2018, May 31). A definitive guide to Incels part two: The A-Z incel dictionary. https://www.timsquirrell.com/blog/2018/5/30/a-definitive-guide-to-incels-part-two-the-blackpill-and-vocabulary
  56. Stampnitzky L. (2017). Can terrorism be defined? In Stohl M., Burchill R., Englund S. (Eds.), Constructions of terrorism: An interdisciplinary approach to research and policy (pp. 11–20). University of California Press. [Google Scholar]
  57. Status of Women Canada. (2019). A year in review 2018-2019. Canada’s strategy to prevent and address gender-based violence. https://cfc-swc.gc.ca/violence/strategy-strategie/report-rapport2019-en.html
  58. Stoleru M., Costescu E. (2014). (Re)Producing violence against women in online spaces. Philobiblon, 19(1), 95–114. [Google Scholar]
  59. Tanczer L., Neira I., Parkin S., Patel T., Danezis G. (2018). Gender and IOT research report: The rise of the internet of things and implications for technology-facilitated abuse. UCL. [Google Scholar]
  60. The Mob Reporter - Youtube. (2019, September 17). Alek Minassian – FULL police interrogation of incel van attack driver [Video]. https://www.youtube.com/watch?v=VyHgtSy41VM&t=4808s
  61. Thomas R. (2018). Electronically recorded interview of Alek Minassian by Detective Robert Thomas (3917) of the Sex Crimes Unit Polygraph Unit on Monday, April 23, 2018, at 2246 hours. Toronto Police Service. [Google Scholar]
  62. United Nations Broadband Commission for Digital Development Working Group on Broadband and Gender. (2015). Cyber violence against women and girls. https://www.broadbandcommission.org/Documents/reports/bb-wg-gender-discussionpaper2015-executive-summary.pdf
  63. United Nations General Assembly. (2018). Report of the special rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective. https://www.ohchr.org/EN/HRBodies/HRC/RegularSessions/Session38/Documents/A_HRC_38_47_EN.docx
  64. United Nations Office on Drugs and Crime. (2018). Defining terrorism. https://www.unodc.org/e4j/en/terrorism/module-4/key-issues/defining-terrorism.html
  65. Van Der Wilk A. (2018). Cyber violence and hate speech online against women. Study for the FEMM Committee, European Parliament. [Google Scholar]
  66. Walker J. (2018). Hate speech and freedom of expression: Legal boundaries in Canada. Library of Parliament. https://lop.parl.ca/sites/PublicWebsite/default/en_CA/ResearchPublications/201825E
  67. Warner W., Hirschberg J. (2012). Detecting hate speech on the world wide web. Proceedings of the Second Workshop on Language in Social Media, 19–26. [Google Scholar]
  68. Wilson J. (2018, April 24). Toronto van attack: Facebook post may link suspect to misogynist ‘incel’ subculture. The Guardian. https://www.theguardian.com/world/2018/apr/24/toronto-van-attack-facebook-post-may-link-suspect-with-incel-group
  69. Winter A. (2019). Online hate: From the far-right to the ‘alt-right’, and from the margins to the mainstream. In K. Lumsden & E. Harmer (Eds.), Online othering: Exploring violence and discrimination on the web (pp. 39–63). Palgrave. https://www.palgrave.com/9783030126322. [Google Scholar]
  70. Zimonjic P. (2018, April 23). Deadly van ‘attack’ is not part of organized terror plot, says Public Safety Minister Ralph Goodale. CBC News. https://www.cbc.ca/news/politics/federal-leaders-respond-van-incident-1.4631909
