2026 Mar 16;61(2):e70202. doi: 10.1002/ijop.70202

The Psychological Pathways to AI Resistance: Religiosity, Populism and Anxiety as Predictors of AI Engagement

Zeynep Aytaç 1, Erdal Yıldırım 2, Muhammed Bilgehan Aytaç 3,
PMCID: PMC12993245  PMID: 41840953

ABSTRACT

This study investigates how religiosity affects individuals' intentions to engage with artificial intelligence (AI) using a non‐WEIRD sample (N = 422). We examined populist beliefs and AI‐related anxiety as mediators in this relationship. Results indicate that higher religiosity is associated with stronger populist attitudes, which increase AI anxiety. Elevated AI anxiety then leads to reduced willingness to engage with AI technologies. The serial mediation pathway from religiosity through populism and AI anxiety to AI engagement intention was marginally significant, suggesting a potential indirect effect. Additionally, religiosity directly negatively predicts AI engagement and positively predicts populism, while populism strongly predicts AI anxiety, which negatively impacts AI engagement. These findings reveal important psychological mechanisms underlying how religiosity shapes attitudes towards AI adoption.

Keywords: AI, AI anxiety, ChatGPT, populism, religiosity

1. Introduction and Literature Review

The pace of advancements in AI is so rapid that behaviours once deemed intelligent in machines just a few years ago are now seen as scarcely notable. Today, computers and machines are far more capable of performing tasks that were once exclusive to humans. They have become more intelligent, interacting through voice and executing various functions with incredible speed and accuracy. Artificial intelligence, demonstrated by machines exhibiting elements of human intelligence, is increasingly used in services and is now a significant driver of innovation (Rust and Huang 2014). Through the development and assessment of sophisticated programs and gadgets known as intelligent agents, which are capable of carrying out a variety of tasks, artificial intelligence (AI) has impacted the way we go about our daily lives.

Although AI technologies offer potential benefits, public acceptance, adoption, and continued use still vary significantly. This variation can be attributed to several psychological and cultural factors. Addressing these concerns is especially essential in the Global South, where limited access to AI education, infrastructure, and investment can deepen existing inequalities (Das et al. 2024). In this study, we address these concerns through religious beliefs and populist attitudes towards science and technology, as commonly observed among Turkish consumers in Turkey, a developing country of the Global South. Turkey, with a Muslim population exceeding 90%, is governed by a political party that embraces a liberal interpretation of Islam while maintaining secular law. Furthermore, while the nation has long been a European Union candidate striving to integrate with the West, fundamentalist narratives in politics are simultaneously on the rise. Recent statistics reveal that secularisation is increasing prominently among younger cohorts; however, religiosity remains a consistent influence on consumers' daily decisions. Although the current government has successfully adopted modern technological and scientific developments, populism continues to be a cornerstone of its political marketing strategies. All these contradictions make the country a unique and worthwhile context for researching prominent topics like populism and attitudes towards AI.

Populism is primarily driven by powerful anti‐arguments towards the academic elite and, unlike political populism, ‘focuses on the core logic of science and epistemic authority’ (Mede and Schäfer 2020, 484). On this view, scientists are seen as looking out for their own interests or those of the companies or politicians they work with, and as protecting their own careers rather than engaging with the realities of hotly debated issues such as climate change or vaccines (Żuk and Żuk 2020). Since AI tools like ChatGPT are products of modern scientific advancement, we posit that populism, as an attitude, will be a significant determinant of AI anxiety and, accordingly, of intentions to use AI tools like ChatGPT.

Studies have illustrated that religiosity may be related to greater scepticism towards technological advancements, often due to perceived conflicts with religious beliefs and values. Religiosity has also been found to affect the acceptance and use of several technologies, such as the Internet and digital media, with religious individuals sometimes demonstrating more conservative and cautious approaches (Campbell 2005; Lilley 2007). Many religious doctrines assert the unique and special status of humans in the universe. AI, which mimics human cognition and behaviour, might be perceived as a threat to this uniqueness, leading to resistance or cautious acceptance among highly religious individuals. Religiosity and populism can intersect in ways that reinforce each other, particularly in contexts where religious groups feel threatened by secularisation and globalisation and where populist leaders successfully tap into religious narratives and communities (Yilmaz and Morieson 2021). Concerns about the implications of AI for privacy, human dignity, and moral behaviour can affect the acceptance and use of ChatGPT among religious individuals. Populist sentiments can amplify these concerns by framing AI as a tool of the elite that undermines traditional values. In this study, religiosity is explored as a predictor of populism and AI anxiety, while its impact on the intention to embrace AI is also examined. Embracement of new products, consumer innovativeness, and openness to change are also negatively related to religiosity (e.g., Rathore and Mahesh 2021). According to the theory of the Religious‐Social Shaping of Technology (RSST), religious communities do not directly reject new technologies; instead, they negotiate with them (Campbell 2005). They evaluate the features of new technologies within their belief systems to see whether they comply with their values. Thus, people high in religiosity are considered late adopters of new technological developments (Modliński et al. 2022).

H1. Religiosity negatively predicts AI engagement intention.

Almost all religious teachings place a strong emphasis on human beings (Amormino et al. 2022), particularly figures such as prophets and saints. Religious beliefs often emphasise human uniqueness and the sanctity of life, which can lead to concerns about the applications of AI technologies, including job displacement and ethical considerations such as bias and privacy. These concerns align with the dimensions of AI anxiety identified by Wang and Wang (2022), such as ‘job replacement anxiety’ and ‘AI configuration anxiety’, which reflect fears about the impact of AI on human roles and its potential to mimic human capabilities. Furthermore, religion is, in essence, authentic and traditional, and new technologies, especially anthropomorphised technologies such as robots and AI tools, can be seen as a threat to traditions and to the sanctity of the human. Thus, we posit:

H2. Religiosity positively predicts AI anxiety.

In recent years, populism has increasingly gained traction in Western democracies, framing the (supposedly) virtuous ordinary people in opposition to the (supposedly) corrupt elites. Such populist views are found not only in politics but also in the media, economics, law, and science. Science‐related populism targets scientific elites, casting them as antagonists of the ordinary people (Mede and Schäfer 2020). In this conceptualisation of science‐related populism, the corrupt elite consists of scientists, experts, and scientific institutions. Populist views against science have gained considerable power in many countries (e.g., Vasilopoulos and Jost 2020), accusing scientists of deceiving the public on controversial issues such as COVID‐19, genetically modified foods, and climate change (Merkley 2020) and of producing conspiracy theories on these issues (Mede et al. 2021). Traditionally, the source of knowledge has been conceptualised as having two poles: one rooted in religion and the other in science. Religiosity often intensifies the tension between these opposing epistemologies (Preston et al. 2025) and is also associated with conspiracy theories (Brzóska et al. 2022; Nowak et al. 2022). Religion‐based populist attitudes are widespread in politics globally: the people represent the good, the elite represent the evil, and leaders imitate the prophets (Yilmaz and Morieson 2021). Historically, populist movements in almost all societies have benefited from religious rhetoric, and we expect to find the same pattern here. Thus:

H3. Religiosity positively predicts populism.

Recent advancements in artificial intelligence (AI) are increasingly influencing populist political narratives, and this trend is expected to grow as AI‐driven automation leads to more job losses. Levy (2018) suggests that in this evolving political landscape, the ‘people’, a core concept in populism, will primarily consist of workers who have lost their jobs to AI, while the ‘elite’ will be represented by the software developers and venture capitalists who drive AI innovation and profit from it. Following this perspective, individuals who resonate with populist ideologies may view AI not just as a technological advancement but as a force that threatens their economic stability and social identity. This perception is likely to fuel negative attitudes towards AI tools like ChatGPT. In particular, we expect populism to predict both negative intentions towards ChatGPT, as populist‐leaning individuals may perceive it as an agent of job displacement, and heightened AI anxiety, driven by fears of economic insecurity and loss of control over technological progress. Discussions on social media, moreover, significantly shape public attitudes towards AI tools, primarily among those who have not directly interacted with the technology. Thus:

H4. Populism negatively predicts AI engagement intention.

H5. Populism positively predicts AI anxiety.

Higher religiosity has been associated with less favourable attitudes towards autonomous vehicles (Modliński et al. 2022). Similarly, Catholic students in the USA have been found to hold negative attitudes towards new transhuman technologies like genetic engineering, perceiving these advancements as threats to their religion (Lilley 2007). In the USA, religiosity has been significantly related to negative attitudes towards science and lower levels of scientific knowledge (McPhetres and Zuckerman 2018). Similarly, an analysis of data from 52 countries by Chan (2018) revealed that religiosity is reliably associated with lower trust in scientific authority, a pattern observed even in predominantly Muslim countries. Supporting this, a recent study in Turkey found that individuals with high Islamic religiosity, particularly those working in religious services, exhibit low levels of scientific attitudes (Memiş and Certel 2022). Building on these findings, we hypothesise that religiosity not only directly shapes individuals' responses to emerging technologies like AI but also exerts an indirect influence via populist attitudes (Figure 1). Specifically, individuals high in religiosity may be more prone to adopt populist worldviews characterised by anti‐elitism and distrust of scientific and technological authorities. These populist beliefs, in turn, may heighten AI‐related anxiety and foster more negative attitudes towards AI applications such as ChatGPT:

H6a. Populism serves as a mediator in the relationship between religiosity and AI engagement intention.

H6b. Populism serves as a mediator in the relationship between religiosity and AI anxiety.

FIGURE 1. Conceptual model.

People are concerned about AI and may distrust it, even while holding high hopes for the technology. Forty‐seven per cent of US workers fear losing their jobs in the coming years as a result of automation, including robotics and artificial intelligence (Frey and Osborne 2017). The four dimensions of AI anxiety identified by Wang and Wang (2022) are: ‘job replacement anxiety’, the fear of the negative impacts of AI on the workplace; ‘sociotechnical blindness’, the anxiety created by an incomplete understanding of how dependent AI is on humans; ‘AI configuration anxiety’, fear of human‐like AI; and ‘AI learning anxiety’, anxiety related to learning AI technologies. It is natural to expect that AI anxiety will predict negative intentions to use AI. Anxiety surrounding artificial intelligence often stems from concerns about job displacement, ethical implications such as bias and privacy, and uncertainties about AI's role in society (Frey and Osborne 2017; Wang and Wang 2022). These anxieties can lead individuals to hesitate or resist engaging with AI‐driven technologies like ChatGPT, fearing their potential consequences for employment and societal norms:

H7. AI anxiety negatively predicts AI engagement intention.

2. Methods

2.1. Procedure and Participants

The current study employed convenience sampling in 2024 in Türkiye. To increase the diversity of potential participants, three researchers simultaneously distributed the survey link through different social media channels (e.g., WhatsApp, Facebook). In addition, students from a Turkish university with which all three authors were affiliated were invited to the study through the university's mail portal. The aim and scope of the study, participants' freedom to withdraw at any moment, and the confidentiality and anonymity of their answers were explained to all participants. Respondents gave electronic consent at the very beginning of the survey. To increase participation, participants were offered a chance to win a voucher from an online retailer. Six hundred and ten participants agreed to take part in the study. After eliminating those who failed attention checks or did not complete all scales, 422 usable responses remained. Participants' ages ranged between 18 and 75 (M = 30.48; SD = 13.07). Two hundred and twenty‐two (52.6%) of the participants were female. The average monthly income was 54,675 Turkish Liras 1 (SD = 4658). The entire dataset was uploaded to the Mendeley data repository (DOI: 10.17632/k9pk3f35xw.2).

2.2. Measures

Religiosity was assessed using the Santa Clara Strength of Religious Faith Scale, a widely used measure in prior studies (Plante and Boccaccini 1997), with items such as ‘My faith is an important part of who I am as a person’. Science‐related populist attitudes were measured using four items from the SciPop Scale (Mede et al. 2021), including ‘Scientists are only after their own advantage’. Artificial intelligence anxiety was measured using the Turkish adaptation of the AI Anxiety Scale (AIAS) (Wang and Wang 2022; Akkaya et al. 2021), with sample items like ‘I am afraid that if I begin to use AI techniques/products, I will become dependent upon them and lose some of my reasoning skills’. AI engagement intention was assessed using items adapted from Venkatesh et al. (2012), such as ‘I am very likely to subscribe to ChatGPT‐4 in the future’. All items were adapted to the study context through a translation–back translation process conducted by bilingual experts to ensure semantic equivalence. Additionally, the questionnaire was pretested with a small sample to check for clarity and relevance before full‐scale data collection.

2.3. Analysis

The study's analytical strategy utilised PLS‐SEM. Bootstrapping was performed with 5000 resamples, applying a significance threshold of 0.05 for both direct and specific indirect effects. The model's validity and reliability were assessed using the following criteria: (1) multicollinearity (VIF < 10) for latent variables; (2) internal consistency reliability (Cronbach's alpha > 0.70); (3) discriminant validity (Heterotrait–Monotrait ratio < 0.85); (4) convergent validity (AVE > 0.50); and (5) unidimensionality (DG‐rho > 0.70) (Hair et al. 2022).
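As a minimal illustration of criteria (2) and (4) above, the following Python sketch computes Cronbach's alpha from an item‐response matrix and the AVE from standardised outer loadings. The toy responses and loadings are invented for illustration; they are not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

def ave(loadings: np.ndarray) -> float:
    """Average variance extracted: mean of squared standardised loadings."""
    return float(np.mean(np.asarray(loadings) ** 2))

# Toy 5-point responses for a 3-item construct (invented for illustration).
items = np.array([[5, 4, 5], [2, 2, 1], [4, 4, 4], [1, 2, 2], [3, 3, 4]],
                 dtype=float)
alpha = cronbach_alpha(items)                  # should exceed the 0.70 threshold
avg_ve = ave(np.array([0.84, 0.81, 0.77]))     # should exceed the 0.50 benchmark
```

Both statistics here clear the study's thresholds (alpha ≈ 0.94, AVE ≈ 0.65); PLS‑SEM software reports the same quantities per latent construct.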

2.4. Findings

Discriminant validity was assessed using the Heterotrait–Monotrait ratio (HTMT) and the Fornell–Larcker criterion (Henseler et al. 2015). As shown in Table 1, the square root of the AVE for each construct exceeded its correlations with all other constructs, supporting discriminant validity under the Fornell–Larcker criterion. The model's overall fit was satisfactory, with an SRMR of 0.073; SRMR values below 0.08 (or, more leniently, 0.10) are typically regarded as indicating good model fit.

TABLE 1.

Fornell–Larcker criterion for discriminant validity.

Variable 1 2 3 4
1. AI Anxiety 0.773
2. AI Engagement Intention −0.147 0.820
3. Religiosity 0.123 −0.210 0.826
4. Populism 0.355 −0.073 0.204 0.712
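The Fornell–Larcker check in Table 1 can be verified mechanically: each diagonal entry (the square root of the AVE) must exceed the absolute correlations in its row and column. A short Python sketch using the table's reported values:

```python
import numpy as np

# Lower-triangular matrix from Table 1: diagonal = sqrt(AVE),
# off-diagonal = inter-construct correlations.
fl = np.array([
    [0.773,  0.000,  0.000, 0.000],   # AI anxiety
    [-0.147, 0.820,  0.000, 0.000],   # AI engagement intention
    [0.123, -0.210,  0.826, 0.000],   # religiosity
    [0.355, -0.073,  0.204, 0.712],   # populism
])

def fornell_larcker_ok(fl: np.ndarray) -> bool:
    """True if every sqrt(AVE) exceeds that construct's correlations."""
    for i in range(fl.shape[0]):
        for j in range(i):
            if abs(fl[i, j]) >= min(fl[i, i], fl[j, j]):
                return False
    return True

print(fornell_larcker_ok(fl))  # True for the values reported in Table 1
```

The largest absolute correlation (0.355) is well below the smallest square root of AVE (0.712), so the criterion is met for every construct pair.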

Likewise, HTMT values were found to be less than the suggested threshold of 0.85, which represents good discriminant validity (Table 2). See Appendix A for outer loadings and VIF values. All loadings are above or near 0.7, and all VIFs are below 10, indicating good validity and no multicollinearity issues.

TABLE 2.

HTMT values for discriminant validity assessment.

Variable 1 2 3 4
1. AI anxiety
2. AI engagement intention 0.155
3. Religiosity 0.137 0.211
4. Populism 0.448 0.106 0.248
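For reference, the HTMT statistic compares the average correlation between items of different constructs with the average correlations among items of the same construct. A schematic implementation follows, using an invented item‐level correlation matrix rather than the study's data:

```python
import numpy as np

def htmt(R: np.ndarray, idx_a: list, idx_b: list) -> float:
    """Heterotrait-monotrait ratio for two constructs, given an item-level
    correlation matrix R and each construct's item indices."""
    # Mean absolute correlation between items of different constructs.
    hetero = np.mean([abs(R[i, j]) for i in idx_a for j in idx_b])
    # Mean absolute correlations among items of the same construct.
    mono_a = np.mean([abs(R[i, j]) for i in idx_a for j in idx_a if i < j])
    mono_b = np.mean([abs(R[i, j]) for i in idx_b for j in idx_b if i < j])
    return hetero / np.sqrt(mono_a * mono_b)

# Invented example: items 0-1 load on one construct, items 2-3 on another.
R = np.array([
    [1.0, 0.8, 0.4, 0.4],
    [0.8, 1.0, 0.4, 0.4],
    [0.4, 0.4, 1.0, 0.8],
    [0.4, 0.4, 0.8, 1.0],
])
print(htmt(R, [0, 1], [2, 3]))  # 0.5, below the 0.85 threshold
```

Values below 0.85 (as in Table 2) indicate that the two constructs are empirically distinguishable.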

All constructs demonstrated acceptable reliability and convergent validity. Cronbach's alpha values ranged from 0.679 (Populism) to 0.948 (Religiosity), indicating moderate to excellent internal consistency. Composite reliability (ρ a; ρ c) values were all above the 0.70 threshold, with the exception of Populism, which was slightly below but still acceptable given its ρ c value of 0.804. All average variance extracted (AVE) values exceeded the 0.50 benchmark, supporting convergent validity (Henseler et al. 2015) (Table 3).

TABLE 3.

Reliability and validity statistics.

Variable Cronbach's α ρ a ρ c AVE
AI anxiety 0.885 0.903 0.911 0.598
AI engagement intention 0.907 0.944 0.924 0.672
Religiosity 0.948 0.949 0.956 0.683
Populism 0.679 0.687 0.804 0.507

The bootstrap sample means indicate that the model explains 13.4% of the variance in AI anxiety (R² = 0.134, p < 0.001), 6.3% of the variance in intention to engage with AI (R² = 0.063, p = 0.038), and 4.3% of the variance in populism (R² = 0.043, p = 0.031). The model revealed several significant direct effects. Religiosity was negatively associated with AI engagement intention (β = −0.198, p < 0.001) (H1 accepted) and positively associated with populism (β = 0.204, p < 0.001) (H3 accepted). Populism significantly predicted AI anxiety (β = 0.345, p < 0.001) (H5 accepted), and AI anxiety was negatively related to AI engagement (β = −0.128, p = 0.015) (H7 accepted). However, religiosity did not significantly predict AI anxiety directly (β = 0.052, p = 0.269) (H2 rejected), and the direct effect of populism on AI engagement was also non‐significant (β = 0.013, p = 0.823) (H4 rejected) (Table 4).

TABLE 4.

Direct effects.

Effect (β) SE t p
Religiosity → AI engagement −0.198 0.052 3.82 < 0.001
Religiosity → populism 0.204 0.044 4.65 < 0.001
Populism → AI anxiety 0.345 0.046 7.51 < 0.001
AI anxiety → AI engagement −0.128 0.053 2.43 0.015
Religiosity → AI anxiety 0.052 0.047 1.11 0.269
Populism → AI engagement 0.013 0.057 0.22 0.823

Bootstrapping results (with 5000 resamples) revealed a significant indirect effect of religiosity on AI anxiety through populism (β = 0.070, p < 0.001), indicating that individuals with higher levels of religiosity tend to score higher in populism, which in turn increases AI‐related anxiety (H6b accepted). Additionally, populism significantly influenced AI engagement intention via AI anxiety (β = −0.044, p = 0.024), suggesting that higher populist attitudes are associated with greater AI anxiety, which then lowers the intention to engage with AI technologies. The serial mediation path religiosity → populism → AI anxiety → AI engagement was marginally significant (β = −0.009, p = 0.053), indicating a potential, though not definitive, sequential mediation. The other indirect effects were non‐significant (p > 0.36), providing no evidence for those pathways (H6a rejected) (Table 5).

TABLE 5.

Indirect effects.

Effect (β) SE t p
Religiosity → populism → AI anxiety 0.070 0.018 3.90 < 0.001
Populism → AI anxiety → AI engagement −0.044 0.019 2.26 0.024
Religiosity → AI anxiety → AI engagement −0.007 0.007 0.90 0.369
Religiosity → populism → AI engagement 0.003 0.012 0.21 0.831
Religiosity → populism → AI anxiety → AI engagement −0.009 0.005 1.94 0.053
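Although the indirect effects in Table 5 come from PLS‐SEM, the percentile‐bootstrap logic behind them can be sketched with plain regressions. The simulated data and path values below are illustrative assumptions only, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 422  # sample size matching the study

# Simulated standardised scores with the hypothesised serial structure
# (path coefficients are illustrative, not the paper's estimates).
relig = rng.standard_normal(n)
popul = 0.20 * relig + rng.standard_normal(n)
anxiety = 0.35 * popul + rng.standard_normal(n)
engage = -0.20 * relig - 0.13 * anxiety + rng.standard_normal(n)

def coefs(y, *xs):
    """OLS coefficients [intercept, b1, b2, ...]."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def serial_indirect(idx):
    """a*b*c product for religiosity -> populism -> anxiety -> engagement."""
    r, p, a, e = relig[idx], popul[idx], anxiety[idx], engage[idx]
    path_a = coefs(p, r)[1]        # religiosity -> populism
    path_b = coefs(a, r, p)[2]     # populism -> anxiety (religiosity controlled)
    path_c = coefs(e, r, p, a)[3]  # anxiety -> engagement (others controlled)
    return path_a * path_b * path_c

# 5000 bootstrap resamples, as in the study's analysis.
boot = np.array([serial_indirect(rng.integers(0, n, n)) for _ in range(5000)])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])  # percentile bootstrap CI
```

The indirect effect is deemed significant when the 95% CI excludes zero; PLS‐SEM software applies the same resampling idea to the full measurement‐and‐structural model rather than to separate OLS regressions.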

An additional independent samples t‐test was applied to the gender variable; male participants (M = 3.59) exhibited significantly higher AI engagement than female participants (M = 3.34; t = 2.71, p = 0.007). No significant differences were observed for the other variables (see Appendix B).
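The reported gender comparison can be reproduced from the summary statistics in Table B1 alone. The sketch below uses SciPy's summary‐statistics t‐test in its pooled‐variance (Student's) form, giving df = 420; the fractional df of 386.82 reported for religiosity in Table B1 suggests a Welch correction (equal_var=False) was applied where variances differed.

```python
from scipy import stats

# Means, SDs and group sizes for AI engagement, taken from Table B1.
t, p = stats.ttest_ind_from_stats(
    mean1=3.59, std1=0.98, nobs1=200,   # male participants
    mean2=3.34, std2=0.92, nobs2=222,   # female participants
    equal_var=True,                      # pooled variance, df = n1 + n2 - 2
)
print(round(t, 2), round(p, 3))  # approximately t = 2.70, p = 0.007
```

The small discrepancy with the reported t = 2.71 arises from rounding the means and SDs to two decimal places.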

3. Discussion

Our analysis uncovers several significant findings that contribute to the broader understanding of how psychological and cultural factors influence AI acceptance. Religiosity emerges as a significant predictor in our model, affecting both populism and AI engagement. This aligns with existing literature that often portrays religiosity as a factor associated with scepticism towards technological advancements and modern science (Scheufele et al. 2009). However, the finding contradicts Minton et al. (2022), in which religiosity positively predicted attitudes towards artificial intelligence technologies in a marketing context, mediated by faith in the unseen. The authors interpreted this as follows: since AI, like a spiritual entity, cannot be seen, religious people evaluated it favourably. However, they did not focus on a specific religious group, nor did they measure direct involvement with AI technology; they evaluated only attitudes towards a company or technology. In contrast, the present study directly measures usage and purchase intentions for an AI service within a specific religious context.

Contrary to expectations, populism did not significantly predict AI engagement. This finding suggests that while populist attitudes may influence perceptions of AI technologies, they may not directly deter individuals from using AI‐driven services like ChatGPT. Our study supports the hypothesis that populism mediates the relationship between religiosity and AI anxiety. Unexpectedly, AI anxiety did not mediate the link between religiosity and AI engagement. Furthermore, our analysis reveals that populism predicts AI engagement negatively through AI anxiety. This indirect effect underscores how populist attitudes, characterised by scepticism towards elites and technological advancements, hinder adoption intentions by increasing AI anxiety. Populist narratives often emphasise concerns about AI's potential misuse, job displacement, or societal control, thereby fostering apprehensions that deter individuals from embracing AI technologies like ChatGPT. By amplifying fears and uncertainties, populism indirectly diminishes the willingness to engage with AI services despite their potential benefits. The positive relationship between populism and AI anxiety identified in this study supports the notion that populist attitudes often coincide with distrust in elite‐driven technological progress (Levy 2018). Populism, characterised by an anti‐establishment stance, may fuel apprehensions about AI as an extension of technocratic control and surveillance. This relationship accentuates the potential for populist rhetoric to amplify public fears regarding AI, a dynamic that can significantly hinder the adoption of AI technologies.

Artificial intelligence has several practical applications, including disease diagnosis, environmental resource preservation, natural disaster prediction, education improvement, deterring violent crimes, and lowering workplace dangers. These advantages of AI could free up time for individuals to study, explore, and experiment, which could improve human creativity and quality of life. Despite these considerable benefits, the findings of our study emphasise that there is a considerable level of AI anxiety (in line with previous findings in a similar context, e.g., Aytaç 2022) fuelled by religiosity and populist attitudes, which can significantly hinder engagement with AI services like ChatGPT. Addressing AI anxiety has the potential not only to improve subscription intentions but also to enable individuals to reap the substantial benefits that AI technologies offer. By reducing anxiety and increasing AI literacy, we can encourage broader adoption and utilisation of AI's capabilities, ultimately contributing to improved quality of life and societal advancement.

Although it was not the central focus of this study, additional analyses revealed a significant gender difference in AI engagement, with male participants reporting higher levels than females. This finding suggests that gender‐related perceptions or technological socialisation might play a role in how individuals interact with AI tools. Future research may investigate the underlying psychological or social drivers of AI adoption and anxiety across different demographic or even cultural groups. Notably, this study's data were collected in 2024, during an early adoption phase when AI technologies were gradually becoming prevalent. Given the rapid evolution of AI engagement, future studies may identify substantially different effects of religiosity and populism as AI tools become commonplace, potentially accompanied by lower levels of AI anxiety.

Author Contributions

Zeynep Aytaç: writing – original draft, data curation, project administration. Erdal Yıldırım: conceptualisation, data curation, investigation, project administration. Muhammed Bilgehan Aytaç: methodology, software, supervision, writing – review and editing, investigation, validation, project administration.

Funding

The authors have nothing to report.

Disclosure

The findings are based on a non‐WEIRD sample and may not generalise to Western or culturally diverse populations. Additionally, the concept of AI engagement might vary across contexts, suggesting caution in applying these results broadly.

Ethics Statement

The authors have nothing to report.

Conflicts of Interest

The authors declare no conflicts of interest.

Acknowledgements

Generative artificial intelligence tools were only used for grammatical control of a few parts of the manuscript.

Appendix A. See Table A1

TABLE A1.

Items with outer loadings and VIF values.

Religiosity Outer loadings VIF value
R1 I look to my faith as a source of comfort. 0.875 3.712
R2 I look to my faith as providing meaning and purpose in my life. 0.855 3.537
R3 My faith is an important part of who I am as a person. 0.842 3.050
R4 My faith impacts many of my decisions. 0.843 2.919
R5 My religious faith is extremely important to me. 0.841 3.261
R6 I look to my faith as a source of inspiration. 0.834 3.124
R7 My relationship with God is extremely important to me. 0.813 2.849
R8 I pray daily. 0.815 2.758
R9 I consider myself active in my faith or church. 0.791 2.180
R10 I enjoy being around others who share my faith. 0.739 1.941
AI anxiety
AIA1 Learning to use AI techniques/products makes me anxious. 0.878 5.670
AIA2 Learning to use specific functions of an AI technique/product makes me anxious. 0.883 6.634
AIA3 Learning how an AI technique/product works makes me anxious. 0.877 4.820
AIA4 Learning to understand all of the special functions associated with an AI technique/product makes me anxious. 0.775 2.030
AIA5 I am afraid that an AI technique/product may replace humans. 0.676 2.047
AIA6 I am afraid that if I begin to use AI techniques/products I will become dependent upon them and lose some of my reasoning skills. 0.643 1.899
AIA7 I am afraid that AI techniques/products will replace someone's job. 0.628 2.251
Populism
P1 In case of doubt, one should rather trust the life experience of ordinary people than the estimations of scientists. 0.760 1.334
P2 We should rely more on common sense and less on scientific studies. 0.727 1.294
P3 Scientists are only after their own advantage. 0.706 1.372
P4 Scientists are in cahoots with politics and business. 0.653 1.336
AI engagement intention
AIE1 I am willing to subscribe to ChatGPT‐4 (a paid version of ChatGPT) in the future. 0.900 7.158
AIE2 I am planning to subscribe to ChatGPT‐4 (a paid version of ChatGPT) in the future. 0.898 7.375
AIE3 I am very likely to subscribe to ChatGPT‐4 in the future. 0.887 4.645
AIE4 I intend to use the ChatGPT in the future. 0.742 7.204
AIE5 I predict to use the ChatGPT in the future. 0.736 4.882
AIE6 I am considering using the ChatGPT in the future. 0.733 5.656

Appendix B. See Table B1

TABLE B1.

Comparison of study variables by gender (independent samples T‐test).

Variable Group N M SD t df p
Religiosity Male 200 3.71 1.11 −1.44 386.82 0.151
Female 222 3.85 0.91
AI Anxiety Male 200 3.36 1.43 −1.42 420 0.157
Female 222 3.54 1.28
Populism Male 200 2.23 0.74 −0.78 420 0.435
Female 222 2.29 0.76
AI Engagement Male 200 3.59 0.98 2.71 420 0.007
Female 222 3.34 0.92

Note: Bold indicates p < 0.01.

Endnotes

1

The minimum wage was 17,002 Turkish Liras (monthly) in Turkey at the time the data were collected.

Data Availability Statement

The data that support the findings of this study are openly available in Mendeley at https://data.mendeley.com/preview/k9pk3f35xw?a=2e066876‐b8ff‐41c9‐aacd‐e7d88f2350fd.

References

  1. Akkaya, B. , Özkan A., and Özkan H.. 2021. “Yapay Zeka Kaygı (YZK) ölçeği: Türkçeye uyarlama, geçerlik ve güvenirlik çalışması.” Alanya Akademik Bakış 5, no. 2: 1125–1146. [Google Scholar]
  2. Amormino, P. , O'Connell K., Vekaria K. M., Robertson E. L., Meena L. B., and Marsh A. A.. 2022. “Beliefs About Humanity, Not Higher Power, Predict Extraordinary Altruism.” Journal of Research in Personality 101: 104313. [Google Scholar]
  3. Aytaç, Z. 2022. “Üniversite Öğrencilerinin Yapay Zekâ Öğrenme ve İş Değiştirme Kaygılarının Otonom Araçlar ve Akıllı Evler Özelinde Değerlendirilmesi.” Üçüncü Sektör Sosyal Ekonomi Dergisi 57, no. 4: 2975–2989. [Google Scholar]
  4. Brzóska, P. , Nowak B., Piotrowski J., et al. 2022. “Different Paths to COVID19's Conspiratorial Beliefs for the Religious and Spiritual: The Roles of Analytic and Open‐Minded Thinking.” PsyArXiv. 10.31234/osf.io/zsv89. [DOI]
  5. Campbell, H. A. 2005. Exploring Religious Community Online. Lang. [Google Scholar]
  6. Chan, E. 2018. “Are the Religious Suspicious of Science? Investigating Religiosity, Religious Context, and Orientations Towards Science.” Public Understanding of Science 27, no. 8: 967–984. [DOI] [PubMed] [Google Scholar]
  7. Das, A. , Muschert G., Dutta M. J., et al. 2024. “AI Impacts, Concerns, and Perspectives in the Global South A Thought Leadership Round Table.” Социологическое обозрение 23, no. 4: 173–195. [Google Scholar]
  8. Frey, C. B. , and Osborne M. A.. 2017. “The Future of Employment: How Susceptible Are Jobs to Computerisation?” Technological Forecasting and Social Change 114: 254–280. [Google Scholar]
  9. Hair, J. F. , Hult G. T. M., Ringle C., and Sarstedt M.. 2022. A Primer on Partial Least Squares Structural Equation Modeling (PLS‐SEM). 3rd ed. Sage. [Google Scholar]
  10. Henseler, J. , Ringle C. M., and Sarstedt M.. 2015. “A New Criterion for Assessing Discriminant Validity in Variance‐Based Structural Equation Modeling.” Journal of the Academy of Marketing Science 43, no. 1: 115–135. [Google Scholar]
  11. Levy, F. 2018. “Computers and Populism: Artificial Intelligence, Jobs, and Politics in the Near Term.” Oxford Review of Economic Policy 34, no. 3: 393–417. [Google Scholar]
  12. Lilley, S. 2007. “Catholic Students' Fatalism in Anticipation of Transhuman Technologies.” International Journal of Interdisciplinary Social Sciences: Annual Review 2, no. 1: 313–319. [Google Scholar]
  13. McPhetres, J. , and Zuckerman M.. 2018. “Religiosity Predicts Negative Attitudes Towards Science and Lower Levels of Science Literacy.” PLoS One 13, no. 11: e0207125. [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Mede, N. G. , and Schäfer M. S.. 2020. “Science‐Related Populism: Conceptualizing Populist Demands Toward Science.” Public Understanding of Science 29, no. 5: 473–491. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Mede, N. G. , Schäfer M. S., and Füchslin T.. 2021. “The SciPop Scale for Measuring Science‐Related Populist Attitudes in Surveys: Development, Test, and Validation.” International Journal of Public Opinion Research 33, no. 2: 273–293. [Google Scholar]
  16. Memiş, M. , and Certel H.. 2022. “Dindarlık İle Bilimsel Tutum Arasındaki İlişki: Yükseköğrenim Mezunları Üzerine Bir Saha Araştırması” [The Relationship Between Religiosity and Scientific Attitude: A Field Study on Higher Education Graduates]. Jass Studies‐The Journal of Academic Social Science Studies 15, no. 93: 81–106. [Google Scholar]
  17. Merkley, E. 2020. “Anti‐Intellectualism, Populism, and Motivated Resistance to Expert Consensus.” Public Opinion Quarterly 84, no. 1: 24–48. [Google Scholar]
  18. Minton, E. A. , Kaplan B., and Cabano F. G.. 2022. “The Influence of Religiosity on Consumers' Evaluations of Brands Using Artificial Intelligence.” Psychology & Marketing 39, no. 11: 2055–2071. [Google Scholar]
  19. Modliński, A. , Gwiaździński E., and Karpińska‐Krakowiak M.. 2022. “The Effects of Religiosity and Gender on Attitudes and Trust Toward Autonomous Vehicles.” Journal of High Technology Management Research 33, no. 1: 100426. [Google Scholar]
  20. Nowak, B. , Brzóska P., Piotrowski J., et al. 2022. “Disentangling the Effects of Religiosity and Spirituality on Contaminated Mindware.” PsyArXiv. 10.31234/osf.io/fqav5. [DOI]
  21. Plante, T. G. , and Boccaccini M.. 1997. “The Santa Clara Strength of Religious Faith Questionnaire.” Pastoral Psychology 45: 375–387. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Preston, J. L. , Hotchin V., and Zarzeczna N.. 2025. “Humility and Harmony: The Influence of Intellectual Humility and Religiousness on Science‐Religion Views.” Personality and Individual Differences 240: 113136. [Google Scholar]
  23. Rathore, A. , and Mahesh G.. 2021. “Public Perception of Nanotechnology: A Contrast Between Developed and Developing Countries.” Technology in Society 67: 101751. [Google Scholar]
  24. Rust, R. T. , and Huang M.‐H.. 2014. “The Service Revolution and the Transformation of Marketing Science.” Marketing Science 33, no. 2: 206–221. [Google Scholar]
  25. Scheufele, D. A. , Corley E. A., Shih T. J., Dalrymple K. E., and Ho S. S.. 2009. “Religious Beliefs and Public Attitudes Toward Nanotechnology in Europe and the United States.” Nature Nanotechnology 4, no. 2: 91–94. [DOI] [PubMed] [Google Scholar]
  26. Vasilopoulos, P. , and Jost J. T.. 2020. “Psychological Similarities and Dissimilarities Between Left‐Wing and Right‐Wing Populists: Evidence From a Nationally Representative Survey in France.” Journal of Research in Personality 88: 104004. [Google Scholar]
  27. Venkatesh, V. , Thong J. Y., and Xu X.. 2012. “Consumer Acceptance and Use of Information Technology: Extending the Unified Theory of Acceptance and Use of Technology.” MIS Quarterly 36, no. 1: 157–178. [Google Scholar]
  28. Wang, Y. Y. , and Wang Y. S.. 2022. “Development and Validation of an Artificial Intelligence Anxiety Scale: An Initial Application in Predicting Motivated Learning Behavior.” Interactive Learning Environments 30, no. 4: 619–634. [Google Scholar]
  29. Yilmaz, I. , and Morieson N.. 2021. “A Systematic Literature Review of Populism, Religion and Emotions.” Religions 12, no. 4: 272. [Google Scholar]
  30. Żuk, P. , and Żuk P.. 2020. “Right‐Wing Populism in Poland and Anti‐Vaccine Myths on YouTube: Political and Cultural Threats to Public Health.” Global Public Health 15, no. 6: 790–804. [DOI] [PubMed] [Google Scholar]

Data Availability Statement

The data that support the findings of this study are openly available in Mendeley Data at https://data.mendeley.com/preview/k9pk3f35xw?a=2e066876-b8ff-41c9-aacd-e7d88f2350fd.


Articles from International Journal of Psychology are provided here courtesy of Wiley
