Abstract
Survey research is an essential component of epidemiological research to understand the health of older adults. However, conventional data collection methods have several limitations that may serve as barriers to the recruitment and retention of research participants, especially those from minority populations. With recent technological advancements, our research team developed an innovative data collection and management system to address linguistic and cultural barriers, data quality, data security, and data preparation issues. This platform has been utilized in the Population Study of Chinese Elderly in Chicago since 2011. Future use and improvement of this system will facilitate research among minority older adults and increase research participation and representativeness to ultimately understand and improve the health and well-being of diverse populations.
Keywords: aging, minority, survey methods, technology
Survey research is an essential tool for understanding population health among older adults.1 Traditional forms of survey data collection include paper-based, in-person, telephone, mail, computer-assisted, and/or internet surveys, but these approaches have practical limitations that influence the recruitment and retention of participants and the quality of collected data, especially among special populations such as older adults and minorities. Many of these methods rely on the availability and proper use of technology by the respondent (telephone, internet),2 additional data entry to facilitate analyses (mail-in and in-person paper-based surveys),3 and the literacy and comprehension skills of the respondent (internet and mail-in surveys).4,5 Older adults have less access to and facility with technology6 and may have physical impairments (ie, eyesight, hearing, motor skills) that increase barriers to participation in research.7 Minority populations also historically have less reliable access to technology and higher rates of limited English proficiency, which can affect response rates and introduce selection bias.8 Additionally, these methods often make it difficult for project coordinators to monitor data quality in real time, as data cannot be reviewed or aggregated in a timely manner.
In-person surveys are a preferred medium of data collection for special populations, such as minority older adults, as the interpersonal connection between interviewer and participant can mitigate some barriers to research participation, such as historical distrust in research, literacy and comprehension issues, and technological barriers.9-14 In-person surveys also reduce missing data, improve data quality, and are the best way to gather linguistically and culturally appropriate data. However, in-person survey strategies have significant limitations of their own, including reliance on paper-based collection, which is bulky, adds clutter, cannot support automatic skip patterns, cannot be monitored in real time, and may reduce data accuracy at the collection and input stages.1 For surveys involving interviewers and participants who may speak multiple languages, these barriers can be compounded by the need to have multiple dialects or languages available at any given time.
There is a great need to recruit and retain underserved populations, such as minority older adults, in survey research, but popular methods of data collection have not been conducive to addressing these needs. Therefore, the research team of the Population Study of Chinese Elderly in Chicago (PINE) developed a custom data collection system to facilitate in-person interviews of Chinese older adults, building on the existing advantages of in-person survey data collection while addressing the language, data security, environmental, monitoring, and analytic limitations of existing systems.15 This novel data collection platform has removed many barriers and automated many processes, allowing interviewers to form interpersonal connections with participants. With this technological platform, PINE has been able to successfully recruit and retain participants: the original cohort of 3157 Chinese older adults was first interviewed between 2011 and 2013 and has since had a follow-up rate of 89.4% at time 2 and 90.4% at time 3. This platform may have been especially useful in maintaining high rates of follow-up while examining sensitive issues, such as psychological well-being, family relationships, and life course violence in PINE.16-45 The aims of this article are to outline the development, implementation, operations, and future capacities of a novel data collection platform that has been used for data collection in PINE since 2011. Ultimately, this article could serve as a guide to aid population health researchers in the design and implementation of digital survey platforms to facilitate large-scale epidemiological studies with minority older adults.
PINE STUDY OVERVIEW
PINE is a population-based prospective cohort study of community-dwelling Chinese older adults, aged 60 years and older, in the greater Chicago, IL, area. PINE represents the largest epidemiological study of Chinese older adults in Western countries. The baseline interviews were conducted between July 2011 and June 2013 with an original cohort of 3157 Chinese older adults. From the original cohort, 89.4% completed the first follow-up interviews between 2013 and 2015, and 90.4% completed the second follow-up interviews between 2015 and 2017. In-home interviews were conducted by trained multilingual and bicultural research assistants in the participant's preferred language or Chinese dialect, including English, Mandarin, Cantonese, Toishanese, and Teochew.
Survey Platform Overview and Core Functions
Survey Platform Overview
In the sections below, each of the core functions is described: (1) the data collection web application; (2) the report site; and (3) security features. An overview of the technical details of the core functions is included in Table 1.
Table 1.
Core System Functions and Technical Details
Platform functions | Technical details |
---|---|
Data platform technical details | The data platform is an interoperable mix of multiple technologies, frameworks, and programming languages, such as J2EE/JAVA, MySQL, JSP, HTML, CSS, JavaScript, Bootstrap, Hibernate, Spring Framework v4.0, and Struts Framework v2.0. The application is designed using the MVC design pattern and architecture, where the “model” portion represents the application database and respective entities (details in Figure 2). |
Core function 1: Survey site: data collection | The survey site is written on the JAVA Enterprise Edition platform using servlets, JSP, and EJB technologies. The application is deployed in Apache Tomcat Server. We have merged the Struts v2.0 framework and the Spring Framework v4.0; the Struts Framework supports centralized file-based configuration, form bean creation, and validation, while the Spring Framework is used for IOC and loose coupling to make the application extensible to new functionalities and requirements. We also encapsulated Spring MVC internationalization (i18n) and localization (L10n). Spring internationalization allows the user to choose the language and detects the user's language based on the local setting. Internationalization (i18n) is achieved through the use of Spring interceptors, locale resolvers, and resource bundles (an illustrative configuration sketch follows Table 1). Currently, the web application can be used in English, traditional Chinese, and simplified Chinese. Administrator-generated translations are mapped in the GlobalMessages.Properties file inside the web application. |
Core function 2: Report site: quality monitoring | The report site has been designed to meet the business intelligence (BI), analytics, and reporting requirements of study/survey data. It is developed using the JavaServer Pages technology (standard: JSR 245) and has been deployed in the Apache Tomcat servlet container. |
Core function 3: Security | We have custom-built and configured the run-time environment on the MacOS Server to run the database and data collection web applications. |
Abbreviations: CSS, cascading style sheets; EJB, Enterprise Java Beans; HTML, Hypertext Markup Language; IOC, inversion of control; J2EE, Java 2 Enterprise Edition; JSP, Java Server Pages; JSR, Java Specification Request; MVC, model view controller.
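To make the internationalization approach described in Table 1 concrete, the following is a minimal sketch of a Spring MVC configuration that wires a resource bundle, a locale resolver, and a locale-change interceptor. It is illustrative only and assumes Spring 4-style Java configuration; the bundle basename, request parameter name, and class names are assumptions rather than the PINE source code.

```java
import java.util.Locale;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.support.ResourceBundleMessageSource;
import org.springframework.web.servlet.LocaleResolver;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;
import org.springframework.web.servlet.i18n.LocaleChangeInterceptor;
import org.springframework.web.servlet.i18n.SessionLocaleResolver;

@Configuration
@EnableWebMvc
public class I18nConfig extends WebMvcConfigurerAdapter {

    // Message bundles, eg, GlobalMessages.properties (English),
    // GlobalMessages_zh_TW.properties (traditional Chinese),
    // GlobalMessages_zh_CN.properties (simplified Chinese).
    @Bean
    public ResourceBundleMessageSource messageSource() {
        ResourceBundleMessageSource source = new ResourceBundleMessageSource();
        source.setBasename("GlobalMessages");
        source.setDefaultEncoding("UTF-8");
        return source;
    }

    // Default to English; remember the interviewer's choice for the session.
    @Bean
    public LocaleResolver localeResolver() {
        SessionLocaleResolver resolver = new SessionLocaleResolver();
        resolver.setDefaultLocale(Locale.ENGLISH);
        return resolver;
    }

    // A request parameter such as ?lang=zh_CN switches the interface language.
    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        LocaleChangeInterceptor interceptor = new LocaleChangeInterceptor();
        interceptor.setParamName("lang");
        registry.addInterceptor(interceptor);
    }
}
```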
The data collection web application comprises five tiers: (1) staff tier; (2) web tier/presentation tier; (3) business tier; (4) data access tier; and (5) data tier (Figure 1). The staff tier consists of wireless devices, such as iPads/iPhones, Android phones, desktops, or laptop computers. The JAVA EE Server makes up the web tier and business tier; it is also referred to as the study operation tier. Data are transmitted through wireless services to a secure server in real time. The data access tier is composed of Hibernate ORM, whereas the data tier includes the database server. The database is the fifth tier of the application. The application uses the MySQL 5.7.24 Community Edition database, which is an open-source, highly scalable, and reliable relational database system. The MySQL database management system is written in C and C++ and is compatible with various operating systems, such as Linux, Windows, and MacOS.
Figure 1.
Five-tier architecture. Abbreviations: API, application program interface; JSP, Java Server Pages; ORM, Object-relational mapping; UI, User Interface
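As an illustration of the data access and data tiers shown in Figure 1, the following is a minimal sketch of a Hibernate/JPA entity that maps one survey answer to a MySQL table. The entity, table, and column names are hypothetical and are not drawn from the PINE schema.

```java
import java.util.Date;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

@Entity
@Table(name = "interview_answer")
public class InterviewAnswer {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // The unique identifier assigned to each interview.
    @Column(name = "interview_id", nullable = false)
    private Long interviewId;

    @Column(name = "question_code", nullable = false, length = 32)
    private String questionCode;

    @Column(name = "answer_value")
    private String answerValue;

    // Time stamp recorded when the answer is saved, used for quality monitoring.
    @Temporal(TemporalType.TIMESTAMP)
    @Column(name = "recorded_at", nullable = false)
    private Date recordedAt;

    // Getters and setters omitted for brevity.
}
```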
Once data have been entered into the database, multiple features are designed and architected to optimize quality monitoring and data security. Additionally, enhanced security measures are implemented to ensure safety of collected data and participant information.
Survey Site: Data Collection
The survey site can be accessed via the web application (a technical description of the survey site can be found in Table 1). It is divided into two sections based on user roles and responsibilities: administrator and interviewer. The administrator has the ability to create staff accounts, participant accounts, instruments, questions, answer choices, data banks, and interviews. The administrator has access to all interviews, assigned staff, and the participant list. Additionally, the administrator has the ability to monitor the progress of the interviews and assign interviewers to any given interview based on language preference and location. Administrator accounts can edit all participants' information, contact information, and deceased status. After logging into the site, the interviewer sees the list of interviews he or she has to complete and the list of participants assigned to any given survey. The interviewer has the ability to both start a new interview and resume a previous ongoing interview. The interviewer also can edit assigned participants' information, edit contact information, and report deceased information.
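A simple sketch of this division of responsibilities is shown below; the enum and permission methods are hypothetical illustrations, not the platform's actual API.

```java
// Two user roles and their coarse-grained permissions (illustrative only).
public enum UserRole {
    ADMINISTRATOR,
    INTERVIEWER;

    // Only administrators create accounts, instruments, questions, and assignments.
    public boolean canManageStudy() {
        return this == ADMINISTRATOR;
    }

    // Both roles can start, resume, and edit their assigned interviews.
    public boolean canConductAssignedInterviews() {
        return true;
    }
}
```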
Report Site: Quality Monitoring
The report site communicates with the application database in real time to facilitate analysis and to generate real-time reports.
The primary components of the report site are study management, study reports, and study assignment, which are further divided into the following subcategories: coding tool; assignments; eligibility; status; scheduling; contact; deceased report; completion reports; and incompletion reports. As with the survey site, the report site has two user roles. The administrator can assign and reassign interviews, code measurements, generate progress reports, and view incompletion and completion reports. The incompletion report shows the list of participants who did not complete a particular interview, the assigned staff, and the progress status, whereas the completion report shows the list of participants who completed the interviews and the assigned staff. The interviewer can view the incompletion report, completion report, contact list, and wrong address report.
Security
The database, web services, and web application are enclosed in a private virtual subnet whose internet protocol addresses are hidden from the outside world. The different layers of infrastructure built around the database and web applications help us detect and respond quickly to threats before an actual breach can occur. Passwords are hashed using PBKDF2 (password-based key derivation function 2), making it computationally infeasible for perpetrators to recover them, even with direct access to the application database. Such security measures ensure the safety of participant information and collected/archived data.
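For readers unfamiliar with PBKDF2, the following is a brief sketch of password hashing with the standard Java cryptography API. The iteration count, key length, and salt size shown are illustrative assumptions, not the parameters used by the study platform.

```java
import java.security.SecureRandom;
import java.security.spec.KeySpec;
import java.util.Base64;

import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public final class PasswordHasher {

    private static final int ITERATIONS = 100_000;   // illustrative work factor
    private static final int KEY_LENGTH_BITS = 256;  // illustrative derived-key length
    private static final int SALT_BYTES = 16;        // illustrative salt size

    // Returns "salt:hash" in Base64; only this derived value is stored, never the password.
    public static String hash(char[] password) throws Exception {
        byte[] salt = new byte[SALT_BYTES];
        new SecureRandom().nextBytes(salt);
        KeySpec spec = new PBEKeySpec(password, salt, ITERATIONS, KEY_LENGTH_BITS);
        SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        byte[] derived = factory.generateSecret(spec).getEncoded();
        return Base64.getEncoder().encodeToString(salt) + ":"
                + Base64.getEncoder().encodeToString(derived);
    }
}
```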
Fieldwork Operations
The modular architecture of the application facilitates the development of a wide variety of multilingual surveys. Based on individual study requirements, new instrument categories can be designed and entered into the system. Automatic skip patterns can be programmed into the system if needed for particular measurements. For consecutive waves, the instruments, questions, and choices can be copied over using SQL scripts by the database administrator. Similarly, questionnaire data can be copied over for different cohorts if they fall under the same interview category. These functions ensure streamlined operations across multiple studies and data collection for longitudinal studies. A unique identifier is assigned to each interview and used throughout the interview lifecycle. Standard naming conventions are followed when naming interviews, such as “PINE baseline interview,” to assist in longitudinal study design.
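A minimal sketch of how one wave's questions might be copied to the next with an INSERT ... SELECT statement, executed here through JDBC, is shown below. The table names, column names, connection details, and identifiers are hypothetical and do not reflect the PINE database schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CopyInstrumentQuestions {
    public static void main(String[] args) throws Exception {
        // Copy all questions from a baseline interview to a follow-up interview.
        String sql =
            "INSERT INTO question (interview_id, question_code, question_text, display_order) "
          + "SELECT ?, question_code, question_text, display_order "
          + "FROM question WHERE interview_id = ?";
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/survey", "user", "password");
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setLong(1, 2L); // hypothetical follow-up interview identifier
            stmt.setLong(2, 1L); // hypothetical baseline interview identifier
            stmt.executeUpdate();
        }
    }
}
```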
Once the interview is designed, a group of staff is allocated based on the physical location and language preferences of participants. A group of participants, filtered based on study criteria and age requirements, is assigned to each staff member for interviews and follow-ups. Assigned staff can contact the participants in person, by telephone, or by letter, with the requirement that each contact be recorded. The following seven purposes of contact are listed in the application: (1) request participation in a study activity; (2) schedule or confirm a study appointment; (3) collect study data; (4) provide study information/answer family study questions; (5) contact requested or initiated by the subject; (6) appointment cancellation; and (7) gather the subject's contact information from family members/trusted people. Such information on the contact forms enables interviewers to customize recruitment strategies for each participant based on his/her previous record(s).
Administration is handled by the staff, project administrator, and database/application administrator at different levels of the interview process. Data are validated each time they enter the system through the application framework. The staff interactively engage participants during the survey/interview process to make sure that the participant understands all the questions and choices properly. An illustration of the interviewer interface is provided in Figure 2. Cross follow-ups are implemented to increase reliability and validate interview data.
Figure 2.
Interviewer interface during fieldwork. (A) Interview and participant selection home page. (B) Interview and participant selection home page in Chinese characters. (C) Interview table of contents and start page. (D) Sample question-and-answer selection interface, displaying a trigger question and single- and multiple-choice answers.
The submission of an interview can happen on the same day the interview started or later. Sometimes, however, interviews span multiple days, depending on the participant's availability. Intermediate stages of interviews are captured in the application and database, with options to schedule follow-ups and resume interviews from the previous question. Interview submission falls under the fourth and fifth phases of the interview lifecycle. The staff member reviews the questions and answers and validates the content before proceeding to the final submission. After the final submission, the interviews are locked, meaning further edits are disabled. This lockdown maintains the integrity of collected data.
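The post-submission lockdown can be pictured with the brief sketch below, in which any attempt to edit a finalized interview is rejected. The class, field, and method names are hypothetical.

```java
// Illustrative lockdown: once an interview is finalized, edits are refused.
public class Interview {

    public enum Status { IN_PROGRESS, SUBMITTED }

    private Status status = Status.IN_PROGRESS;

    public void saveAnswer(String questionCode, String answerValue) {
        if (status == Status.SUBMITTED) {
            throw new IllegalStateException("Interview is locked after final submission.");
        }
        // ... persist the answer through the data access tier ...
    }

    // Called only after the staff member reviews and validates the content.
    public void finalSubmit() {
        status = Status.SUBMITTED;
    }
}
```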
Quality Monitoring
The quality of the survey data is a primary focus for all team members involved directly or indirectly in the study. Quality monitoring is conducted at different levels of the application and interview life cycle. The survey application validates the data automatically through different checks implemented using the application framework, custom functions, and regular expressions. For example, the telephone number field should contain only the digits 0 through 9, the maximum length of the field is 10 digits, and no special characters or letters are allowed. Similarly, the email address, date of birth, and other fields are validated through the respective validation interceptors. Given that the data are automatically structured and cleanly inserted into the database tables, the study coordinators are able to request deidentified, randomized data extractions of surveys conducted over a desired period of time to review answers, proper completion of interviews, and pilot testing of the surveys. Such regular data monitoring activities allow the study team to identify data quality issues in a timely manner and ultimately enhance the quality of collected data.
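As an example of this kind of automatic field validation, the following sketch checks a telephone number with a regular expression allowing only digits, with a maximum length of 10. The class and method names are illustrative rather than the application's actual validators.

```java
import java.util.regex.Pattern;

// Illustrative automatic field validation for a telephone number field:
// digits 0-9 only, at most 10 of them.
public final class FieldValidators {

    private static final Pattern PHONE_DIGITS = Pattern.compile("^[0-9]{1,10}$");

    public static boolean isValidPhoneNumber(String input) {
        return input != null && PHONE_DIGITS.matcher(input).matches();
    }
}
```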
DISCUSSION
In response to the limitations of existing forms of data collection,1 we have developed a novel, adaptable, digital platform to conduct in-person interviews that is especially helpful in addressing barriers to research among minority older adults. Namely, this multilingual data collection, monitoring, and storage system helps to overcome issues related to linguistic and cultural differences, historical distrust in medical research, and access to participants. Regarding other barriers to conducting epidemiological survey-based research, our platform can additionally address issues related to data quality and monitoring, security, and streamlined data set preparation.
This innovative platform addresses multiple challenges related to collecting data in minority older adult populations (Table 2). All survey questions are accessible in programmed, translated languages during field operations, so interviewers can easily switch to the needed language, reducing the need to provide impromptu translations or carry multiple hard copies of the survey in different languages. Minority older adult populations in particular have high proportions of individuals who do not speak English, and surveys administered only in English cannot represent the experiences of these populations.11-14 The data collection platform can be accessed via touch-screen tablet devices, which are less bulky and cumbersome for interviewers to navigate in the field. The handheld nature of these devices allows for greater eye contact between interviewer and participant, which, in turn, can improve interpersonal interviewing and data quality, as the interviewer and participant can focus more on “having a conversation” than on data collection and extraction. Furthermore, the platform ensures greater uniformity of collected data. Last, interviewers can recruit difficult-to-reach or busy individuals for study participation and interview them immediately while maintaining data quality and security, which reduces the potential burden of participation. Combined, these features have allowed our research team to collect high-quality survey data with an understudied and underserved population of minority older adults and have contributed to high follow-up interview rates.
Table 2.
Common Research Challenges and the Advantages of the Novel Technology-Assisted Platform
Barriers to research among minority older adults | Solutions based on the technology-assisted survey platform |
---|---|
Quality of collected data | Real-time data quality monitoring and programmed automatic data validation |
Limited access and facility with technology among minority older adults | Wireless devices bring the survey to Chinese older adults, circumventing the need for internet or computer access and literacy |
Limited literacy and comprehension skills of older respondents | Detailed and easy-to-understand interviewer instructions that are built in to facilitate comprehension of survey items while maintaining consistency in data collection |
Missing data | Programmed automatic data validation before submission |
Linguistic and cultural barriers | Multilingual interface that was designed by a multilingual and bicultural research team |
Analytic limitations common to epidemiological datasets | Streamlined data set preparation |
Time-consuming traditional paper survey administration | Built-in skip pattern questions, drop-down menus, and optional categories to increase efficiency and reduce processing errors |
Data security | Enhanced security features developed by the research team |
For survey research, our data collection platform can also address concerns regarding data quality and monitoring, security, and streamlined data set preparation for analysis purposes. The platform has automatic built-in skip patterns to reduce interviewer error in the field. All collected data are immediately sent to our secure database server, and no data are stored on individual tablet devices, ensuring that data cannot be lost between collection and input into the database. In addition to the survey content, time stamps are recorded, so study coordinators can determine how long an individual survey took and whether questions were asked in order. These features allow study coordinators to systematically review the quality of the data and identify needs for supplemental trainings. Regarding data set preparation, variable naming and coding can be easily recorded in the database and applied to all surveys. When this feature is combined with the automatic storage of data in the database, analysis-ready data sets can be created shortly after all data have been collected. In short, this platform streamlines study operations from collection to analysis.
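To illustrate how a built-in skip pattern might be represented, the sketch below encodes a rule that skips dependent questions when a trigger question receives a given answer. The question codes and the class itself are hypothetical examples, not the platform's actual implementation.

```java
import java.util.List;

// Illustrative skip-pattern rule: dependent questions are skipped when the
// trigger question is answered with the triggering value.
public class SkipRule {

    private final String triggerQuestion;        // eg, "SMOKE_EVER"
    private final String triggerAnswer;          // eg, "NO"
    private final List<String> skippedQuestions; // eg, ["SMOKE_YEARS", "SMOKE_PER_DAY"]

    public SkipRule(String triggerQuestion, String triggerAnswer, List<String> skippedQuestions) {
        this.triggerQuestion = triggerQuestion;
        this.triggerAnswer = triggerAnswer;
        this.skippedQuestions = skippedQuestions;
    }

    // Returns true when "question" should be skipped given the trigger question's answer.
    public boolean shouldSkip(String question, String answerToTriggerQuestion) {
        return skippedQuestions.contains(question)
                && triggerAnswer.equalsIgnoreCase(answerToTriggerQuestion);
    }
}
```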
Several limitations to this platform warrant discussion. First, in its current form, surveys cannot be conducted off-line. In urban areas where many minority older adults live, cellular data connections and Wi-Fi are reliable and readily available, so there is less of a need for the PINE team to collect data off-line. However, given the structure of the data collection system, an off-line application could be readily adapted while ensuring data security and quality similar to the internet-connected version. Second, while many operations can be conducted by the study coordinators and staff, the database does need to be maintained by a trained database manager. Given the utilization of an open-source platform, technical updates may be sporadic, although the open-source platform allows for greater flexibility. Further, the application frameworks used in development, such as Spring and Struts, require frequent security patching, which increases application maintenance work.
Nevertheless, this survey data collection platform has great implications for conducting epidemiological research with minority and vulnerable populations and can be adapted for research in many environments. There are significant advantages to this platform, which can improve data collection speed, quality, monitoring, security, and ease of data set preparation. Furthermore, ease of use for interviewers and the ability to program surveys in multiple languages remove significant barriers to reaching, recruiting, and interviewing minority populations in the United States by alleviating physical barriers to data collection and allowing interviewers to focus on interviewing participants more conversationally. Moving forward, this system can be adapted for off-line use and for many additional languages; the flexibility of this system can significantly reduce the time and effort spent on study operations without sacrificing quality. Ultimately, we plan for this platform to be available to external researchers to assist in the collection, storage, and analytic processes of related projects.
CONCLUSION
Survey research is a cornerstone of epidemiological research. With advances in technology, data collection systems can be adapted and improved to address barriers to research facing minority and vulnerable populations. By alleviating some of the barriers to participation in research, researchers can leverage innovative data collection platforms and may be able to improve recruitment and retention of minority older adults and representativeness of study samples, thus leading to a better understanding of how to improve health and well-being of minority aging populations.
ACKNOWLEDGMENTS
Financial Disclosure: Dr Dong was sponsored by P30AG059304, R01AG042318, R01MD006173, R01NR014846, and R34MH100443.
Conflict of Interest: The authors have no conflicts to report.
Sponsor’s Role: The sponsors played no role in the preparation of the manuscript.
REFERENCES
1. Wright JD, Marsden PV. Survey research and social science: history, current practice, and future prospects. Handb Surv Res. 2010:3–26.
2. Kempf AM, Remington PL. New challenges for telephone survey research in the twenty-first century. Annu Rev Public Health. 2007;28:113–126.
3. Schmidt WC. World-wide web survey research: benefits, potential problems, and solutions. Behav Res Methods Instrum Comput. 1997;29(2):274–279.
4. Simonds VW, Garroutte EM, Buchwald D. Health literacy and informed consent materials: designed for documentation, not comprehension of health research. J Health Commun. 2017;22(8):682–691.
5. Kirkman-Liff B, Mondragón D. Language of interview: relevance for research of southwest Hispanics. Am J Public Health. 1991;81(11):1399–1404.
6. Gatto SL, Tak SH. Computer, internet, and e-mail use among older adults: benefits and barriers. Educ Gerontol. 2008;34(9):800–811.
7. Freire AP, Goularte R, de Mattos Fortes RP. Techniques for developing more accessible web applications: a survey towards a process classification. Paper presented at: Proceedings of the 25th Annual ACM International Conference on Design of Communication; October 2007.
8. Yancey AK, Ortega AN, Kumanyika SK. Effective recruitment and retention of minority research participants. Annu Rev Public Health. 2006;27:1–28.
9. Christopher S, Watts V, McCormick AKHG, Young S. Building and maintaining trust in a community-based participatory research partnership. Am J Public Health. 2008;98(8):1398–1406.
10. Au C. Cultural factors in preventive care: Asian-Americans. Prim Care. 2002;29(3):495–502, viii.
11. McGraw SA, McKinlay JB, Crawford SA, Costa LA, Cohen DL. Health survey methods with minority populations: some lessons from recent experience. Ethn Dis. 1992;2(3):273–287.
12. Aday LA, Chiu GY, Andersen R. Methodological issues in health care surveys of the Spanish heritage population. Am J Public Health. 1980;70(4):367–374.
13. Hunt SM, Bhopal R. Self report in clinical and epidemiological studies with non-English speakers: the challenge of language and culture. J Epidemiol Community Health. 2004;58(7):618–622.
14. Frayne SM, Burns RB, Hardt EJ, Rosen AK, Moskowitz MA. The exclusion of non-English-speaking persons from research. J Gen Intern Med. 1996;11(1):39–43.
15. Dong X, Wong E, Simon M. Study design and implementation of the PINE study. J Aging Health. 2014;26:1085–1099.
16. Chen Y, Liang Y, Zhang W, Crawford JC, Sakel KL, Dong X. Perceived stress and cognitive decline in Chinese-American older adults. J Am Geriatr Soc. 2019;67(Suppl. 3):S519–S524.
17. Dong X, Kong D, Mendhe D, Bergren SM. Leveraging technology to improve health disparity research: trilingual data collection using tablets. J Am Geriatr Soc. 2019;67(Suppl. 3):S479–S485.
18. Guo M, Kim S, Dong X. Sense of filial obligation and caregiving burdens among Chinese immigrants in the United States. J Am Geriatr Soc. 2019;67(Suppl. 3):S564–S570.
19. Jiang L, Sun F, Zhang W, Wu B, Dong X. Health service use among Chinese American older adults: is there a somatization effect? J Am Geriatr Soc. 2019;67(Suppl. 3):S584–S589.
20. Kong D, Solomon P, Dong X. Comorbid depressive symptoms and chronic medical conditions among US Chinese older adults. J Am Geriatr Soc. 2019;67(Suppl. 3):S545–S550.
21. Kong D, Solomon P, Dong X. Depressive symptoms and onset of functional disability over 2 years: a prospective cohort study. J Am Geriatr Soc. 2019;67(Suppl. 3):S538–S544.
22. Lai DWL, Li J, Lee VWP, Dong X. Environmental factors associated with Chinese older immigrants' social engagement. J Am Geriatr Soc. 2019;67(Suppl. 3):S571–S576.
23. Li C-C, Matthews AK, Dong X. The influence of smoking status on the health profiles of older Chinese American men. J Am Geriatr Soc. 2019;67(Suppl. 3):S577–S583.
24. Li M, Guo M, Stensland M, Silverstein M, Dong X. Typology of family relationship and elder mistreatment in a US Chinese population. J Am Geriatr Soc. 2019;67(Suppl. 3):S493–S498.
25. Li M, Liang Y, Dong X. Different definitions of elder mistreatment and mortality: a prospective cohort study from 2011 to 2017. J Am Geriatr Soc. 2019;67(Suppl. 3):S506–S512.
26. Mao W, Chen Y, Wu B, et al. Perceived stress, social support, and dry mouth among US older Chinese adults. J Am Geriatr Soc. 2019;67(Suppl. 3):S551–S556.
27. Petrovsky D, Wu B, Mao W, Dong X. Oral health symptoms and cognitive function among US community-dwelling Chinese older adults. J Am Geriatr Soc. 2019;67(Suppl. 3):S532–S537.
28. Tang F, Chi I, Dong X. Sex differences in the prevalence and incidence of cognitive impairment: does immigration matter? J Am Geriatr Soc. 2019;67(Suppl. 3):S513–S518.
29. Wang B, Dong X. Life course violence: child maltreatment, IPV and elder abuse phenotypes in a US Chinese population. J Am Geriatr Soc. 2019;67(Suppl. 3):S486–S492.
30. Zhang W, Tang F, Chen Y, Silverstein M, Liu S, Dong X. Education, activity engagement, and cognitive function in U.S. Chinese older adults. J Am Geriatr Soc. 2019;67(Suppl. 3):S525–S531.
31. Zheng S, Li M, Kong D, Dong X. Sources and variations in social support and risk for elder mistreatment in a US Chinese population. J Am Geriatr Soc. 2019;67(Suppl. 3):S499–S505.
32. Chen R, Simon MA, Dong X. Gender differences in depressive symptoms in US Chinese older adults. AIMS Med Sci. 2014;1(1):13–27.
33. Chen R, Simon MA, Chang E-S, Zhen Y, Dong X. The perception of social support among US Chinese older adults: findings from the PINE study. J Aging Health. 2014;26(7):1137–1154.
34. Dong X. Do the definitions of elder mistreatment subtypes matter? Findings from the PINE study. J Gerontol A Biol Sci Med Sci. 2014;69(suppl 2):S68–S75.
35. Dong X, Chen R, Chang E, et al. The prevalence of suicide attempts among community-dwelling US Chinese older adults: findings from the PINE study. Ethn Inequal Health Soc Care. 2014;7:23–35.
36. Dong X, Chen R, Fulmer T, Simon MA. Prevalence and correlates of elder mistreatment in a community-dwelling population of US Chinese older adults. J Aging Health. 2014;26(7):1209–1224.
37. Dong X, Chen R, Li C. Understanding depressive symptoms among community-dwelling Chinese older adults in the greater Chicago area. J Aging Health. 2014;26:1155–1171.
38. Dong X, Chen R, Simon MA. Anxiety among community-dwelling US Chinese older adults. J Gerontol A Biol Sci Med Sci. 2014;69(suppl 2):S61–S67.
39. Dong X, Chen R, Wong E, Simon MA. Suicidal ideation in an older US Chinese population. J Aging Health. 2014;26(7):1189–1208.
40. Dong X, Chen R, Roepke-Buehler SK. Sociodemographic and socioeconomic characteristics associated with specific subtypes of elder mistreatment in a community-dwelling population of Chinese older adults. AIMS Med Sci. 2014;1(2):103–124.
41. Dong X, Chang E-S, Wong E, Wong B, Simon MA. Association of depressive symptomatology and elder mistreatment in a US Chinese population: findings from a community-based participatory research study. J Agress Maltreat Trauma. 2014;23(1):81–98.
42. Dong X. Associations between the differential definitions of elder mistreatment and suicidal ideation outcomes in US Chinese older adults: do the definitions matter? J Gerontol A Biol Sci Med Sci. 2017;72(suppl 1):S82–S89.
43. Chang E-S, Beck T, Simon MA, Dong X. A psychometric assessment of the psychological and social well-being indicators in the PINE study. J Aging Health. 2014;26(7):1116–1136.
44. Dong X, Zhang M, Simon MA. The expectation and perceived receipt of filial piety among Chinese older adults in the greater Chicago area. J Aging Health. 2014;26(7):1225–1247.
45. Simon MA, Chang E-S, Zhang M, Ruan J, Dong X. The prevalence of loneliness among US Chinese older adults. J Aging Health. 2014;26(7):1172–1188.