BigSurv20 program
Ethical challenges in the era of big data
Moderator: John Finamore (jfinamor@nsf.gov)
Friday 13th November, 11:45 - 13:15 (ET, GMT-5) | 8:45 - 10:15 (PT, GMT-8) | 17:45 - 19:15 (CET, GMT+1)
Consenting respondents in the age of digital media: Considerations for survey researchers
Dr Amelia Burke-Garcia (NORC at the University of Chicago) - Presenting Author
This century has seen remarkable social and economic shifts, among which technological advances are some of the most powerful. An estimated 4 billion people currently use the Internet, and these users create 2.5 quintillion bytes of data each day. Such advances are drastically increasing the speed and scale of communication worldwide.
This plethora of digital data can provide insight into public opinion on a variety of topics and thereby inform policy, practice, and innovation. As such, these data are having a substantial impact on the field of survey research. In 2015, the American Association for Public Opinion Research (AAPOR) released its report on Big Data, stating that, “The change in the nature of the new types of data, their availability, the way in which they are collected, and disseminated are fundamental. The change constitutes a paradigm shift for survey research.”
While digital data and survey research both have a lot to offer, very little work has examined the ways in which they can be used together to provide richer datasets. The use of digital data in survey research is therefore an important area for investigation. What is lacking, however, are protocols for doing this kind of work credibly and ethically, something that AAPOR’s report acknowledged back in 2015.
Given how quickly the digital space is changing, the time to identify and create these quality guidelines for the survey research field is now. AAPOR’s 2015 report identified numerous areas worthy of investigation, but one that warrants particular consideration is consent. Clear and comprehensible consent processes and language are vital to engender and maintain trust and continued participation, yet the request to access respondents’ digital data may result in lower response rates. As such, the language researchers use to communicate efforts to collect respondents’ digital data becomes paramount. As researchers, we must clearly communicate to respondents how and why we collect certain types of data and how those data will be used. Yet the world of digital data, e.g., what data can be collected and the methods of doing so, is complex and can be confusing. Therefore, models of appropriate consent are needed.
This presentation will propose considerations and guidelines in the area of consent, including sample language. These considerations and guidelines will make it possible for scholars and practitioners to begin to apply digital data in their survey research projects in ways that ensure the quality and credibility of the research.
Consent to link Twitter data to survey data: A comprehensive assessment
Dr Zeina Mneimneh (University of Michigan) - Presenting Author
Miss Fernanda Alvarado (mleiton@umich.edu)
Researchers across the social and computational sciences have recently been grappling with whether survey responses and social media data can complement or supplement each other and what differences exist between these two data sources in terms of their representation and measurement. These questions are best answered by linking the two data sources, with respondents’ consent to such linkage. The efficiency of such a design relies on a high rate of consent to maximize the linked information. Thus, identifying factors that affect consent to link these data sources is essential for design decisions. While a number of studies have investigated consent to social media data linkage, most focus on rates of consent, demographic predictors, and the effect of mode on consent rates (e.g., Al Baghal, Sloan, Jessop, Williams, & Burnap, 2019; Beuthner, Keusch, Menold, Schröder, Weiß, & Silber, 2019; Breuer, Stier, Siegers, Gummer, & Bleier, 2019; Murphy, Landwehr, & Richards, 2013; Wagner, Pasek, & Stevenson, 2015). Yet several other factors have been proposed to affect social media linkage, such as privacy concerns, risk aversion, and relevance of the task. These factors have not been investigated systematically within the same study using a probability sample.
In this presentation, using a probability-based online panel sample, we investigate consent to link survey data to Twitter data and assess the effect of a comprehensive set of factors that map onto proposed frameworks of data linkage consent. We also investigate the effect of the location of the consent statement on the consent rate, the time spent reading the consent statement, and any context effects on collected privacy measures arising from the location of the consent statement.
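To give a concrete picture of an analysis of this kind, the following is a minimal sketch, not the authors' actual design: it models a binary consent outcome as a function of hypothetical respondent-level factors (privacy concern, risk aversion, topic relevance, and placement of the consent statement) with a logistic regression on simulated data. All variable names and the data-generating assumptions are illustrative.

```python
# Minimal sketch (illustrative, simulated data): logistic regression of
# consent to data linkage on hypothetical respondent-level factors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "privacy_concern": rng.normal(0, 1, n),    # hypothetical scale score
    "risk_aversion": rng.normal(0, 1, n),      # hypothetical scale score
    "topic_relevance": rng.normal(0, 1, n),    # hypothetical scale score
    "statement_first": rng.integers(0, 2, n),  # consent statement placement (0/1)
})

# Simulated 0/1 consent outcome; in a real study this would be observed.
logit_p = -0.5 - 0.8 * df["privacy_concern"] + 0.4 * df["topic_relevance"]
df["consent"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "consent ~ privacy_concern + risk_aversion + topic_relevance + statement_first",
    data=df,
).fit()
print(model.summary())
```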
Different measurements of attitudes towards misconduct in science: Neutralizations, implicit association and direct questions
Mr Justus Rathmann (University of Zurich)
Mrs Antonia Velicu (University of Zurich) - Presenting Author
Professor Heiko Rauhut (University of Zurich)
Scientific misconduct, such as data fabrication, data falsification, and plagiarism, is an emerging topic in science studies and the scientific community, driven in part by recent calls for more transparency in academia. Even though such practices are widely regarded as unethical, scientific misconduct nevertheless happens. Understanding scientists’ attitudes towards scientific misconduct is therefore crucial for tackling it.
Attitudes towards socially undesirable practices, such as scientific misconduct, are difficult to measure. To improve the analysis, we strengthen our measurement by combining direct questions with the innovative application of theoretically derived techniques from criminology and survey methods borrowed from social psychology: neutralization techniques (Sykes & Matza, 1957) and the Implicit Association Test (IAT).
Criminology identifies five such techniques, originally used by juvenile delinquents to justify their deviant behavior: denying the responsibility, the injury, or the victim; condemning the condemners; and appealing to higher loyalties. These techniques have since been applied to many forms of deviant and unethical behavior. We deploy them to investigate how researchers justify scientific misconduct, as such justifications indicate a person’s attitude towards these practices.
In general, an IAT measures the relative strength of pairwise associations between concepts and attributes and has frequently been used in psychology to measure sensitive implicit attitudes. It can reveal information participants might want to hide due to social desirability because, in contrast to direct questions, it is very difficult to lie in the IAT.
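To make the IAT's scoring logic more concrete, here is a minimal sketch, an assumption on our part rather than the study's actual scoring procedure, of a simplified D-score in the spirit of Greenwald et al.'s algorithm: the difference in mean response latencies between the two pairing conditions, scaled by the pooled standard deviation, where faster responses under one pairing indicate a stronger implicit association. The reaction times below are hypothetical.

```python
# Minimal sketch: a simplified IAT-style D-score from response latencies.
import numpy as np

def d_score(lat_compatible, lat_incompatible, max_ms=10_000):
    """Mean latency difference between pairing conditions, scaled by the
    pooled standard deviation of all retained trials."""
    a = np.asarray(lat_compatible, dtype=float)
    b = np.asarray(lat_incompatible, dtype=float)
    a, b = a[a < max_ms], b[b < max_ms]       # drop overly slow trials
    pooled_sd = np.concatenate([a, b]).std(ddof=1)
    return (b.mean() - a.mean()) / pooled_sd  # positive = faster in "compatible" blocks

# Hypothetical reaction times (ms): misconduct paired with "failure" vs. "success".
rt_failure_pairing = [620, 580, 640, 700, 610]
rt_success_pairing = [760, 820, 700, 780, 740]
print(round(d_score(rt_failure_pairing, rt_success_pairing), 2))
```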
To answer our research questions, we draw on a unique, newly collected data set of scientists in Austria, Germany, and Switzerland, the Zurich Survey of Academics. The survey focuses on work and research environments, norms and practices of authorship, publication strategies, publication bias, and scientific misconduct, and it combines survey data with bibliometric registers as well as the researchers’ website data.
First, we introduce a group of direct questions asking for participants’ attitudes towards scientific misconduct. Second, we employ statements measuring different ways of legitimizing, or neutralizing, scientific misconduct. Third, we introduce a single-category IAT to investigate whether scientists associate practices of scientific misconduct with success or with failure.
By combining direct questions with justifications and neutralizations as well as implicit associations about scientific misconduct, our research aims to provide a better understanding of attitudes towards scientific misconduct. Further, it allows us to analyze the advantages and disadvantages of combining multiple implicit and explicit measurements of attitudes.
Preliminary results on a sub-sample of the data show that feeling uncomfortable with scientific misconduct is negatively correlated with neutralizing it. Similarly, feeling uncomfortable with scientific misconduct is correlated with associating it with failure rather than success in the IAT. These preliminary results indicate that the measurements from the innovative survey methods are consistent with the direct questions.