One of the key criticisms of the social sciences by those in the physical sciences and by philosophers of science (see Karl Popper’s critiques of pseudosciences) is the lack of rigor in social science research. This lack of rigor compared to the physical or “hard” sciences is not due to a lack of empirical evidence to collect or falsifiable tests to run but is a natural consequence of the subject matter being examined: human behavior. Social science research must be aware of the ethical implications as it attempts to understand the human mind, society, culture, events, and so on. A biologist studying an animal in its natural habitat can observe and take notes, take samples of the animal’s droppings or blood, and then leave with the animal barely wondering what happened. A human being would be greatly concerned by comparable occurrences.
As sapient beings, most people argue that humans have the right not to be monitored or have data collected on them without their consent. These expectations of privacy and personal autonomy are ethical considerations for social science researchers. The mid-20th century saw the development of Institutional Review Boards (IRBs) to ensure that any human-subject research met ethical guidelines and principles such as those found in the Nuremberg Code, the Declaration of Helsinki, and the Belmont Report (Rice 2008). The Belmont Report produced four ethical research requirements that follow from its fundamental principle of respect for persons:
- Participants must voluntarily consent to participate in research.
- The consent must be informed consent.
- Participants’ privacy and confidentiality must be protected.
- Participants have the right to withdraw from research participation without penalty or repercussions.
These requirements remain excellent ethical guidelines for social science research, but the newest domain of social research, the digital domain, has made it easy to violate their spirit, calling into question whether ethical digital research can be done at all. This is not just an issue of what data can be accessed online but of the scale and speed at which it can be analyzed. Our daily habits can be recorded and made publicly available, by us or by the platforms we interact with. The use of social media can blur the first requirement above regarding voluntary consent. People submit census data voluntarily and they interact with social media voluntarily, so why not use the data? Aside from the issue of public versus private collection of data, there are possible conflicts with the ethical requirements above. First, what the first requirement asks to be consented to is “participation in the research.” This can be obfuscated with lengthy end-user and terms-of-service agreements, blurring the second requirement, so that those collecting the data can claim users have given informed consent.
Second is the effort, or lack thereof, given to the third and fourth requirements. What constitutes privacy protection is the subject of constant litigious debate, entangled with issues of public safety and bureaucratic procedure. While a person does not have the ability to withdraw from research using publicly available census data, those datasets do not contain information that would violate their privacy or confidentiality. Research using digitally collected information often has data on individuals, and those individuals are typically unable to withdraw from that research. Even if the data is anonymized, repeated comparisons to other datasets may eventually reveal the identity of the individual represented in a particular case (Wallace 2014). Privacy can be violated almost immediately when data collection is done by software that quickly scans, combs, and selects relevant data as specified by the researcher, which is then quickly analyzed by statistical programs. Information someone thought was accessible to only a few people has become a data point in a database that they did not consent to be a part of.
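The linkage risk described above can be made concrete with a small sketch. The records and field names below are entirely invented for illustration; the point is only that quasi-identifiers shared across datasets (here, ZIP code, birth year, and sex) can re-attach names to an “anonymized” dataset.

```python
# An "anonymized" dataset: names stripped, but quasi-identifiers kept.
# All records are fabricated for this example.
health_records = [
    {"zip": "02138", "birth_year": 1985, "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
    {"zip": "02138", "birth_year": 1972, "sex": "M", "diagnosis": "flu"},
]

# A separate, public dataset (e.g. a hypothetical voter roll) with names.
public_records = [
    {"name": "Alice Example", "zip": "02138", "birth_year": 1985, "sex": "F"},
    {"name": "Bob Example", "zip": "02139", "birth_year": 1990, "sex": "M"},
]

def link(anonymized, public, keys=("zip", "birth_year", "sex")):
    """Join two datasets on shared quasi-identifiers,
    re-attaching names to supposedly anonymous records."""
    matches = []
    for a in anonymized:
        for p in public:
            if all(a[k] == p[k] for k in keys):
                matches.append({"name": p["name"], "diagnosis": a["diagnosis"]})
    return matches

reidentified = link(health_records, public_records)
# Two of the three "anonymous" patients are now named individuals.
print(reidentified)
```

No single dataset here violates anyone’s privacy on its own; the violation emerges only from the comparison, which is exactly why anonymization within one dataset is not a sufficient protection.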
Digital research can be done ethically, but the requirements of ethical research outlined in the Belmont Report need to be treated as a continuous whole and not a step-by-step procedure. If a person’s digitally generated data is being used in any research, then each step of that research needs to meet the spirit of these requirements. Since digital research now effectively involves everyone in the world, these guidelines, set up to complement and constrain scientific rigor, must now be considered more general regulation on how data analysis is performed. The digital domain has opened access to the individual at a scale that areas of social research have not had before. Marketing, propaganda, and news can now be targeted at individuals almost instantaneously. Ethics in digital research is more than just a question of how social science research is performed; it is also a question of the importance placed upon personal privacy by society and those in power.
IRBs are insufficient to ensure the protection of people in human-subject digital research, primarily because not everyone partaking in human-subject digital research is required to use them. This is complicated by research done for marketing purposes, since the goal of that research is profit, not knowledge. Additionally, private think tanks do not necessarily need to use IRBs if they do their own data collection. Within academic and governmental research, IRB members and researchers may not fully understand how digital data collection can have different implications than physical observation. If the ethical protections IRBs are supposed to provide are not instituted as industry regulation for private organizations engaging in human-subject digital research, then IRBs exist as a half-measure in protecting participants in any social science research as all data is digitized. These considerations should be part of data production as it is generated on the digital platforms we use, not just a review of proposed data collection.
It is possible to protect human subjects in digital research, but it would require broader government regulation than there is currently political will to enact. Long and obtuse terms-of-service agreements would have to be made more accessible. Privacy protections would have to be guaranteed as a matter of everyday living while we are constantly surveilled by our digital devices. The questions become “what kind of privacy protection?” and “at what point is digital data sufficiently anonymized?” Like most matters of scale, there is a gray area and no clear threshold at which data has been transformed such that individual people can no longer be identified. A single dataset may be sufficiently anonymous within itself, but when compared to other overlapping datasets, privacy could still be violated.
Individual researchers can take their own precautions after clearing their IRBs, constantly re-checking that their procedures and methods provide the protection of people’s identities and well-being required by ethical guidelines like the ones presented above. If we are using data scrubbers, we need to make sure the dataset we put together has had all unessential, cross-referenceable data removed from it. That can be anything from travel data to habits irrelevant to the research. If a demographic marker is not a variable in our tests, remove it. If we are collecting data ourselves or intend to quote someone in publishable work, contact the individuals. Even if we intend to anonymize a quote as “20-year-old on Twitter,” that quote can be searched for. If the research topic is a sensitive one, critics and detractors of the person being quoted may be able to find them.
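The scrubbing rule described above, keep only the variables the research design actually tests and drop everything cross-referenceable, can be sketched as a simple allow-list filter. The field names and records below are hypothetical; an allow-list (rather than a block-list) is the safer design because any field not explicitly justified by the research is removed by default.

```python
# Variables our hypothetical tests actually use; everything else is dropped.
ALLOWED_FIELDS = {"age_group", "response_score"}

# Fabricated raw records with cross-referenceable extras
# (locations, travel habits, social media handles).
raw_records = [
    {"age_group": "18-24", "response_score": 4,
     "home_city": "Springfield", "commute_route": "Bus 12",
     "handle": "@someuser"},
    {"age_group": "25-34", "response_score": 2,
     "home_city": "Shelbyville", "commute_route": "I-95",
     "handle": "@otheruser"},
]

def scrub(records, allowed):
    """Return copies of each record containing only allow-listed fields."""
    return [{k: v for k, v in r.items() if k in allowed} for r in records]

clean = scrub(raw_records, ALLOWED_FIELDS)
print(clean)  # no handles, locations, or travel habits remain
```

This is only a first pass; as the linkage problem discussed earlier shows, even a scrubbed dataset should still be checked against what other datasets could be joined to it.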
The social sciences lack the supposed firmness of the physical sciences (which are pretty firm) and are derided as “soft.” However, this “softness” is important because the subjects of the social sciences are our fellow human beings. We must have a “soft touch” when it comes to the invasive and intimate procedures that are a part of social science research. We can inject ourselves into others’ lives and ask them to pretend we aren’t there, or hide among the crowds to silently observe and record. The golden rule, do to others as you would have them do to you, is the ethic of reciprocity and sits above all other ethics. If we wish to be respected and have privacy, we must treat others with respect and give them the privacy each individual deserves.