Evaluating the Method – Digital Ethnography

Uncovering longitudinal life narratives: scrolling back on Facebook

  1. The researchers explore the potential role of sustained social media use in longitudinal qualitative research. They want to uncover how ‘growing up’ stories are told and archived online, and how what people say and share on social media changes over time. They focus on understanding the ‘digital trace’ on Facebook timelines, arguing that ‘scrolling back’ through Facebook with participants as ‘co-analysts’ of their own digital traces can add to the qualitative longitudinal research (QLR) tradition. Finally, they argue for the inclusion of these often highly personal, deep, co-constructed digital texts in qualitative longitudinal research.
  2. They practice digital ethnography by studying young people in their twenties who have been using Facebook for more than five years. They first set out their framing of Facebook as an archive of life narratives. Next, they introduce their own Facebook Timelines study and the ‘scroll back method’ as the focus of their interviews, along with a discussion of ethics and limitations. Lastly, they bring the threads of the article together to advance their argument about the potential contribution of the ‘scroll back method’ to qualitative longitudinal research (QLR) and about how the method may be used to study other social media.

Ethnographic Research in a Cyber Era

  1. The researchers set out to explain why online spaces are needed to more fully understand physical environments and issues. They argue that studying a group of people in their “natural habitat” now includes their “online habitat” as well. Lastly, they urge ethnographers to consider how digital spaces inform the study of physical communities and social interactions.
  2. They ground their digital ethnography in two ethnographic studies, which they use to show how online spaces are needed to more fully understand physical environments and issues. They present ethnographic data to demonstrate the importance of considering online spaces, focusing on three specific online spaces to gather data: Facebook, Yelp, and corporate websites.

Both articles

3. Similarities: Both studies conducted digital ethnographies and examined the use of spaces on social media. They both examined the use of Facebook.

Differences: While both studies examine the use of Facebook, they examine it differently. In the first study, researchers sat down with participants and used the scroll back method, in which researcher and participant scrolled through the participant’s Facebook together, from the oldest posts up to the most recent ones. In the second study, three specific online spaces were examined to gather data: Facebook, Yelp, and corporate websites. After looking at a participant’s Facebook page, face-to-face interviews were conducted to see how Facebook influenced participants to act in certain ways. Yelp was examined to grasp participants’ habits, such as how they get their information online.

Strengths: The first study had a sound sampling procedure, studying young people in their twenties who had been using Facebook for more than five years. The second study included interviews, which are useful for answering in-depth questions about gendered space and labor. Conducting face-to-face interviews is a strong qualitative research method because it allows for deeper analysis.

Weaknesses: The first study was less comprehensive than the second, because no stand-alone face-to-face interviews were conducted.

Evaluating the Method – Twitter Data

Riots and Twitter: connective politics, social media and framing discourses in the digital public sphere

  1. The researchers show that connective action does not actually underplay differences between technologies, and that it sufficiently accounts for the cultural and ideological drivers of action. They got their data by analyzing software systems as well as public issues and discourse, in order to give a fuller account of connective politics during the riot clean-up movements. Their analysis shows that the clean-up movements were complex, discursive political acts that were heavily influenced by celebrities. Their findings show that the #RiotCleanUp hashtag credited with mobilising these groups provides a less compelling explanation of action than the #OperationCupOfTea hashtag. The researchers argue that networks assemble and mobilise through the activation of discourse within a wider media sphere of competing discourses (214).
  2. They use content coding and close textual reading techniques to characterise discourse within major riot-related hashtags (214). They use an empirical method to identify, analyse and compare hashtag-specific discourse critically. This includes three elements: 1) establishing an overview of discourse, which includes identifying the dominant hashtags during the relevant period; 2) differentiating between these hashtags in a way that supports a critical analysis of their relative influence while allowing for the interactive dynamics of the Twitter system; and 3) providing a mechanism for interpreting this influence in terms of connective action (revealing clues as to why some discourses energise group mobilisation and others do not). Their methodological approach was an inductive one: they looked at specific hashtags and tried to induce a theory about collective action. A minimal code sketch of the first element appears after this list.
  3. Strengths: They used seven different hashtags, which covered a range of material; multiple hashtags allow for more representative data on similar topics. Weaknesses: 1,000 tweets is a reasonable sample, but it could be increased to provide an even more representative one, although doing so would also be more time-consuming.
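For illustration, a minimal Python sketch of the first element of their method, establishing an overview of discourse by identifying the dominant hashtags in a tweet corpus, might look like the following. This is not the authors’ actual code; the file name riot_tweets.csv and the text column are hypothetical.

    import csv
    import re
    from collections import Counter

    HASHTAG_RE = re.compile(r"#\w+")

    def dominant_hashtags(csv_path, text_column="text", top_n=10):
        """Count hashtag occurrences across a tweet corpus and return the most frequent."""
        counts = Counter()
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Lower-case so #RiotCleanUp and #riotcleanup collapse into one tag
                counts.update(tag.lower() for tag in HASHTAG_RE.findall(row[text_column]))
        return counts.most_common(top_n)

    if __name__ == "__main__":
        for tag, count in dominant_hashtags("riot_tweets.csv"):
            print(f"{tag}\t{count}")

A frequency table like this only covers element 1; elements 2 and 3 (relative influence and interpretation in terms of connective action) still require the close textual reading the authors describe.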

Understanding a digital movement of opinion: the case of #RefugeesWelcome

  1. The researchers analyze the digital discussion around the Twitter hashtag #RefugeesWelcome as a case of a ‘digital movement of opinion’ (DMO). They got their data through a triangulation of network, content and metadata analysis, combining quantitative methods (Twitter network and metadata analyses) with more qualitative, text-based validations. They found that this DMO was driven primarily by social media elites whose tweets were then echoed by masses of isolated users. They then tested the post-DMO status of the hashtag-sphere after the November 2015 Paris terrorist attacks. The researchers argue that the concept of DMO provides a heuristically useful tool for future research on new forms of digital citizen participation because it (1) provides an ideal type that allows researchers to study empirical cases by observing their adherence to, and deviations from, the theoretical construct; (2) isolates the digital dimension of citizen participation, both as a methodological strategy and an epistemological posture; and (3) bridges public opinion and social movement theories, thereby helping apprehend new and future forms of collective action, arguably more networked but also more individualized.
  2. This study triangulates three methods: Twitter network analysis, metadata analysis, and qualitative content analysis. The researchers used a social media analytics tool designed for tracking Twitter content, producing a dataset of over 1 million tweets. Their methodological approach was an inductive one: they looked at specific hashtags and tried to understand DMOs. A minimal sketch of how a retweet network might surface the ‘elites’ they describe appears after this list.
  3. Strengths: The dataset was large (over 1 million tweets), which provides a representative sample of the population and leads to more generalizable results. Weaknesses: Going through 1 million tweets can be time-consuming, and there may have been other hashtags worth examining.
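To make the network element concrete, here is a minimal, hypothetical sketch (not the authors’ actual analysis) of how a directed retweet network could be used to identify ‘social media elites’ whose tweets are echoed by masses of otherwise isolated users. The edge list and account names are invented for illustration, and the networkx library is assumed.

    import networkx as nx

    def retweet_elites(retweet_edges, top_n=10):
        """Rank accounts by in-degree in a retweeter -> original-author network,
        i.e. by how often their tweets were echoed by others."""
        g = nx.DiGraph()
        g.add_edges_from(retweet_edges)
        return sorted(g.in_degree(), key=lambda pair: pair[1], reverse=True)[:top_n]

    def isolated_share(retweet_edges):
        """Share of accounts that only retweet others and are never retweeted themselves."""
        g = nx.DiGraph()
        g.add_edges_from(retweet_edges)
        isolated = [n for n in g.nodes if g.in_degree(n) == 0 and g.out_degree(n) > 0]
        return len(isolated) / g.number_of_nodes()

    # Toy data: ordinary users echoing two highly visible accounts
    edges = [("user_a", "elite_1"), ("user_b", "elite_1"),
             ("user_c", "elite_2"), ("user_d", "elite_1")]
    print(retweet_elites(edges, top_n=2))  # [('elite_1', 3), ('elite_2', 1)]
    print(isolated_share(edges))           # 4 of 6 accounts are never retweeted

A highly skewed in-degree distribution combined with a large share of never-retweeted accounts is one way to operationalize the elite-driven, individualized participation the authors describe.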

Evaluating the Method

Keeping it in “the family”: How Gender Norms Shape U.S. Marriage Migration Politics

1. Summarize briefly each study: what are the researchers trying to accomplish? Where do they get their data? What are their findings? What are they arguing?

The researcher examines how gendered standards of legitimacy are applied to both family and sexuality. She also shows how these gendered standards of legitimacy are used by petitioners to establish genuineness and to define the red flags that indicate potential marriage fraud.

The researcher got her data through an online ethnography and textual analysis of conversation threads on a large online immigration forum where U.S. petitioners exchange such information. She used an analytic process called constructivist grounded theory to incorporate members’ stories into the analysis and determine the “what, how, and why” of their evaluations (Charmaz 2008). Through two years of ethnographic immersion as a bystander, she learned the site’s social dynamics, terminology, and common themes, and became aware of how members used the term red flag as the opposite of a genuine relationship. To analyze red flags, she downloaded all the posts and sorted them quantitatively by forum. She then wrote a Python script to collect, clean, and conduct a keyword search for the term “red flag”.
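Longo’s script itself is not reproduced in the article, but a minimal sketch of that kind of keyword search might look like the following. The file name forum_posts.jsonl and the forum/body field names are assumptions made for illustration, not details from the study.

    import json
    import re
    from collections import Counter

    RED_FLAG_RE = re.compile(r"\bred[\s-]?flags?\b", re.IGNORECASE)

    def clean(text):
        """Strip leftover HTML tags and collapse whitespace from a scraped post."""
        text = re.sub(r"<[^>]+>", " ", text)
        return re.sub(r"\s+", " ", text).strip()

    def red_flag_counts(jsonl_path):
        """Count posts mentioning 'red flag' in each forum of the downloaded archive."""
        counts = Counter()
        with open(jsonl_path, encoding="utf-8") as f:
            for line in f:
                post = json.loads(line)
                if RED_FLAG_RE.search(clean(post["body"])):
                    counts[post["forum"]] += 1
        return counts

    if __name__ == "__main__":
        for forum, count in red_flag_counts("forum_posts.jsonl").most_common():
            print(f"{forum}\t{count}")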

The findings show that “discursive negotiations in virtual spaces are consequential for re-imagining intersectionally gendered citizenship and the policing of national identities and borders.”

She argues that forum members will police immigration requests even before cases reach an immigration officer. “Petitioners use the formal criteria of U.S. immigration in ways that reveal gender ideologies, expectations for conformity to a gendered hegemonic family ideal, and sexual double standards surrounding sexual agency, fertility, and desirability. These intersectional norms shape members’ online discussions about the suitability of marriages and of the migration of noncitizen partners to the United States.”

Speaking ‘unspeakable things’: documenting digital feminist responses to rape culture

1. Summarize briefly each study: what are the researchers trying to accomplish? Where do they get their data? What are their findings? What are they arguing?

The researchers examine the ways in which girls and women use digital media platforms to challenge the rape culture they experience in their everyday lives, including street harassment, sexual assault, and the policing of bodies and clothing in school settings. They ask three main questions: What experiences of harassment, misogyny and rape culture are girls and women responding to? How are girls and women using digital media technologies to document experiences of sexual violence, harassment, and sexism? And why are girls and women choosing to mobilize digital media technologies in this way?

The researchers get their data from ethnographic methods such as semi-structured interviews, content analysis, discursive textual analysis, and analysis of affect. They use the anti-street harassment website Hollaback! to map both the types of misogyny, sexism, and harassment that women frequently encounter and women’s responses to them.

The findings show that the mediation of marginalized voices produces social, cultural, and political possibilities. Social media platforms have given girls and women spaces and opportunities to connect, share their experiences, and find solidarity with one another. The findings also show how teens responded to the complexity of real, embodied sexism and expressions of rape culture at school through creative and innovative uses of Twitter.

The researchers argue that digital mediation enables new connections that were previously unavailable to girls and women, allowing them to redraw the boundaries between themselves and others. Feminist activism has become more visible in media culture, and digital culture has the radical potential of reanimating feminist politics both online and off-line.

2. How are they using content analysis? What are they coding for? What is their methodological approach to coding? (Inductive? Deductive?)

Longo uses content analysis by coding for the number of red flags that appear in the online discussion forums. Her methodological approach to coding is inductive, because she seeks to derive a theory that explains the patterns in the data she collected.

Keller, Mendes, and Ringrose use content analysis by looking at posts on the anti-street harassment website Hollaback!, at the Twitter hashtag #BeenRapedNeverReported, and at how teen feminists use Twitter. Their methodological approach to coding is also inductive: they examined social media content and built theory from it.
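As a hypothetical illustration of how inductively derived codes might be tallied once they have emerged from reading the material, consider the sketch below. The code labels and indicator phrases are placeholders invented for this example, not the authors’ actual categories.

    from collections import Counter

    # Placeholder codebook: in an inductive design these labels emerge from
    # repeated reading of the posts, not from a pre-set theory.
    codebook = {
        "street_harassment": ["followed me", "catcall", "yelled at me"],
        "school_dress_code": ["dress code", "sent home", "distracting"],
        "solidarity": ["me too", "not alone", "thank you for sharing"],
    }

    def code_post(text, codebook):
        """Return every code whose indicator phrases appear in the post."""
        lowered = text.lower()
        return [code for code, phrases in codebook.items()
                if any(phrase in lowered for phrase in phrases)]

    def tally(posts, codebook):
        """Frequency of each emergent code across a set of posts or tweets."""
        counts = Counter()
        for post in posts:
            counts.update(code_post(post, codebook))
        return counts

    posts = ["A man followed me home and catcalled from his car.",
             "I was sent home for violating the dress code."]
    print(tally(posts, codebook))  # Counter({'street_harassment': 1, 'school_dress_code': 1})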

3. What are the strengths and weaknesses of the methods that they are using? What do they capture well? What do you think they are missing out on? If you were to conduct this study, would you do anything differently? What and why or why not?

Strengths: In both studies, the data could be obtained easily; it was readily accessible online through forums and social media platforms like Twitter.

Weaknesses: Collecting data for both could have been time consuming. In Longo’s study, she was only able to look at a snapshot of people’s lives. Keller, Mendes, and Ringrose could have looked at other social media apps to collect data. They also could have looked at other hashtags.

If I were to conduct Longo’s study, I would follow the people who post and see if their interactions are similar in their other forum posts/communications.

If I were to conduct Keller, Mendes, and Ringrose’s study, I would look at other social media apps and hashtags.

Ethical Digital Research

What does ethical digital research mean to you?

Ethical digital research consists of minimizing and avoiding harm and risks whenever possible, maximizing benefits, obtaining informed consent from participants, keeping participants’ information anonymous or confidential during and after the research, allowing participants to withdraw at any time, and debriefing participants after the research is done. This is often tricky because, as Marie Wallace notes in the video, analytics can take public data and derive private information from it; for example, analysts can monitor daily habits to find your location. Therefore, social media platforms and other sites and corporations should be more transparent about how they mine data from their users.

 

Given your knowledge of the IRB, do you think that they ensure ethical digital research as defined by you? Why or why not?

The article The Historical, Ethical, and Legal Background of Human-Subjects Research states that the main role of IRBs is to protect the rights and welfare of human subjects (Rice 1329): “The ultimate goal of the IRB, however, parallels that of the researcher; both are charged with ensuring that human subjects research is conducted ethically, with sound scientific rationale, to maximize benefits and minimize risks.” Given my knowledge of the IRB, I do not think it can ensure ethical digital research, because many social media sites and other corporations extract data from their users in the background, so users do not know data is being collected on them.

 

Is it even possible to protect human subjects in digital research? Is there a point in digital research, particularly when examining ‘big data’, where we can truly say that human subjects aren’t affected? If so, what is that threshold?

It is not entirely possible to protect human subjects in digital research. The article Ethics as Methods: Doing Ethics in the Era of Big Data Research—Introduction states that “Cambridge Analytica, a large data mining and analysis firm, was able to access personal details of 50 million Facebook users without their direct permission or knowledge.” Because of this, I think that big data will always affect human subjects, often without their knowledge. Therefore, social media companies should take accountability by being transparent with their users.

 

What are some things that you can do as a digital sociologist to protect the human subjects in your own research projects?

Some things I can do as a digital sociologist to protect human subjects in my own research projects include:

  • Obtaining consent before conducting research
  • Being open and transparent about how I will be collecting, analyzing, and using data
  • Keeping subjects anonymous or confidential
  • Minimizing risk
  • Maximizing benefits
  • Allowing subjects to withdraw at any time
  • Debriefing subjects afterwards