Defining “Connected Learning” through Educational Research Literature

What is connected learning?
I admit that I have always answered that question using the Connected Learning Research Network's Connected Learning: An Agenda for Research and Design.  But when I got systematic about it and searched the academic databases beyond Google Scholar, "Connected Learning" elicited a surprising number of peer-reviewed journal articles, some predating my favorite report by almost 20 years.  So I opted to systematically review relevant literature in ERIC, Educational Research Complete (ERC), and Academic Search Complete (ASC).  
Because of my area of study, I narrowed the ERIC search to "Connected Learning" in "Higher Education" and the ERC and ASC searches to "Connected Learning" and "Universities."  Altogether this yielded 102 articles (ERIC, n = 28; ERC, n = 43; ASC, n = 31).  Despite my search restrictions, a significant number of these articles related to K-12 learning environments.  After manually removing those articles, along with duplicates and other irrelevant nonsense, 35 articles published between 1997 and 2014 remained.   
The purpose of my review was to:
1. Understand how the term "connected learning" is used in the educational research field beyond the Digital Media and Learning Research Hub and the Connected Learning Alliance.
2. See how the term may have evolved over time.
3. See how the Digital Media and Learning Research Hub framework for connected learning practice and research (Ito et al., 2013) might have influenced the use of the term in the research.
4. And really, because I need to define "connected learning" for my dissertation.
Six themes emerged as I read through the articles.  
From 1997 to 2014, “Connected Learning” has meant:
1. Online.  To some researchers, "connected" is directly related to digital technology, with less emphasis on community-building or curriculum design. Chronologically, these articles are isolated to the second half of the 2000s.  An example: Getting Connected: Learning from External Early Childhood Education Students Perceptions of Their Study Experiences (Lennox, Davis, & Heirdsfield, 2006).  This work describes the experience of online college students who, among other things, desired more consistent tech support around university websites and online learning resources.
2. Peer-Supported.  To other researchers, "connected" is all about creating a shared learning experience, whether digital technology is present or not.  In some cases, connected learning was synonymous with collaborative learning (Franklin, 2000; Scholz & Wang, 2009).  Others, like Barr, Dailey, Gordon, & Sinke (2013), went as far as to link connected learning to connected knowing, in the Women's Ways of Knowing sense.  The home stay: A gendered perspective (Gutel, 2008) is another example of this.
3. Community-Based. Not to be confused with building a peer-supported community, community-based connected learning is about moving learning beyond the classroom to the larger community.  Sometimes this was akin to service learning: the idea that the best learning occurs in meaningful "real world" situations (Hampton, Liguori, & Rippberger, 2003).  Alternatively, "community" could mean that learning should overlap with the student's everyday reality (Nam, 2013). 
4. Interdisciplinary. This is also a "beyond the classroom" theme, but these articles emphasize general education. In other words, students should be connecting what they learn across disciplines and classes into a more cohesive educational program (I see this argument as the basis for the portfolio movement and interdisciplinary studies, if that helps you understand what I'm talking about).  Examples: Boxer, 1998; Smith & Morgaine, 2004.
5. Library/Informal Spaces.  This is a cross-cutting theme that I just had to bring up; particularly in recent years, it appears that librarians are making a move to promote connected learning.  Go librarians, fight, win. My inclusion of this category meant that some articles were double counted in the graph below, so don't worry, I know my numbers don't exactly match up. Example: Ito & Martin, 2013. 
6. CLA-Framework.  That's Connected Learning Alliance, and it's my shorthand for articles that pulled all the concepts together: as Kelleher & Swartzlander (2009) put it, connected learning is "active, connected, and communicated."  Or as Ito et al. (2013) put it, connected learning operates through six core design concepts: interest-powered, peer-supported, academically-oriented, production-centered, shared purpose, and openly-networked.  AND all of these things are facilitated through digital technology.  This is a synthesis, really, of all the other categories: a much more holistic, fuller, more complex understanding of connected learning.  Not all of these articles directly reference the CLA or Ito et al. (2013), but most of the papers published in the last two years do.  And note the trend over the last five years toward a more CLA-like definition of connected learning.
Oh, and I made a graph.
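To make the double counting concrete: if each article carries one or more theme tags and the graph tallies tags rather than articles, an article with two tags contributes to two bars. A minimal sketch of that tally in Python (the article-to-theme assignments below are illustrative guesses for four of the papers, not my actual coding of all 35):

```python
from collections import Counter

# Hypothetical theme tags for a handful of the reviewed articles.
# (Illustrative only -- not the actual coding of the 35 papers.)
article_themes = {
    "Lennox et al. (2006)": ["Online"],
    "Barr et al. (2013)": ["Peer-Supported"],
    "Ito & Martin (2013)": ["CLA-Framework", "Library/Informal Spaces"],
    "Valdivia & Subramaniam (2014)": ["CLA-Framework", "Library/Informal Spaces"],
}

# Tally how often each theme appears. Because an article can carry more
# than one tag, the tag total can exceed the article count -- which is
# exactly why the numbers in the graph don't add up to 35.
theme_counts = Counter(
    theme for themes in article_themes.values() for theme in themes
)

print(theme_counts["Library/Informal Spaces"])  # 2
print(sum(theme_counts.values()))  # 6 tags across 4 articles
```

Grouping the same tags by publication year instead would reproduce the trend line toward the CLA-style definition.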

So, to answer my own questions:
Connected learning has always been about connecting people and ideas across space, time, and spheres of influence.  In the early days, researchers often focused on one aspect of that definition, or on the technological vehicles through which connected learning could occur.  Since the Connected Learning Alliance came into being, much of the research around connected learning has taken a fuller, more holistic, more complex approach to connecting across space, time, and spheres of influence.  In some respects, this is a good thing.  I think the newer definition is fresher, more exciting and engaging, and more likely to make an impact on the world as we know it.  However, from a research perspective, the more complicated definition can make life more difficult; finding environments to study that fulfill the new definition of connected learning, for example, can be hard.  In my research I might have to step back from the CLA framework just a bit, just so I have something to study.  
Critique is absolutely 100% welcome.  In fact, it’s requested.
References Reviewed 
(Duplicates were removed from the ERIC list.  It was an arbitrary decision.)
ERIC
Abel, R., Brown, M., & Suess, J. (2013). A New Architecture for Learning. EDUCAUSE Review, 48(5), 88.
Barnett, J., McPherson, V., & Sandieson, R. M. (2013). Connected Teaching and Learning: The Uses and Implications of Connectivism in an Online Class. Australasian Journal of Educational Technology, 29(5), 685–698.
Buchanan, J. (2008). Developing Leadership Capacity through Organizational Learning. Journal of College Teaching & Learning, 5(3), 17–24.
Gutel, H. (2008). The Home Stay: A Gendered Perspective. Frontiers: The Interdisciplinary Journal of Study Abroad, 15, 173–188.
Haigh, M. (2014). From Internationalisation to Education for Global Citizenship: A Multi-Layered History. Higher Education Quarterly, 68(1), 6–27.
Kilgore, D. (2004). The Medium Is the Message: Online Technology and Knowledge Construction in Adult Graduate Education. Adult Learning, 15, 12–15.
Lee, J. S., & Anderson, K. T. (2009). Negotiating Linguistic and Cultural Identities: Theorizing and Constructing Opportunities and Risks in Education. Review of Research in Education, 33(1), 181–211.
Maivorsdotter, N., & Lundvall, S. (2009). Aesthetic Experience as an Aspect of Embodied Learning: Stories from Physical Education Student Teachers. Sport, Education and Society, 14(3), 265–279.
McElvaney, J., & Berge, Z. (2009). Weaving a Personal Web: Using Online Technologies to Create Customized, Connected, and Dynamic Learning Environments. Canadian Journal of Learning and Technology, 35(2).
Rae, J., Taylor, G., & Roberts, C. (2006). Collaborative Learning: A Connected Community for Learning and Knowledge Management. Interactive Technology and Smart Education, 3(3), 225–233.
Sidelinger, R. J., & Booth-Butterfield, M. (2010). Co-Constructing Student Involvement: An Examination of Teacher Confirmation and Student-to-Student Connectedness in the College Classroom. Communication Education, 59(2), 165–184.
Spurgin, D. G., & Childress, M. D. (2009). Effects of University and Departmental Community on Online Learners. EDUCAUSE Quarterly, 32(4).
EBSCO (an aggregator that searched ASC and ERC)
AAC&U on Purposeful Pathways and Connected Learning in the Disciplines. (2003). Peer Review, 6(1), 20–20.
Abrams, S. S., & Gerber, H. R. (2014). Cross-Literate Digital Connections: Contemporary Frames for Meaning Making. English Journal, 103(4), 18–24.
Andrews, T., Tynan, B., & James, R. (2011). The lived experience of learners’ use of new media in distance teaching and learning. On the Horizon, 19(4), 321–330. doi:10.1108/10748121111179448
Barr, S., Dailey, M., Gordon, P., & Sinke, K. (2013). Connected Knowing–Connected Dancing. Interdisciplinary Humanities, 30(1), 59–71.
Bilandzic, M., & Johnson, D. (2013). Hybrid placemaking in the library: designing digital technology to enhance users’ on-site experience. Australian Library Journal, 62(4), 258–271. doi:10.1080/00049670.2013.845073
Bowen, K. (2011). Mixable: A Mobile and Connected Learning Environment. EDUCAUSE Quarterly, 34(1), 4–4.
Boxer, M. J. (1998). Remapping the university: The promise of the women’s studies Ph.D. Feminist Studies, 24(2), 389.
Capuano, N., Salerno, S., Mangione, G. R., & Pierri, A. (2014). Semantically connected learning resources fostering intuitive guided learning. International Journal of Continuing Engineering Education & Lifelong Learning, 24(2), 122–140. doi:10.1504/IJCEELL.2014.060153
Nam, C. (2013). When New Media Meet the Strong Web of Connected Learning Environments: A New Vision of Progressive Education in the Digital Age. International Journal of Progressive Education, 9(2), 21–33.
Commander, N. E. (2003). A Model for Strategic Thinking and Learning. About Campus, 8(2), 23.
Creighton, S. (2006). University of Dayton’s Fitz Center: Leadership in Building Community. Metropolitan Universities, 17(1), 75–83.
Gerber, H. R., Abrams, S. S., Onwuegbuzie, A. J., & Benge, C. L. (2014). From Mario to FIFA: what qualitative case study research suggests about games-based learning in a US classroom. Educational Media International, 51(1), 16–34. doi:10.1080/09523987.2014.889402
Hampton, E., Liguori, O., & Rippberger, S. (2003). Binational Border Collaboration in Teacher Education. Multicultural Education, 11(1), 2–10.
Ito, M., & Martin, C. (2013). Connected Learning and the Future of Libraries. Young Adult Library Services, 12(1), 29–32.
Jones, J. (2007). Connected Learning in Co-operative Education. International Journal of Teaching & Learning in Higher Education, 19(3), 263–273.
Kelleher, F. A., & Swartzlander, S. (2009). Action, Connection, Communication: The Honors Classroom in the Digital Age. Journal of the National Collegiate Honors Council, 10(2), 49–52.
King, P. M., & Schroeder, C. C. (1997). From the Editors. About Campus, 1(6), 1.
Lennox, S., Davis, J., & Heirdsfield, A. (2006). Getting Connected: Learning from External Early Childhood Education Students Perceptions of their Study Experiences. Journal of Early Childhood Teacher Education, 27(3), 275–289. doi:10.1080/10901020600843632
Long, D. D., & Shobe, M. A. (2010). Lessons Learned from Preparing Social Workers for Grant Writing via Connected Learning. Administration in Social Work, 34(5), 392–404. doi:10.1080/03643107.2010.512837
Neubauer, B. J., Hug, R. W., Hamon, K. W., & Stewart, S. K. (2011). Using Personal Learning Networks to Leverage Communities of Practice in Public Affairs Education. Journal of Public Affairs Education, 17(1), 9–25.
Paige, K., Lloyd, D., & Chartres, M. (2008). Moving towards transdisciplinarity: an ecological sustainable focus for science and mathematics pre-service education in the primary/middle years. Asia-Pacific Journal of Teacher Education, 36(1), 19–33. doi:10.1080/13598660701793350
Rowan, D., Rankopo, M., Kabwira, D., Long, D. D., & Mmatli, T. (2012). Using Video as Pedagogy for Globally Connected Learning About the Hiv/Aids Pandemic. Journal of Social Work Education, 48(4), 691–706.
Scholz, J. T., & Wang, C.-L. (2009). Learning to Cooperate: Learning Networks and the Problem of Altruism. American Journal of Political Science, 53(3), 572–587. doi:10.1111/j.1540-5907.2009.00387.x
Sims, R. (2008). Rethinking (e)learning: a manifesto for connected generations. Distance Education, 29(2), 153–164. doi:10.1080/01587910802154954
Smith, C. L., & Morgaine, C. (2004). Liberal Studies and Professional Preparation: The Evolution of the Child and Family Studies Program at Portland State University. Child & Youth Care Forum, 33(4), 257–274.
Valdivia, C., & Subramaniam, M. (2014). Connected Learning in the Public Library: An Evaluative Framework for Developing Virtual Learning Spaces for Youth. Public Library Quarterly, 33(2), 163–185. doi:10.1080/01616846.2014.910727
Watson, L. (2007). Building the Future of Learning. European Journal of Education, 42(2), 255–263. doi:10.1111/j.1465-3435.2007.00299.x
Wilensky, U., & Reisman, K. (2006). Thinking Like a Wolf, a Sheep, or a Firefly: Learning Biology Through Constructing and Testing Computational Theories: An Embodied Modeling Approach. Cognition & Instruction, 24(2), 171–209. doi:10.1207/s1532690xci2402_1
Wurl, J. (2014). Connected Learning and the Library: An Interview with Kristy Gale. Young Adult Library Services, 12(4), 19–21.

Ethics of Learning Analytics

Despite robust exposure to ethics and human subjects research in a variety of contexts and settings, I still get confused about IRB matters sometimes. Also, I know that designing and performing internet research of any variety is an ethical tangle, which intensifies my lack of confidence.  I learned today that my anxiety might be related to the low level of legal maturity in learning analytics research. In other words, sentinel cases and other forms of precedent are still rapidly accruing, leading to high levels of moral ambiguity and a tendency for researchers to operate almost as they please until the law or public outcry tells them they have made a mistake.  

Since I prefer to reduce my risk of becoming (ethically) notorious, I decided to take time out of my research activity to read about internet research ethics in general and learning analytics research ethics specifically.  Two posts ago, I summarized the Association of Internet Researchers (AoIR) 2012 working committee report on Internet Research Ethics.  It presented three internet-specific ethical concerns:   

1. Studying participants in internet culture is inherently different from studying human "subjects" in a clinical trial.  There are different expectations for control and meaningful participation among many people on the internet as compared to those submitting to medical testing.  Internet participants are less vulnerable and possibly should be treated more like artists than vessels of study.
2. Privacy in the context of the internet is different from privacy in other contexts. 
3. It is challenging to identify the owners of data, who may (or may not) change over time and space.

This week, I was blessed with additional, learning analytics related ethics constructs to consider: 

On August 25, EDUCAUSE Review published a new ethical framework for learning analytics, which builds on the Pardo & Siemens article on ethical and privacy principles for learning analytics, published just a few months ago.  The internet gods were sending me a message.

The conceptual frameworks described in the two learning analytics (LA)-specific articles are consistent with each other and with the AoIR report, although after reading the two LA papers, I concluded that the AoIR document is primarily meant for internet ethnographic researchers.  My assumption comes from the authors' long, Lincoln & Guba-reminiscent approach to "human subject research."  In contrast, Pardo & Siemens deemphasized the issue of human subjects altogether, suggesting that "[t]he notion of subjects being put at risk by a study…is not that evident in learning analytics where large, often anonymized data sets are used for analysis" (pp. 442–443).

Issues of privacy, however, are present in all three articles. Pardo & Siemens go into more detail on privacy than the AoIR working paper, describing data collection as a "transaction" involving issues of trust, ownership, control (the capability to influence the flow of personal information), and limitations (the possibility of preventing others from accessing information).  The article provides a legal framework for guiding LA policy, which includes: 

1. Transparency
2. Student control over data
3. Security
4. Accountability and Assessment

Questions remain, particularly around the rights of students to change their data (control) once they see it (transparency).  This reminds me of the conversations qualitative researchers have about member checking.  

Also, for clarification, "accountability" has to do with using the information effectively, while "assessment" relates to institutional commitment to ongoing program evaluation and refinement.

If you don’t like this framework, there’s always Slade & Prinsloo (2013), which, despite being specific to LA, points out the importance of contextualizing data, just as the AoIR working committee report did.  The broad categories of ethical consideration (later broken down into six principles) are: 

1. The location and interpretation of data
2. Informed consent, privacy, and the deidentification of data
3. The management, classification, and storage of data

And then, finally, the article that triggered the email that made me think the internet gods were smiling down on my desire to be ethical: Willis’ new framework for learning analytics and ethics. While Slade & Prinsloo and Pardo & Siemens focused on legal frameworks, Willis went the philosophical route.  This article does a lovely job of placing learning analytics within all the ethical lenses we know and love: utilitarianism, nihilism, utopianism, and ambiguity.  There’s even a shout-out to care ethics, which, to me, is a major component of the best motivations for learning analytics: LA researchers aim to provide a better, more holistic, more personalized education for all students.  Or perhaps that’s just me being utopian.

I’ve just scratched the ethical surface, but I feel comfortable enough with my grasp of the major issues (if not any particular solutions) that it is time to move on. Next week (probably, if there’s no intervening email pointing me in a different direction): What are Learning Analytics?