Learning Analytics in Higher Education

New Directions for Higher Education, Number 179
 
 
Jossey-Bass (publisher)
  • published September 28, 2017
  • 120 pages
 
E-book | ePUB with Adobe DRM | system requirements
978-1-119-44383-4 (ISBN)
 
Gain an overview of learning analytics technologies in higher education, including broad considerations and the barriers to introducing them. This volume features the work of practitioners who led some of the most notable implementations, like:
* the Open Learning Initiative now at Stanford University,
* faculty-led projects at the University of Michigan, including ECoach and SLAM,
* the University of Maryland, Baltimore County's Check My Activity, and
* Indiana University's FLAGS early warning system and e-course advising initiatives.
Readers will glean from these experiences, as well as from a national project in Australia on innovative approaches to enhancing the student experience, an informed description of the role of feedback within these technologies, a thorough discussion of ethical and social justice issues related to the use of learning analytics, and an explanation of why higher education institutions should approach such initiatives cautiously, intentionally, and collaboratively.
This is the 179th volume of the Jossey-Bass quarterly report series New Directions for Higher Education. Addressed to presidents, vice presidents, deans, and other higher education decision makers on all kinds of campuses, it provides timely information and authoritative advice about major issues and administrative problems confronting every institution.
1st edition
  • English
  • Newark, USA
John Wiley & Sons
  • 2.30 MB
978-1-119-44383-4 (ISBN-13)
1119443830 (ISBN-10)
EDITORS' NOTES 5
John Zilvinskis, Victor M. H. Borden
1. An Overview of Learning Analytics 9
John Zilvinskis, James Willis, III, Victor M. H. Borden
2. Incorporating Learning Analytics in the Classroom 19
Candace Thille, Dawn Zimmaro
3. Learning Analytics Across a Statewide System 33
Catherine Buyarski, Jim Murray, Rebecca Torstrick
4. Learner Analytics and Student Success Interventions 43
Matthew D. Pistilli
5. Cultivating Institutional Capacities for Learning Analytics 53
Steven Lonn, Timothy A. McKay, Stephanie D. Teasley
6. Using Analytics to Nudge Student Responsibility for Learning 65
John Fritz
7. Ethics and Justice in Learning Analytics 77
Jeffrey Alan Johnson
8. Learning Analytics as a Counterpart to Surveys of Student Experience 89
Victor M. H. Borden, Hamish Coates
9. Concluding Thoughts 103
John Zilvinskis, Victor M. H. Borden
INDEX 109

1


The purpose of this chapter is to provide administrators and faculty with an understanding of learning analytics and its relationship to existing roles and functions so better institutional decisions can be made about investments and activities related to these technologies.

An Overview of Learning Analytics


John Zilvinskis, James Willis, III, Victor M. H. Borden

Higher education administrators and leaders have a sense that there is some magic going on in other industries and sectors, where people appear to be taking better advantage of the seemingly endless and growing amount of data available to improve sales and target customer support. There is a sense that higher education is far behind in this endeavor, and so leaders are willing to invest significantly in new, expensive technologies. Campus leaders hear about cases where institutions are using these methods and technologies to gain competitive advantage; they are impressed and do not want to be left behind. Yet no one has figured out the magic formula, and even very well-known examples have not been sustained. For example, The Chronicle of Higher Education featured Rio Salado College (Parry, 2011) for its ability to use online student behavior to predict student class performance; however, 1-year retention rates between fall 2014 and fall 2015 were only 38% for full-time students and 26% for part-time students (National Center for Education Statistics, 2017). These rates are among the lowest 2% and lowest 9%, respectively, of public 2-year colleges, which is not surprising for an open-access institution with large online programs, but certainly not industry leading.

The truth is, as documented in this volume, implementing and sustaining learning analytics initiatives is just not that simple, nor does it necessarily result in dramatic improvements. It is not that these new technologies and methods are unhelpful, but rather that they do not address the more complex aspects of higher education, including the incredible diversity and complexity of learning outcomes across the curriculum and institutions' complex organizational arrangements. Even for institutions committed to "data-driven decision making," stakeholders often make the mistake of thinking the data will tell them what to do, as opposed to realizing that data themselves do not drive decision making; it is the interpretation of the data that creates change. At the center of these issues are three important questions: (1) What are learning analytics? (2) Who should be involved in these projects? (3) What are some important, broad principles institutions should consider when developing learning analytics? The purpose of this chapter, and indeed the remainder of this volume, is to provide some answers to these important questions.

Learning Analytics Defined


Though the use of data in higher education for operational and decision support is certainly not new, the computational processes involved in modeling those data for prediction, intervention, and tracking have expanded exponentially in recent years. Consequently, there is a lack of maturity and consistency in the terms, naming conventions, and standards related to learning analytics. It is not uncommon to attend a higher education conference panel or sit in on a campus meeting where attendees use "analysis" and "analytics" interchangeably. Certainly, formal definitions and distinctions have been asserted (for example, between educational data mining and learning analytics), but there are notable differences within the formal literature, and the usage of terms in practice is much looser and less consistent. In their article, "Academic Analytics: A New Tool for a New Era," Campbell, DeBlois, and Oblinger (2007) wrote, "Analytics marries large datasets, statistical techniques, and predictive modeling" (p. 42). Learning analytics uses both traditional data (from student records, surveys, and so on) and new types of data emanating from transactional systems such as learning management systems, online course platforms, and social networks. However, there are finer nuances in defining various types of analytics as one begins to consider the domain and types of learning analytics projects. For the purposes of this chapter, we define learning analytics as the process of using live data to predict student success, promote intervention or support based on those predictions, and monitor the influence of that action.
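To make that three-part definition concrete, the sketch below walks through a minimal predict-intervene-monitor loop in Python. It is purely illustrative and not drawn from this volume: the activity measures, column names, risk threshold, and the choice of a logistic regression from scikit-learn are all assumptions, and a real implementation would rest on far richer data, validation, and the collaborative governance discussed later in this chapter.

```python
# A minimal, hypothetical sketch of the three-step process described above:
# predict student success from activity data, flag students for intervention,
# and monitor outcomes afterward. All data and thresholds are invented.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: one row per student, with LMS activity
# counts and a known end-of-term outcome (1 = passed, 0 = did not pass).
historical = pd.DataFrame({
    "logins_per_week":  [1, 5, 7, 2, 6, 0, 4, 8],
    "assignments_done": [2, 9, 10, 3, 8, 1, 7, 10],
    "forum_posts":      [0, 4, 6, 1, 5, 0, 3, 7],
    "passed":           [0, 1, 1, 0, 1, 0, 1, 1],
})
features = ["logins_per_week", "assignments_done", "forum_posts"]

# 1. Predict: fit a simple model on the historical records.
model = LogisticRegression()
model.fit(historical[features], historical["passed"])

# Live data for the current term (outcomes not yet known).
current = pd.DataFrame({
    "student_id":       ["A17", "B02", "C33"],
    "logins_per_week":  [1, 6, 3],
    "assignments_done": [2, 9, 5],
    "forum_posts":      [0, 5, 1],
})
current["p_success"] = model.predict_proba(current[features])[:, 1]

# 2. Intervene: flag students whose predicted likelihood of success falls
# below a locally chosen threshold (0.5 here, purely illustrative).
at_risk = current[current["p_success"] < 0.5]
print("Refer to advisor:", list(at_risk["student_id"]))

# 3. Monitor: once the term ends, compare outcomes of flagged students who
# received support against those who did not, then refit the model.
```

In practice the "monitor" step feeds results back into both the model and the design of the intervention, which is where the cross-unit collaboration described in the sections that follow becomes essential.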

Distinguishing Learning Analytics Projects by Level of Analysis


There are two separate but interrelated domains for this work: the work that engages faculty in improving student learning within individual classes and programmatic curriculum (learning analytics), and the more holistic student support applications that do not focus directly on learning but more on student progress, persistence, and completion (academic analytics, according to Long and Siemens, 2011). In their conceptual framework of analytics in higher education, van Barneveld, Arnold, and Campbell (2012) add that predictive analytics can be used in both of these domains to draw upon historical data to predict future outcomes that can guide intervention. Later in this volume, Pistilli and Wagner describe learner analytics as using historical data from student records to predict the outcomes of current students, with the aim of intervening for students who are predicted to have a low likelihood of success. Already in this paragraph, we've drawn distinctions between learning analytics, academic analytics, predictive analytics, and learner analytics, relating to the level of analysis (student vs. class vs. curriculum), chronological characteristics of predictors (historical vs. contemporaneous), and type of outcomes (learning/behavior/development vs. retention/graduation); therefore, it is easy to see why newcomers find themselves uncertain when trying to specify the type of analytic project they wish to implement.

Distinguishing Learning Analytics Projects by Intended User


Learning analytics projects are often distinguished by the intended user or recipient of information. Previous work in analytics has led to the creation of tools to assist faculty with examining data from individual classes (Campbell et al., 2007). Chapters 2 and 5 in this volume have this focus. Other learning analytics projects primarily inform the work of academic advisors and other support staff with student guidance and coaching (Aguilar, Lonn, & Teasley, 2014; Barber & Sharkey, 2012). Chapters 3 and 4 fall into this category. Still other projects provide data directly to learners (Baker, 2007). Chapter 6 considers this target audience, at least in part. Analytics are also used to provide senior managers with management information related to teaching, learning, and student success (Buerck, 2014), which is also noted in Chapters 3 and 8. Regardless of the intended user, the chapters of this volume demonstrate that numerous campus partners must collaborate to implement a successful learning analytics project.

Relationship to Existing Roles and Functions


Implementation of learning analytics projects requires not only an understanding of the domain or the type of analytic project, but also an understanding of the amount of work and types of expertise needed. It is not feasible for even the most dedicated educators to create their own analytic systems. In addition to involving faculty as domain experts, information providers, and end-users, analytic project development often relies on collaboration among staff from several support units, such as centers for teaching and learning, student support services, institutional research, and information technology. Indeed, the development of learning analytics projects can change how these units routinely operate by providing opportunities for collaboration with new partners and new end-users. This may also lead to changes in the types of skillsets required for staff in these units.

Centers for Teaching and Learning


Staff within a center for teaching and learning (CTL) or similarly named unit typically provide faculty development programs related to curriculum and course design and assessment. These units promote communities of practice around specific pedagogies (for example, service learning) and provide support for using new educational tools and technologies. CTL staff provide required expertise in instructional and pedagogical design for developing new capacity for learning analytics among faculty (Borden, Guan, & Zilvinskis, 2014).

Student Support Services


Educators serving in academic advising, student affairs, and supplemental instruction use traditional data sources (for example, advising records, student needs assessments, registrar information) to determine the most effective interventions to recommend to individual students, such as counseling, tutoring, and peer mentoring. These staff members typically adopt a holistic view of student life and development across the curriculum, cocurriculum, and extracurriculum. Student advising and support providers often have the most extensive experience in dealing with students directly as they formulate academic and life goals (Drake, 2011). When working in learning analytics, the expertise of these educators provides perspective regarding the complex relationship between in-class and out-of-class demands on student life, while also supplying an understanding of how learning analytics...

File format: EPUB
Copy protection: Adobe DRM (Digital Rights Management)

System requirements:

Computer (Windows; MacOS X; Linux): Install the free Adobe Digital Editions software before downloading (see e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see e-book help).

E-book readers: Bookeen, Kobo, Pocketbook, Sony, Tolino, and many others (not Kindle).

The EPUB format is well suited to novels and nonfiction, that is, to "flowing" text without a complex layout. On e-readers and smartphones, line and page breaks adjust automatically to the small display. Adobe DRM is a "hard" form of copy protection: if the necessary requirements are not met, the e-book cannot be opened, so prepare your reading hardware before downloading.

Further information is available in our e-book help.


Download (available immediately)

€23.99
incl. 19% VAT
Download / single-user license
ePUB with Adobe DRM
see system requirements
