
Learning Analytics: On the Way to Smart Education




Presentation Transcript


  1. Learning Analytics: On the Way to Smart Education Dr. Griff Richards Athabasca University Moscow, 08 October 2012

  2. Distance Learning in Canada
  • Growing, especially in K-12 public schools
  • 15% of 2008 high school graduates in BC took at least one online course
  • Increase in “blended learning”: online activities for F2F courses
  • Increasing concern for course quality and student retention
  [Map: Athabasca, Alberta, between 49°N and 60°N]

  3. Learning Engagement
  The more a learner interacts with the content, and with their peers about the content, the more likely they are to internalize and remember it. (Richard Snow, 1980)

  4. Learning Engagement
  Learning engagement promotes student retention and academic success. (George Kuh, 2001)
  • Engagement is “action based on internal motivation”
  • Engagement cannot be measured directly
  • Must look for external traces: actions that indicate interest and involvement (NSSE)

  5. CAUTION: Engagement
  • Arum & Roksa (2011): students who studied alone had higher marks than students studying in groups.
  • Engagement that is not task-focused is unlikely to improve learning.

  6. Interaction Equivalency Theory (Garrison & Anderson, 1995)
  The quality of the interaction is more important than its source.
  [Diagram: triangle linking Course Content, Instructor, and Other Learners]

  7. Four places to improve learning
  1. Clearer designs
  2. Collaborative learning activities
  3. Less lecturing, more mentoring
  4. More study skills
  [Diagram: the Course Content / Instructor / Other Learners triangle]

  8. We have insufficient data about which instructional strategies actually work best.

  9. Hybrid or Blended Learning?
  The use of online technologies to augment face-to-face delivery.
  → replaces some F2F “face time” (forums save classroom space)
  → uses the LMS to track assessments and assignments
  → uses technology in class, e.g. pop quizzes, collaboration tools
  [Diagram: continuum from Face-to-Face to Online]

  10. Analytic Measures
  Engagement is inferred from activity.
  • Student interaction with online systems leaves an “exhaust trail” of activity
  • Learning analytics is not a statistical sample; it is all the data for all learners
  • Questions: What patterns should we look for? How should we interpret them?

  11. Example: SNAPP
  • LMS interaction data, e.g. Moodle discussion forums
  • Extract the linked list of who replied to whom
  • Plot the interactions as a star map
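The extraction step on this slide can be sketched in a few lines. This is a minimal stdlib-only illustration with hypothetical reply data, not SNAPP's actual implementation (SNAPP parses the discussion pages in the browser); the names `replies`, `edges`, and the three-message threshold from a later slide are assumptions for the example.

```python
from collections import Counter

# Hypothetical forum export: (post_author, replied_to_author) pairs.
replies = [
    ("alice", "bob"), ("alice", "bob"), ("alice", "carol"),
    ("alice", "dave"), ("bob", "alice"), ("carol", "alice"),
    ("dave", "alice"),
]

# Directed edges of the interaction graph: who replied to whom, how often.
edges = Counter(replies)

# Per-learner message totals, a crude activity proxy for engagement.
messages = Counter(author for author, _ in replies)

# Flag low-engagement learners (three or fewer messages).
low_engagement = [who for who, n in messages.items() if n <= 3]
```

A visualization layer (the "star map") would then draw one node per learner and one arrow per edge, weighted by count.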

  12. SNAPP Visualization

  13. SNAPP shot of a conference: students “engaged”

  14. SNAPP shot of a conference: individuals with lower engagement (three or fewer messages)

  15. Does Activity = Engagement?
  Beer (2010) plotted total LMS interactions against academic grades.
  • Does activity equal engagement?
  • Is Blackboard more engaging than Moodle?

  16. Limitations: Activity Outside the LMS
  • As learners become engaged, they move to more engaging channels: email, Elluminate, Skype
  • This activity is not tracked by the LMS
  • No data available

  17. Interpretation of Analytics
  • Data patterns require investigation
  • Quantitative data requires interpretation → make and test hypotheses → create useful models
  • When we measure something, we risk changing it
  • e.g. if learners know we count hits, they may make meaningless hits to fool the system

  18. Analytics for Learners!
  • The same analytics should be able to provide easy-to-understand information dashboards for students.

  19. Analytics for Learners: Signals
  Arnold (2010) inserted a traffic light on each student’s course page to provide guidance on course success.
  [Traffic light: green = A/B, amber = C, red = D/F]
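The traffic-light rule can be sketched as a simple mapping from a predicted letter grade to a colour. This is an assumption-laden illustration: Purdue's actual Signals model combined grades with effort and history data, and the function name and thresholds here are invented for the example.

```python
def traffic_light(predicted_grade: str) -> str:
    """Map a predicted letter grade to a Signals-style dashboard colour.

    Thresholds are illustrative, inferred from the A/B, C, D/F
    grouping on the slide, not from Purdue's published model.
    """
    if predicted_grade in ("A", "B"):
        return "green"   # on track
    if predicted_grade == "C":
        return "amber"   # at some risk
    return "red"         # D/F: intervene early
```

In a real dashboard this colour would be recomputed as new activity and assessment data arrive, which is what makes the early-warning effect on the next slide possible.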

  20. Dashboard for Faculty
  Arnold (2010) reported a 14% shift from D’s to C’s and B’s. The same number of withdrawals, but W’s occurred earlier, before damaging students’ grade point averages. A 12% increase in students requesting assistance. (N = 220)

  21. Mesmotsa: a dashboard for learners (Richards & Sehboub, 2008)
  [Screenshot: “My webquest data” vs. “Data for my class”]

  22. How to start: Analytics for online and blended learning?
  • Measuring something is the first step
  • “Better to measure something than to measure nothing” (Scriven)
  • We need more data than just page hits: we also need to ask learners about their experience, what worked, and what needs improvement

  23. Dynamic Evaluation Model (Richards & Devries, 2011)
  Analytics at the activity level.
  [Matrix: phases Preparation / Conduct / Reflection crossed with Design / Facilitation / Learning]

  24. Dynamic Evaluation Model
  Timely feedback enables quick fixes.
  [Matrix: Preparation / Conduct / Reflection crossed with Design / Facilitation / Learning]

  25. If Analytics, Then What?
  • If analytics show students are failing, is there a moral obligation to help them?
  • If analytics show a course has weaknesses, is there a business obligation to fix it?
  • If analytics reveal weak instruction, who is responsible for intervening?
  • If analytics are inaccurate, who fixes them?
  • What about privacy and ethics? Who owns the data? Who has the right to see it?

  26. The Analytics Box?
  [Image credit: joannaparypinski.com]

  27. Learning Analytics: On the Way to Smart(er) Education Dr. Griff Richards Athabasca University griff@sfu.ca Moscow, 08 October 2012
