
Judy Perry - MIT Scheller Teacher Education Program (STEP) Lab

“Informal Learning Using Augmented Reality Games”: How can augmented reality (AR) games played on smartphones extend informal educational opportunities? What challenges arise when you put digital learning experiences in typically low-tech environments? See what happened during recent pilot projects at zoos, nature centers, and living history museums that used the MIT STEP Lab’s TaleBlazer AR platform.


Presentation Transcript


1. virtual bridges: mobile augmented reality games in informal spaces. Serious Play Conference, Aug. 21, 2013. Judy Perry, MIT STEP Lab; Eric Klopfer, Director, MIT STEP Lab

2. How do we craft powerful experiences in real places? How do these experiences foster deep learning? Engagement? (Photos: Town of Lexington website; PolarBearsInternational.org)

3. pieces of informal learning? • Outside the “formal” classroom • Often playful • Self-directed or facilitated, rather than “teacher led” • Frequently involve the participant’s choices • Voluntary (“I can leave!”) • Ad hoc (no set times)

4. informal learning? • How to engage learners informally? • Right level of structure? Role of technology?

5. it’s not easy… • An “explainer” is not always available • Free exploration isn’t always optimal or appealing • Visitors want/need more scaffolding… • Physical locations: encourage engagement with places “off the beaten path” • So how do you encourage learning? (Image courtesy of Red Butte)

6. (Diagram: visitor(s) connected to a place* through a “bridging experience”. *people, objects, structures, landscape)

7. Effective “bridges” can be hard to build… We’ll share some of our experiences. The challenge is to build interesting* bridges. *What do effective bridges look like? *How do the organization and its visitors benefit?

8. WHAT TOOLS MIGHT BE USEFUL? Mobile devices offer: portability, social interactivity, context sensitivity, connectivity, individuality

9. Informal learning with mobile devices? BUT WHAT DOES IT LOOK LIKE?

10. What I don’t mean • Flash cards/thin quiz on a mobile phone • This ≠ “Read the sign and answer a multiple choice question” • This IS a toe in the water, but… I think we can do a lot better...

11. We learn from games all the time… • Playful, yet challenging (Papert’s “hard fun”) • Pace is typically set by the player • Interesting choices • Meaningful feedback • It’s just good PEDAGOGY!

12. what games/gameplay offer • Often highly social • Fun, playful • Acceptable to explore and to “fail” • Able to try on identities (role-playing)

13. One approach to location-based education… augmented reality (AR). “Location-aware digital overlay of information in a real-world context.” Typically played on a smartphone or other GPS-equipped mobile computer. (Diagram: learning at the intersection of real-world context and games/sims.)
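As a rough illustration of the idea on this slide, the sketch below shows how a location-aware game might trigger virtual content from a GPS fix. The agent names, coordinates, and trigger radii are invented for the example; this is a minimal sketch, not the TaleBlazer implementation.

```python
import math

# Hypothetical "agents": virtual characters or objects pinned to real-world
# coordinates, which activate when the player walks within their radius.
AGENTS = [
    {"name": "Park Ranger", "lat": 42.3601, "lon": -71.0942, "radius_m": 20},
    {"name": "Water Sample", "lat": 42.3605, "lon": -71.0950, "radius_m": 15},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters (haversine)."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def agents_in_range(player_lat, player_lon):
    """Return the agents whose trigger radius contains the player's position."""
    return [a for a in AGENTS
            if distance_m(player_lat, player_lon, a["lat"], a["lon"]) <= a["radius_m"]]

# Each time the device reports a new GPS fix, the game checks which virtual
# content should be overlaid on the player's real-world location.
for agent in agents_in_range(42.36015, -71.09425):
    print(f"Triggered: {agent['name']}")
```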

14. AR toolkits • Non-programmers and novice programmers can make AR games using…

15. A short video…

16. Gaminess • What features are important to structure games? – Interesting decisions (Sid Meier) – Consequences to decisions (+/- value) – Clearly defined goals (rules/constraints) – Visible, measurable feedback (quantifiable outcome) – Underlying model/system (coherent system of rules) (Spectrum diagram, from “little gaminess” to “lots of gaminess”: movies, dolls, books, scavenger hunt, The Sims, Risk, WoW)
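To make the checklist concrete, here is one purely illustrative way an activity could be scored against the five features above. The feature names come from the slide; the 0–2 scale, the threshold, and the example scores are assumptions, not the speakers' instrument.

```python
# Score each "gaminess" feature from 0 (absent) to 2 (strong) and sum.
FEATURES = [
    "interesting decisions",
    "consequences to decisions",
    "clearly defined goals",
    "visible measurable feedback",
    "underlying model/system",
]

def gaminess(scores):
    """Sum per-feature scores into a rough position on the gaminess spectrum."""
    total = sum(scores.get(f, 0) for f in FEATURES)
    return total, "lots of gaminess" if total >= 7 else "little gaminess"

# Illustrative placements, echoing the spectrum in the slide.
print(gaminess({"clearly defined goals": 2, "visible measurable feedback": 1}))  # scavenger hunt
print(gaminess({f: 2 for f in FEATURES}))                                        # e.g. Risk, WoW
```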

17. AR: Environmental Detectives • First example, part of G2T • “Environmental Detectives” – Players are briefed about a rash of local health problems linked to the environment – They need to determine the source of the pollution by drilling sampling wells and interviewing virtual witnesses

18. AR: Environmental Detectives • First group of students (MIT) – Tried to plan strategies for sampling – Competed with each other at times and collaborated at others – Evaluated incoming information – Wanted to come up with the “best” solution – Used previous experience to optimize in the face of constraints • Second group of students (HS field trip) – Tried to collect as many points as they could – Planned their route from one point to the next based on proximity – Wanted to complete the experience

19. Gaminess • For the MIT students… – Interesting decisions: ?? – Consequences to decisions (+/-): ✓ – Clearly defined goals: ✓ – Visible, measurable feedback: ✗ – Underlying model/system: ✗ (Same gaminess spectrum diagram as slide 16)

20. Gaminess • For the HS students… – Interesting decisions: ✓ – Consequences to decisions (+/-): ✗ – Clearly defined goals: ✗ – Visible, measurable feedback: ✗ – Underlying model/system: ✗ (Same gaminess spectrum diagram as slide 16)

21. *What do interesting bridges look like? • Audience’s perspective (expectations, marketing)

22. The Joy of Gaming?

23. The Joy of Gaming (Photos: Phillip Toledano’s photo essay of gamers)

24. Ecology of Play and Work • A game helps structure an experience, and ideally includes both open-ended play and the structure and support needed for learning. (Diagram: work/learning/structure balanced against play/fun/open-ended; c/o Scot Osterweil)

25. Ecology of Play and Work • A game helps structure that experience, and ideally includes both open-ended play and the structure and support needed for learning. (Diagram: learning/structure balanced against fun/play)

26. The Fun of Structure • Structured, goal-oriented, feedback-driven activity can be fun. In games we willingly submit to arbitrary rules and structures in pursuit of mastery, but only if we can continue to be playful.

27. The Fun of Structure • Structured, goal-oriented, feedback-driven activity can be fun. In games we willingly submit to arbitrary rules and structures in pursuit of mastery, but only if we can continue to be playful.

28. Moving Indoors: Mystery @ the Museum • Indoor game played at the Boston Museum of Science • Used 802.11 for positioning • Defining roles to enhance collaboration • Introducing an element of time to make it feel more like a game • Solving a mystery using scientific information from the museum
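The slide mentions 802.11 positioning without describing how it was implemented. A common approach is Wi-Fi fingerprinting, sketched below; the room names, access points, and signal values are made up for illustration and are not the game's actual positioning code.

```python
# Offline survey: median RSSI (dBm) per access point, recorded in each room.
FINGERPRINTS = {
    "Dinosaur Hall":   {"ap-1": -45, "ap-2": -70, "ap-3": -80},
    "Electricity Lab": {"ap-1": -72, "ap-2": -48, "ap-3": -65},
    "Space Gallery":   {"ap-1": -81, "ap-2": -66, "ap-3": -50},
}

def locate(scan):
    """Return the surveyed room whose fingerprint is closest to the live scan."""
    def dist(fp):
        # Euclidean distance in signal space; missing APs count as very weak.
        return sum((scan.get(ap, -100) - rssi) ** 2 for ap, rssi in fp.items())
    return min(FINGERPRINTS, key=lambda room: dist(FINGERPRINTS[room]))

# A live scan from the handheld device maps to the most similar room,
# which the game can then use to show room-specific clues and characters.
print(locate({"ap-1": -47, "ap-2": -69, "ap-3": -83}))  # -> "Dinosaur Hall"
```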

29. Location Information
• All screens tell you what room you’re currently in.
• Click on items in the room to select them for viewing or picking up.
• Click on people in the room to select them for interviewing or showing objects to.
• To view an item in the room, click on the item and then click the View button.
• To pick up an item in the room, click on the item and then click the Pick Up button.
• To show an item that you already have to a virtual character, click on the person, click Show, and then choose the item that you want to show.
• To interview a virtual character, click on the person and then click Interview.

30. Context and Tools: Communicate, Analyze, Decide, Investigate

31. Mystery @ The Museum: Game Play • Fostering collaboration through roles • Parents and kids collaborating • Using contextual information • Collecting virtual samples • “[The game] was fun. This was the longest I’ve spent with my [teenage] son without arguing with him...” (a parent)

32. *What do interesting bridges look like? • Audience’s perspective (expectations, marketing) • Experiment with collaboration

33. What is POSIT? • (Developing) Public Opinion of Science using Information Technologies • POSIT is a wireless client-server system • Aimed at creating authentic experiences that explore complex issues in science, technology, and society • Built on existing AR infrastructure developed in the STEP Lab to combine location-based “augmented reality” experiences with opinion-dynamics scenarios using Windows Mobile devices • POSIT seeks to engage people in STS issues.

34. 1st POSIT Game Overview • The game is focused on a single yes/no policy question (fictionalized), e.g., “Should we build a biohazard level 4 research facility in our community?”
– Briefing: a potential biohazard facility in Boston
– Roles: playing realistic roles, from scientist to resident
– Initial Opinion: opinions “in role” are registered
– Collecting Data: players collect information from virtual characters and real artifacts/places
– Sharing Opinions: players share the information they have collected to convince others of their [character’s] point of view
– Influencing Others and Changing Opinions: influence key individuals to sway the vote
– Final Decision: voting
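A minimal sketch of the flow listed above: players register an in-role opinion, collect evidence, share it to sway other players, and finally vote. The role names and the simple influence rule are illustrative assumptions, not POSIT's actual opinion-dynamics model.

```python
from collections import Counter

# In-role starting opinions on the yes/no policy question.
players = {
    "scientist":        {"opinion": "yes", "evidence": []},
    "concerned parent": {"opinion": "no",  "evidence": []},
    "biotech worker":   {"opinion": "yes", "evidence": []},
    "resident":         {"opinion": "no",  "evidence": []},
}

def collect(role, item):
    """A player gathers a piece of information from a virtual character or place."""
    players[role]["evidence"].append(item)

def share(sender, receiver, item, sways_to):
    """Sharing collected evidence may change the receiver's in-role opinion."""
    if item in players[sender]["evidence"]:
        players[receiver]["opinion"] = sways_to

def final_vote():
    """Tally the closing vote on the policy question."""
    return Counter(p["opinion"] for p in players.values())

collect("scientist", "safety record of level-4 facilities")
share("scientist", "resident", "safety record of level-4 facilities", sways_to="yes")
print(final_vote())  # Counter({'yes': 3, 'no': 1})
```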

35. POSIT Walkthrough • Players are assigned roles and receive a briefing – A decision must be reached on whether or not to build the research facility – Every role has a story (e.g., out-of-work biotech worker, concerned parent) • Register initial opinions • Team up with allies and target the opposition

36. POSIT Walkthrough • Players… • continue to collect information and change their own opinions accordingly • interact with each other and use their evidence to sway players their way • constantly monitor the way the group is leaning and target particular players with the information that they collect

37. AR Tracking Progress

38. POSIT: End Game • In the end, players vote on the issue at hand (based on their final opinions). • Following the game, the facilitator leads a discussion around:
– The real controversy on which the game is based
– The experience of role-playing different opinions, and how it relates to each player’s personal opinion
– Evidence, scientific arguments, and persuasiveness
– Factors that caused opinions to change over time
– A histogram of the group’s opinions at different times during the game
• This was a lot, but players liked it!
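The debrief histogram mentioned above could be produced from opinion counts logged at a few checkpoints during the game; the sketch below uses invented checkpoint labels and numbers purely to show the idea.

```python
# Hypothetical yes/no splits captured at three points in a session.
SNAPSHOTS = [
    ("initial opinions", {"yes": 5, "no": 7}),
    ("after data collection", {"yes": 7, "no": 5}),
    ("final vote", {"yes": 8, "no": 4}),
]

# Print a simple text histogram for the facilitated discussion.
for label, counts in SNAPSHOTS:
    print(label)
    for choice, n in counts.items():
        print(f"  {choice:>3}: {'#' * n} ({n})")
```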

39. *What do interesting bridges look like? • Audience’s perspective (expectations, marketing) • Experiment with collaboration • Deeply engaging, challenging “hard fun”

40. Timelab 2100 • It is the year 2100 and climate change has spun out of control • Head back to 2012 to make small changes to the past • Location-specific information, e.g. – By the river: this location is now under water – By the bus stop: consider incentives to increase use • Also includes face-to-face discussion

41. Zoo Scene Investigators (ZSI) • During the AR game ZSI, students… • Play during field trips to the Columbus Zoo & Aquarium • Navigate the Asia Quest area to gather evidence about a mysterious nighttime intruder • Learn about the illegal wildlife trade • Collaborate across roles to share information • Observe real animals and exhibits • Gather virtual evidence and consult virtual experts

42. *What do interesting bridges look like? • Audience’s perspective (expectations, marketing) • Experiment with collaboration • Deeply engaging, challenging “hard fun” • Leverage the physical environment • Connections and contrasts between exhibits/spaces

43. Old Sturbridge Village • Dollars & Sense, an economic history game • Living history museum (costumed interpreters, buildings, animals, etc.) • Take the perspective of an 1830s rural New England farm family • Role-playing game (factory work, investments, buying property)

44. *What do interesting bridges look like? • Audience’s perspective (expectations, marketing) • Experiment with collaboration • Deeply engaging, challenging “hard fun” • Leverage the physical environment • Connections and contrasts between exhibits/spaces • Choices, consequences • Participation is the reward (“no one fails museums”)

45. Moving beyond playing games… TO MAKING GAMES. (Diagram: learning at the intersection of real-world context and games/sims.)

46. CSI (Community Science Investigators): Making AR Games 1.0 • Collaboration with Missouri Botanical Garden • Funded through NSF ITEST Award #0833663

47. iCSI (Informal Community Science Investigators) • 3-year NSF-funded partnership between Missouri Botanical Garden (MO), MIT (MA), Columbus Zoo (OH), San Diego Zoo (CA), and Red Butte Botanical Gardens (UT)
1. Engage informal visitors in playing AR games
2. Tweens/teens making AR games (summer camps)
– Leverage AR games to promote STEM knowledge/engagement, showcase research, foster community engagement (citizen science), and develop 21st-century skills
– Create a replicable model, best practices, and tools for other informal learning institutions
• Funded by NSF ISE Grant #1223407

48. Send out the pioneers! So as we continue to explore this space: • Best practices (game design) • Powerful tools (making powerful games) • Research findings • Fall/Winter 2013: TaleBlazer will be available, coming to an App Store or Google Play near you! • If you’re a “pioneer” (esp. Android), please talk with me…

49. Thanks! QUESTIONS?

50. Thanks! Judy Perry, MIT STEP Lab <jperry@mit.edu> • http://TaleBlazer.org • http://education.mit.edu
• Many thanks to Eric Klopfer (Director, MIT STEP Lab), Lisa Stump & the TaleBlazer development team (MIT), Josh Sheldon (MIT/STEP); Renata Pomponi & Drew Buckley (Mass Audubon); Rhys Simmons and his team (OSV); Columbus Zoo & Aquarium
• Funded in part by NSF Grants #0833663 and #1223407, and by the Columbus Zoo & Aquarium.
