
Building Synergistic Relationships between Teaching and Research








  1. Building Synergistic Relationships between Teaching and Research Randall Groth, Ph.D. Salisbury University Teaching and Learning Conference February 21, 2014

  2. What this talk is not about (though these are also all good things…) • Doing research and then discussing it in class lectures • Reading research about a teaching strategy and then applying it (though this could be a first step…) • Having students do research (in-class or as a special undergraduate research project)

  3. Research apart from teaching Research becomes another burdensome task or something you have to “escape” from teaching to do.

  4. Research as a part of teaching Teaching and research help improve one another!

  5. Products of the synergistic cycle: • Evidence of effective teaching for personal portfolio • Evidence of scholarly activity for personal portfolio • Systematic reflection on teaching and learning • Positive influence on SU curriculum • Positive influence on SU student learning • National and international influence on teaching and learning in your field

  6. Some Preliminaries • IRB approval • Dig into related literature • http://scholar.google.com • http://www.eric.ed.gov • Do what is best for the student first – the most interesting studies emerge when this principle is kept in the forefront

  7. Idea 1: Task-Based Research This is not quite what I mean by task-based research… …but it does bring up a foundational element of task-based research: not taking it for granted that students are well-acquainted with fundamental ideas in your field – even if their academic credentials suggest it. https://www.youtube.com/watch?v=zOw8aiyMUAU

  8. Intriguing tasks do not have to be complex, but should probe students’ fundamental ideas deeply This is what I mean by “task-based research…” …although respondents in most studies of this nature do not have their identities associated with their responses http://www.youtube.com/watch?v=TrXaQu_qGeo

  9. What makes something task-based research rather than just JayWalking? Bloom’s Taxonomy JayWalking questions

  10. Tasks associated with higher levels in Bloom’s Taxonomy • Write a metaphor for the statistical idea of sample. Explain how and why your metaphor works. Explain how and why your metaphor does not work (Groth & Bergner, 2005). • How are the statistical concepts of mean, median, and mode different? How are they similar? (Groth & Bergner, 2006) • Construct two different sets of data that have the same mean. The data sets should have different numbers of values. Compute the SAD and MAD for each set of data. Show your work. Explain what the SAD and MAD tell you about the sets of data (Groth, in press)
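  The SAD/MAD task in the last bullet can be illustrated with a short computation. A minimal sketch, with the data sets and function names my own (not taken from the cited task):

```python
def sad(data):
    """Sum of absolute deviations (SAD) from the mean."""
    m = sum(data) / len(data)
    return sum(abs(x - m) for x in data)

def mad(data):
    """Mean absolute deviation (MAD): SAD averaged over the number of values."""
    return sad(data) / len(data)

# Two data sets with the same mean (5) but different numbers of values.
a = [2, 4, 6, 8]
b = [1, 5, 9]

print(sad(a), mad(a))  # 8.0 2.0
print(sad(b), mad(b))  # 8.0 and roughly 2.67
```

  Here the two sets happen to share the same SAD but have different MADs, which is exactly the kind of contrast the task asks students to explain.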

  11. Why is task-based research valuable to teaching? • Keeps you mindful of asking questions fitting higher levels of Bloom’s Taxonomy. • As formative assessment: It alerts you to prevalent misconceptions that may need extra attention as you teach. • As summative assessment: Used as pre- and post-assessments, tasks provide a degree of feedback about how much students have learned during a course. • As a formative and summative assessment for the broader universe of instructors also teaching your discipline under similar circumstances.

  12. Guiding question for task design • What broad concepts in your discipline are as fundamental as the earth/sun relationship, yet may be problematic for students? • This is not an easy question – many times students cover up misconceptions with procedural knowledge or by mimicking the instructor (recall training cited by students in “Private Universe” clip).

  13. 5-minute brainstorming session- Talk with those around you: What is an intriguing task from your discipline that might serve as the foundation for task-based research? I will check in after a few minutes to see if anyone is ready to share.

  14. A tool for qualitative analysis of task responses: The SOLO Taxonomy (Biggs & Collis, 1982, 1991) Even with a well-defined structure, multiple coders help
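  One common way to quantify how well multiple coders agree on SOLO levels is an inter-rater statistic such as Cohen's kappa. The slide does not name a specific statistic, so this is a minimal sketch with hypothetical codes (U = unistructural, M = multistructural, R = relational):

```python
from collections import Counter

def cohens_kappa(codes1, codes2):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    n = len(codes1)
    observed = sum(c1 == c2 for c1, c2 in zip(codes1, codes2)) / n
    freq1, freq2 = Counter(codes1), Counter(codes2)
    expected = sum(freq1[c] * freq2[c] for c in freq1) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical SOLO codes assigned by two raters to six task responses.
coder1 = ["U", "M", "R", "U", "M", "M"]
coder2 = ["U", "M", "R", "M", "M", "M"]
print(round(cohens_kappa(coder1, coder2), 3))  # 0.714
```

  Low kappa values flag categories whose definitions the coding team needs to renegotiate before analyzing the full set of responses.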

  15. More tools for qualitative analysis

  16. Idea 2: Use online conversations as a means to gather research data

  17. “Listen in” (with students’ permission) to MyClasses conversations about… • Required class readings • One another’s work • A complex task • A case analysis • Professional standards • Research pertinent to their field of study

  18. Sample complex task to spark rich conversation and debate This task was widely used in individual task-based research, and it proved to be a great item for collective conversation.

  19. How do you write up this research? 1. Engage in teaching cycle 2. Produce a narrative describing all elements of the cycle 3. Draw implications for the broader field

  20. Where can you publish research like this? • Play with keyword searches related to your research on the following websites: • http://scholar.google.com • http://www.eric.ed.gov • Note titles of journals you find • Google the websites of the journals • Read sample articles from the journals and guidelines on style, length, etc. • Pick one journal and submit! Don’t be discouraged by rejections!

  21. SU Faculty Handbook Excerpt Seems like a fairly typical handbook statement

  22. Have tenure-track faculty in the U.S. done a good job providing "evidence of effective teaching"?

  23. Source: http://www.insidehighered.com/news/2013/01/22/michigan-lets-community-colleges-issue-four-year-degrees-amid-controversy

  24. Conjecture: To provide policy makers and the general public convincing evidence of “exceptional teaching” we need to show more directly how our teaching impacts student learning

  25. Bull's-eye: assessment of student learning. Surrounding evidence sources: course syllabi, qualitative tasks, peer evaluations, quantitative measures of learning, student evaluations

  26. SU Math 150 Development Example • Statistics course for teachers developed collaboratively with mathematics and education departments. • Students’ learning assessed with a pre/post-design: Statistical Knowledge for Teaching Test • If feasible, establish comparison groups for firmer evidence of causation. • Results provide evidence of teaching effectiveness directly linked to student learning
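  A pre/post design of this kind is often summarized with per-student gain scores and a paired t statistic. A minimal sketch with invented scores (not data from the Statistical Knowledge for Teaching Test):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic on per-student gain scores (post - pre)."""
    gains = [after - before for before, after in zip(pre, post)]
    return mean(gains) / (stdev(gains) / math.sqrt(len(gains)))

# Hypothetical pre/post test scores for five students.
pre  = [10, 12, 11, 13, 9]
post = [14, 15, 13, 16, 12]
print(round(paired_t(pre, post), 2))  # 9.49
```

  The resulting t is compared against a t distribution with n − 1 degrees of freedom; as the slide notes, adding a comparison group allows an independent-samples analysis and firmer evidence of causation.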

  27. Locating Quantitative Instruments Mental Measurements Yearbook (MMY) with Tests in Print (TIP) • MMY & TIP are produced by the Buros Institute of Mental Measurements at the University of Nebraska. Mental Measurements Yearbook provides users with a comprehensive guide to over 2,000 contemporary testing instruments. Designed for an audience ranging from novice test consumers to experienced ...

  28. Moving from Current Curriculum Committee Conversations to “Dream World” Conversations

  29. Current Conversations • Is the right box checked on the form? • Is the activity code correct? • Are all of the signatures there? • Does the syllabus have a WAC and Fire Safety statement? • Are the forms on the right color paper? • Is this proposal going to make anyone angry?

  30. Dream-World (i.e., research-driven) Conversations • Has this new course been piloted? • If so, what were its effects on student learning of the targeted content? (syllabi do not tell us this) • If not, what theory of learning or related research would lead us to believe the course would be valuable? (so innovative ideas are not choked off) • How will the course be assessed for the purpose of continuous improvement? (not just assessment for the sake of assessment).

  31. Current SU Curriculum Committee Structure Is it possible to have research-based conversations within this structure?

  32. Current SU Curriculum Committee Structure This is not what most new faculty think they are getting themselves into when they set out to improve college courses and teaching!

  33. Dream-World Curriculum Committee Structure • Core group of faculty collaboratively design and assess course (courses might be called “experimental sections,” possibly for several semesters). • Core group summarizes assessment results. • Assessment summary is sent out for peer review to a qualified group of internal (and external?) faculty. • Faculty reviewing the course make suggestions for improvement and issue a decision. • Clerical work (e.g., fire safety statement, “box checking”) is handled in an appropriate office.

  34. Advantages of Dream-World Structure • Faculty are allowed to do what they do best: teach and make judgments about the best methods for teaching students (as opposed to editing fire safety statements and checking for the right color paper). • Evidence of effective teaching is gathered through direct assessment of courses designed. • Collaboration on course design and assessment naturally encourages collaborative scholarship.

  35. Living the dream even if not fully realized: Lesson Study (Hurd & Licciardo-Musso, 2005)

  36. Starting with Lesson Study – Advice from an SU peer institution

  37. Connecting research and teaching has the power to improve student learning at SU and beyond

  38. Related studies
  Groth, R.E., & Bergner, J.A. (2005). Preservice elementary school teachers' metaphors for the concept of statistical sample. Statistics Education Research Journal, 4(2), 27-42.
  Groth, R.E., & Bergner, J.A. (2006). Preservice elementary teachers' conceptual and procedural knowledge of mean, median, and mode. Mathematical Thinking and Learning, 8, 37-63.
  Groth, R.E. (2008). Assessing teachers' discourse about the Pre-K-12 Guidelines for Assessment and Instruction in Statistics Education (GAISE). Statistics Education Research Journal, 7(1), 16-39.
  Groth, R.E., Spickler, D.E., Bergner, J.A., & Bardzell, M.J. (2009). A qualitative approach to assessing technological pedagogical content knowledge. Contemporary Issues in Technology and Teacher Education, 9(4). Available: http://www.citejournal.org/vol9/iss4/mathematics/article1.cfm
  Groth, R.E. (in press). Prospective teachers' procedural and conceptual knowledge of mean absolute deviation. Investigations in Mathematics Learning.
