Asking Research Questions



Presentation Transcript


  1. Asking Research Questions
  Sally Fincher
  Building Research in Australasian Computing Education: Second Workshop, 26th-29th January 2005: Sydney

  2. Questions
  Questions do not spring fully-formed
  • Undergo iterative refinement
  Questions do not stand alone
  • It’s not enough to ask
    • implied in the asking is the expectation of an answer
    • you must want to know (evidence)
    • you must be able to answer (technique)

  3. 4 Loops: Clarity, Evidence, Technique, Significance

  4. 4 Loops: Clarity, Evidence, Technique, Significance

  5. Loop: Clarity
  • Specificity – is it vague?
  • Language – is it understandable to your community (jargon, buzzwords)?
  • Recognizable as a question – is it a sentence?
  • Doable – can you imagine how you might study it?

  6. Loop: Clarity (i)
  • Specificity – is it vague?
    “What makes a good programmer?”
  • Language – is it understandable to your community (jargon, buzzwords)?
    “using the CoP framework can we explain the learning environment”

  7. Loop: Clarity (ii)
  • Language – is it understandable to your community (jargon, buzzwords)?
    “We are planning a conference in December 2005, to explore alternative approaches to educational research. What we mean by 'alternative' is still rather unclear. Our starting point is a feeling that there is a bigger variety of practices being used by researchers and practitioners to understand learning and teaching than is apparent in the literature, and that some of these approaches could be very valuable if they were more widely known.”
    “I certainly think the conference idea is a good one. You seem to be mainly thinking about 'alternatives' in terms of empirical methodologies. But wouldn't it be worth extending to alternatives in terms of ontological and epistemological perspectives? Allied to that would be the methods of conceptual analysis.”
    “What I would appreciate is a translation for uninitiated people like myself. I tripped up on 'ontological and epistemological perspectives? Allied to that would be the methods of conceptual analysis.' I think the rest of Leonard's message rings some useful bells, and I would like to hear more, but in language I can understand.”

  8. Loop: Clarity (iii)
  • Recognizable as a question – is it a sentence?
    “whether handset usage predicts final outcome”
  • Doable – can you imagine how you might study it?
    “When teaching students their first course on programming, should the emphasis be on problem solving or on the language?”
    “As programmers mature from novice to expert, do they pass through stages where they read and reason about code in different ways? If so, in what ways do the reasoning strategies differ?”

  9. 4 Loops: Clarity, Evidence, Technique, Significance

  10. Loop: Evidence
  Empirical investigation
  • What makes you believe your claims?
  • What would you need to convince a colleague – in AND out of “the choir”?
  • What would a sufficient answer look like?
  • What would an unsatisfactory claim look like?
  • What might contradictory examples look like?

  11. Loop: Evidence (i)
  Empirical investigation
  • What makes you believe your claims?
  • What would you need to convince a colleague – in AND out of “the choir”?
    “Is success in team-based work related to students’ epistemological beliefs (EB)?”
    “given the approach is phenomenography it's interview transcripts talking for about an hour with developers”

  12. Loop: Evidence (ii)
  • What would a sufficient answer look like?
  • What would an unsatisfactory claim look like?
  • What might contradictory examples look like?
    “What kinds of learning and learning contexts is peer assessment suited to?”
    “Can explicit problem solving skill instruction be incorporated into a programming curriculum at the cost of time from practical or other explicit instruction?”

  13. 4 Loops: Clarity, Evidence, Technique, Significance

  14. Loop: Technique
  • What technique(s) might produce the evidence you need?
  • How do you know?
  • What’s involved with using this technique?
    • Assessing “costs” – opportunity, time/resources, available knowledge
  • Do you know how to do it?

  15. Loop: Technique (i)
  • What technique(s) might produce the evidence you need?
  • How do you know?
    “I am interested in the way experts and novices use 'programming tools' to represent their emerging conceptualisations of a programming task.”
  • What’s involved with using this technique?
    • Assessing “costs” – opportunity, time/resources, available knowledge
  • Do you know how to do it?
    • phenomenography, CoP analysis etc.

  16. Loop: Technique (ii)
  • What’s involved with using this technique?
    • Assessing “costs” – opportunity, time/resources, available knowledge
    “Do novice programmers exposed to explicit instruction in problem solving skills produce problem solutions: faster; with greater accuracy; and with more consistency (between solutions)?”

  17. 4 Loops: Clarity, Evidence, Technique, Significance

  18. Loop: Significance (aka “so what?”)
  For whom (audience)?
  • To you, others?
  • For both computing AND education?
  For a disciplinary community
  • Relevance – why might they be interested?
  • Justification – what is the contribution?
  • Context – how does this fit in the known landscape?
  • “Edgy” – does it open up new avenues?

  19. Questions in context (reprise)
  • Pose significant questions that can be investigated empirically
  • Link research to relevant theory
  • Use methods that permit direct investigation
  • Provide a coherent and explicit chain of reasoning
  • Replicate and generalize across studies
  • Disclose research to encourage professional scrutiny and critique

  20. Remember …
  You can always rely on the basics:
  • Why am I doing this?
  • What am I doing?
  • Is this doable?
