
Evaluation


Presentation Transcript


  1. Evaluation COMP362—Computer-Based Learning I Charalambos Vrasidas http://vrasidas.intercol.edu/comp362 vrasidas.c@intercollege.ac.cy

  2. Systems Approach to ID (flowchart) Phase I - Analysis; Phase II - Design; Phase III - Evaluation. Elements shown: Needs Assessment, Goal Development, Target Population, Limitations, Content Analysis, Collect Material, Task Analysis, Objectives, Design (Interface Design, Flowchart, Storyboard), Strategies, Interactivity, Develop Prototype, Develop, Implementation, Usability Testing, Formative Evaluation and Revisions, Summative Evaluation.

  3. Think!!! • What is evaluation? • Why is it important?

  4. What is evaluation? • Evaluation is defined as the systematic investigation of the worth or merit of an object or program. • It is essential that systematic procedures are used to evaluate the conceptualization, design, implementation, impact, and utility of projects. • Only then can evaluations gather valid and reliable evidence to document a project’s impact, merits, and challenges.

  5. Kinds of evaluation • Formative Evaluation—provides information on how to improve the program. • Summative Evaluation—determines whether to continue the program or eliminate it.

  6. Steps to follow • Identify audience • Establish evaluation goals • Determine data collection procedures • Collect data • Analyze data • Present findings and determine action
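A simple way to keep these six steps together is to record each evaluation as a small plan object whose fields follow the steps in order. The sketch below is illustrative only; the class name, field names, and sample values are assumptions, not part of the course material.

```python
from dataclasses import dataclass, field


@dataclass
class EvaluationPlan:
    audience: str                    # 1. Identify audience
    goals: list[str]                 # 2. Establish evaluation goals
    procedures: list[str]            # 3. Determine data collection procedures
    data: list[dict] = field(default_factory=list)      # 4. Collect data
    findings: list[str] = field(default_factory=list)   # 5. Analyze data
    actions: list[str] = field(default_factory=list)    # 6. Present findings and determine action


plan = EvaluationPlan(
    audience="Instructors and developers of the CBL module",
    goals=["Can learners complete the core tasks without assistance?"],
    procedures=["observed task sessions", "post-session questionnaire"],
)
print(plan.audience, plan.goals)
```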

  7. Usability • Usability is the attribute of a website that allows the user to complete tasks with effectiveness, efficiency, and satisfaction. • In traditional software design, customers pay first and experience usability later. That is, they first buy the software they need and then use it and experience the product’s usability. • On the Web, users experience usability first and pay later. • After one minute … they are gone!

  8. Norman (1995): Things That Make Us Smart Environments and products that are conducive to optimal experience should: • Provide high interactivity. • Provide feedback to the user. • Have specific goals. • Have established procedures. • Motivate the user. • Provide a continuous challenge to users. • Provide a sense of direct engagement with the task. • Provide tools that fit the user and task. • Avoid distractions that interfere with the user’s experience.

  9. What to consider? When choosing the right method for evaluating a website/CD, one should consider: • The nature of the tasks users are asked to perform, • The number of users available for participation in the evaluation, • The budget available for completing the evaluation, • The timeline and deadlines for the project, • The experience, knowledge, and skills possessed by the usability personnel assigned to conduct the evaluation.

  10. Reeves—Evaluating CBL • Pedagogical philosophy: Objectivist—Constructivist • Goal orientation: Sharply focused—Unfocused • Teacher role: Didactic—Facilitative • Program flexibility and learner control • Value of errors: Errorless learning—Learning from experience • Motivation: Intrinsic—Extrinsic • Accommodation of individual differences • Cooperative learning • Cultural sensitivity • http://www.educationau.edu.au/archives/cp/reeves.htm
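Reeves' dimensions are bipolar continua, so one way to record an evaluation is to mark where a program falls between the two poles of each dimension. The sketch below is a rough illustration; the 1-10 position scale and the subset of dimensions rated are assumptions, not Reeves' instrument.

```python
# Dimension names and poles come from the slide; the 1-10 scale is an assumption.
DIMENSIONS = {
    "Pedagogical philosophy": ("Objectivist", "Constructivist"),
    "Goal orientation": ("Sharply focused", "Unfocused"),
    "Teacher role": ("Didactic", "Facilitative"),
    "Value of errors": ("Errorless learning", "Learning from experience"),
    "Motivation": ("Intrinsic", "Extrinsic"),
}


def print_profile(ratings):
    """ratings: position from 1 (left pole) to 10 (right pole) for each rated dimension."""
    for name, value in ratings.items():
        left, right = DIMENSIONS[name]
        bar = "-" * (value - 1) + "*" + "-" * (10 - value)
        print(f"{name}: {left} [{bar}] {right}")


print_profile({"Pedagogical philosophy": 7, "Teacher role": 4, "Motivation": 8})
```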

  11. Major usability attributes According to Nielsen (1993), the five major usability attributes are: • Learnability: The degree to which the system is easy to learn. • Efficiency: The degree to which the system can be used efficiently to increase productivity. • Memorability: The degree to which you can learn the system, remember its functions, and use it with little or no difficulty. • Errors: The degree to which the system has a low error rate and builds in undo and confirmation commands for risky tasks. • Satisfaction: The degree to which the system is subjectively pleasing to the user.
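One way to make these attributes operational in a report is to have an evaluator rate the system on each attribute and summarize the profile. The sketch below assumes a 1-5 rating scale and equal weighting; neither comes from Nielsen.

```python
ATTRIBUTES = ("learnability", "efficiency", "memorability", "errors", "satisfaction")


def usability_profile(ratings):
    """ratings: a score from 1 (poor) to 5 (excellent) for each of the five attributes."""
    missing = [a for a in ATTRIBUTES if a not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    overall = sum(ratings[a] for a in ATTRIBUTES) / len(ATTRIBUTES)
    return {**ratings, "overall": round(overall, 2)}


print(usability_profile({"learnability": 4, "efficiency": 3, "memorability": 4,
                         "errors": 2, "satisfaction": 4}))
```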

  12. Typical usability measurements • Time needed to complete a task. • Number of user errors. • Number of tasks completed within a timeframe. • Ratio of successful interactions to errors made by the user. • Number of features never used during a session. • Frequency of using help material. • Number of positive comments the user provides about the system. • The amount of time during which the user is not interacting with the system (dead time). • Number of features the user can recall during debriefing.
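Most of these measurements can be derived from a timestamped log of what the user did during a test session. The sketch below assumes a simple event-tuple format (not taken from the slides) and computes three of the measurements listed above.

```python
from datetime import datetime


def session_measurements(events):
    """events: list of (timestamp, kind) tuples; kind is one of "start", "error", "help", "done"."""
    start = next(t for t, kind in events if kind == "start")
    end = next(t for t, kind in events if kind == "done")
    return {
        "time_to_complete_s": (end - start).total_seconds(),       # time needed to complete a task
        "error_count": sum(1 for _, k in events if k == "error"),  # number of user errors
        "help_uses": sum(1 for _, k in events if k == "help"),     # frequency of using help material
    }


session = [
    (datetime(2024, 1, 1, 10, 0, 0), "start"),
    (datetime(2024, 1, 1, 10, 0, 40), "error"),
    (datetime(2024, 1, 1, 10, 1, 5), "help"),
    (datetime(2024, 1, 1, 10, 2, 30), "done"),
]
print(session_measurements(session))
```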

  13. Heuristic Evaluation • Simple and Natural Dialogue • Speak the Users’ Language • Minimize the Users’ Memory Load • Consistency • Feedback • Clearly Marked Exits • Shortcuts (accelerators) • Good Error Messages • Prevent Errors • Help and Documentation
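In practice, a heuristic evaluation produces a list of problems, each tagged with the heuristic it violates and a severity rating so that fixes can be prioritized. The sketch below uses Nielsen's common 0-4 severity scale; the findings themselves are invented examples, not from the course.

```python
from collections import defaultdict

findings = [
    {"heuristic": "Feedback", "severity": 3, "note": "No progress indicator while a quiz uploads"},
    {"heuristic": "Clearly Marked Exits", "severity": 2, "note": "No way to cancel a started lesson"},
    {"heuristic": "Feedback", "severity": 1, "note": "Button press gives no visible response"},
]

# Group problems by heuristic and order by worst severity so the most serious issues surface first.
by_heuristic = defaultdict(list)
for finding in findings:
    by_heuristic[finding["heuristic"]].append(finding)

for heuristic, problems in sorted(by_heuristic.items(),
                                  key=lambda item: -max(p["severity"] for p in item[1])):
    print(heuristic, [p["severity"] for p in problems])
```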
