
ECE 4336 Capstone Design II






Presentation Transcript


  1. ECE 4336 Capstone Design II Project Design Process University of Houston Diana de la Rosa-Pohl

  2. ADDIE Design Model

  3. ADDIE Design Model • Analysis • Design • Development • Implementation • Evaluation

  4. Some Comments • This process is an iterative process! • Everyone has models they use. • This is just one.

  5. Analysis

  6. ADDIE Design Model • Analysis • Design • Development • Implementation • Evaluation

  7. Analysis (sometimes called front-end analysis) • get background information • identify problem/need (needs analysis) • user analysis • conduct research • identify expected deliverables • formulate goal • identify desired performance outcome

  8. Analysis • get background information • mostly given to you by client/manager • supporting materials • product specs • design constraints • meetings • web sites • What are other ways to get background information?

  9. Analysis • identify problem/need • problem: (What’s broken?) • need: (Why is it a problem?)

  10. Analysis • conduct research • Is there even a problem that you can solve? • Is the problem worth solving? • Why? → Significance • “Who Cares?” factor • Has the problem (or a similar problem) been solved before? • What data/resources do you have to work with?

  11. Analysis • identify expected deliverables • What (specifically) did you deliver to your client/manager in December? • What (specifically) will you deliver to your client/manager in April? • answer this as best you can • this may change depending on your results in December (the status of your project will be known to all in January) • you may need to renegotiate this

  12. Analysis • user analysis • WHO is going to be using your widget? • What important characteristics will impact your design? • HOW are they going to be using this? • WHERE will your user be using this? • WHAT is the skill level of your user?

  13. Analysis • identify/formulate goal • Write your overall goal statement. • i.e., your goal for April

  14. Analysis • identify desired performance outcome • Target Objective • what do you want to happen?

  15. Design

  16. ADDIE Design Model • Analysis • Design • Development • Implementation • Evaluation

  17. Design • brainstorm alternative solutions • decide on optimum solution • within engineering constraints • using engineering standards • overview diagram • goal/task analysis • test plan • schedule • budget

  18. Design • brainstorm alternative solutions • different shapes • different sizes • different sensors • different protocols • different power supplies • …you get the idea

  19. Design • decide on optimum solution • you should have done this already

  20. Design • overview diagram • someone should be able to look at your overview diagram and know what you are about to build (or have already built) • if they close their eyes, they should see it • ALL OF IT! • include what you are building and the environment that it is in (if it’s important) • it should be clear what part(s) you are building and what’s been given to you • label all important parts

  21. Design • goal analysis • what should the SYSTEM be able to do? • start with the end in mind! • i.e., the target objective • work your way backwards • this should be graphical • every box is an objective • all objectives must be measurable and observable!

  22. Design • task analysis • what do YOU need to do to complete the objective? • these are YOUR tasks • list as many as necessary for each box • you may also include smaller tests here • these DO NOT need to be measurable/observable

  23. Design • test plan • should include target dates • ? • ? • ? • ? • ? • ? • ? • ? • ? • ? • TARGET OBJECTIVE:

  24. Design • Project Objectives (Milestones) • Must be observable AND measurable! • Says who? Says ME! (and Gagné and Mager and others) • Robert Mager (Preparing Instructional Objectives, 1997) • outcomes vs. process • specific vs. general • measurable vs. unmeasurable • student vs. instructor → functionality of system vs. task of engineer

  25. Design • Project Objectives (Milestones) • Robert Gagné (Principles of Instructional Design, 2005) • situation • what is the stimulus? • environmental conditions? • capability verb • [discriminates, identifies, classifies, demonstrates (a rule), generates, adopts, states (or displays), executes (a motor skill), chooses (an attitude)] • object • the “what” of the capability verb • “what” is the capability? • action verb • how the performance is to be completed • tools, constraints, or special conditions • specific equipment? • limited time, budget, error?

  26. Development

  27. ADDIE Design Model • Analysis • Design • Development • Implementation • Evaluation

  28. Development • build it! • piece by piece • the blood, sweat, and tears • work through goal analysis • work through test plan • formative evaluation

  29. Implementation

  30. ADDIE Design Model • Analysis • Design • Development • Implementation • Evaluation

  31. Implementation • doing what you said you were gonna do • testing the full system • full integration • full-blown testing • alpha testing • beta testing • record test data

  32. Evaluation

  33. ADDIE Design Model • Analysis • Design • Development • Implementation • Evaluation

  34. Evaluation • interpreting your results data • “How’d we do?” • make design decisions based on current implementation • formative evaluation • make decisions based on final implementation • summative evaluation • determine implications for future work • suggest next steps • documentation • publication

  35. Presentations Capstone I

  36. Presentations Capstone II

  37. Written Report Outline* • Introduction • Significance • Deliverables • Background • (properly cite sources using APA style: https://owl.english.purdue.edu/owl/resource/560/03/ ) • Methods • Engineering Constraints • Engineering Standards • Goal Analysis • Budget • Results • Conclusion • References • Please note that the written reports should now represent the progress of the project. Each report, however, should be a complete document, i.e., with all required sections written individually. Common materials/contents (data, results, etc.) that belong to the team may be included. Each report should build on the preceding individual reports (copied directly, or modified to reflect changes made as your work progresses), so the focus is on your progress from one stage of project development to the next. • The main body of each report should be ~8 pages long (not counting the abstract, references, and any supplementary materials)

  38. The End
