
Human Factor Evaluation for Marine Education by using Neuroscience Tools


Presentation Transcript


  1. MASSEP 2013, May 2013 Human Factor Evaluation for Marine Education by using Neuroscience Tools N. Nikitakos, Professor, and D. Papachristos, Ph.D. candidate, Dept. of Shipping, Trade and Transport, University of the Aegean

  2. CONTENTS • Human Factors Evaluation • Research Methodology • Case Study

  3. HUMAN FACTORS EVALUATION

  4. Human Factor Evaluation (1) • Maritime education • user’s satisfaction • objective criteria • satisfaction phenomena

  5. Human Factor Evaluation (2) • mixed approach to Human Factor evaluation • ship’s bridge equipment • usability and educational evaluation • ship bridge interactive systems • neuroscience tools of gaze tracking & speech recording for measuring emotional user responses • Usability testing

  6. Human Factors Evaluation (3) Neuroscience tools

  7. Human Factor Evaluation (4) • Human Factors evaluation • ship manipulation systems • design (interactive technologies) • ergonomics • few applications in industry • cognitive ergonomics

  8. Human Factor Evaluation (5) • Usability has been defined by ISO 9241 as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use”

  9. Human Factor Evaluation (6) • Effectiveness means the accuracy and completeness with which users achieve specified goals • Efficiency means the resources expended in relation to the accuracy and completeness with which users achieve goals • Satisfaction means freedom from discomfort and positive attitudes towards the use of the product
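As a minimal illustration of how these three ISO 9241 measures can be operationalised in practice (a sketch in Python; the task-log fields and the function name are hypothetical, not part of the original study):

```python
from dataclasses import dataclass

@dataclass
class TaskLog:
    completed: bool      # did the trainee reach the specified goal?
    time_spent_s: float  # resources expended (seconds on task)
    rating_1_to_5: int   # post-task satisfaction rating

def usability_measures(logs: list[TaskLog]) -> dict:
    """Operationalise effectiveness, efficiency and satisfaction for a set of tasks."""
    n = len(logs)
    completed = sum(t.completed for t in logs)
    return {
        "effectiveness": completed / n,                               # completion rate
        "efficiency": completed / sum(t.time_spent_s for t in logs),  # goals per second
        "satisfaction": sum(t.rating_1_to_5 for t in logs) / n,       # mean rating
    }

# Example: three ECDIS exercises by one trainee
print(usability_measures([TaskLog(True, 120, 4), TaskLog(True, 95, 5), TaskLog(False, 180, 3)]))
```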

  10. RESEARCH METHODOLOGY

  11. Research Methodology (1) Defining satisfaction involves the following parameters, which are investigated here: • satisfaction with the software and the educational scenarios, • usability of the system itself (total functionality), • as well as the individual training and technical characteristics that complete the teaching act.

  12. Research Methodology (2) The research questions (RQ) set by the suggested research framework are as follows: • RQ-1: Is there a relationship between the user’s optical attention and the user’s satisfaction with either the software or the scenario? • RQ-2: Is there a relationship between the training characteristics, user satisfaction and, by extension, optical attention?

  13. Research Methodology (3) • The suggested research method aims at interpreting, determining and evaluating the data of the biometric tool in combination with the results of the conventional (qualitative and quantitative) methods, based on the factors (relationships) that may influence the user’s satisfaction

  14. Research Methodology (4) Interpretation procedure [diagram: quantitative data (questionnaires), measurements of the software and the scenario, and qualitative data (interviews) are combined to identify relationships between parameters and factors]
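A minimal sketch of what that combination step could look like in Python with pandas (the per-participant records, column names and values are hypothetical placeholders, not data from the study):

```python
import pandas as pd

# Hypothetical per-participant records from the three data streams
questionnaires = pd.DataFrame({"participant": [1, 2, 3],
                               "sus_score": [72.5, 85.0, 60.0],
                               "scenario_satisfaction": [4, 5, 3]})
gaze = pd.DataFrame({"participant": [1, 2, 3],
                     "gaze_quality": [0.2, 0.1, 0.8]})   # ~0 = attention on the screen
interviews = pd.DataFrame({"participant": [1, 2, 3],
                           "positive_comments": [3, 5, 1]})

# Merge the streams so relationships between parameters and factors can be examined
merged = questionnaires.merge(gaze, on="participant").merge(interviews, on="participant")
print(merged.drop(columns="participant").corr())   # first look at the relationships
```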

  15. CASE STUDY

  16. Case study (1) The case study aims at the following: • the evaluation of user satisfaction with the ECDIS software and scenario, and • the educational evaluation of ECDIS from the user’s point of view (opinions).

  17. Case study (2) • first (random) sampling (January 2012 until May 2012), in the Information Technologies Laboratory of the National Marine Training Centre of Piraeus • 31 Marine officers • video recording of ~23 minutes per student

  18. Case study (3) [experiment setup: ECDIS operation in the ECDIS lab room, with the eye tracker and the “Face Analysis” software station]

  19. Case study (4) • Stage 1: Information about the experiment and presentation of the acceptance document to the user-trainee (estimated duration 5 - 10 minutes) • Stage 2: Completion of a user profile and of the assessment survey on educational and technical characteristics (questionnaire, T3) by the trainee (estimated duration 10 - 15 minutes) • Stage 3: Equipment installation (gaze tracking device) and configuration of its parameters (T1) • Stage 4: Video recording (T1) combined with the researcher filling in a work sheet (T3) (estimated duration 20 - 25 minutes) • Stage 5: Completion of the process (device disconnection) through a semi-structured interview & questionnaires (T2, T3, T4) with the user (estimated duration 5 - 10 minutes)
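The five-stage protocol can be written down as a simple checklist to keep track of the expected session length (a sketch; the data structure is mine and Stage 3 is recorded without an estimate because the slide gives none, while the minute estimates are taken from the slide above):

```python
# Five-stage session protocol with the estimated durations (minutes) from the slide
SESSION_STAGES = [
    ("Briefing and acceptance document",                        (5, 10)),
    ("User profile and T3 questionnaire",                       (10, 15)),
    ("Gaze-tracker installation and configuration (T1)",        (None, None)),
    ("Video recording (T1) with researcher work sheet (T3)",    (20, 25)),
    ("Interview & questionnaires (T2, T3, T4), device removal", (5, 10)),
]

def total_estimated_minutes(stages):
    """Sum the lower and upper duration estimates, skipping stages without one."""
    known = [dur for _, dur in stages if dur[0] is not None]
    return sum(lo for lo, _ in known), sum(hi for _, hi in known)

print(total_estimated_minutes(SESSION_STAGES))  # -> (40, 60)
```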

  20. Case study (5) • Tool-1 (T1): the optical data registration is conducted by the “Face Analysis” software in connection with a web camera set on the computer that runs the subject of the research (ECDIS) • Tool-2 (T2): use of a microphone for voice recording (interview) • Tool-3 (T3): questionnaires used for opinions/attitudes/expectations/self-evaluation and for observation • Tool-4 (T4): the SUS usability assessment tool
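Tool-4, the System Usability Scale (SUS), is scored with the standard SUS rule: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal Python sketch (the function name and the example responses are illustrative only):

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring for ten 1-5 Likert responses (items 1-10 in order)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)   # odd items: r-1, even items: 5-r
                     for i, r in enumerate(responses)]
    return 2.5 * sum(contributions)

# Example questionnaire from one trainee
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```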

  21. Case study (6) Biometric tool parameter interpretation (“Face Analysis”): • Eye gaze vector: values (horizontal) > 0 mean gaze out of the screen; values (vertical) → -18 mean view of the center of the screen • Eyes quality parameter (eye gaze tracking) / schedule of eyes & head pose: ~0 attention on the screen; ~1 and > 1 no attention • Distance from monitor: > 1 close to the screen; < 1 away from the screen • Eye Level (EL), Horizontal Level (HL), head roll angle HR = EL / HL: values > 10° (high mobility); values < 10° (attention, depending on the scenario)
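As a sketch of how these interpretation rules could be applied frame by frame (the function, its parameter names and the way the rules are combined are my assumptions; the threshold values are the ones on the slide):

```python
def classify_attention(gaze_quality: float, distance_ratio: float, head_roll_deg: float) -> str:
    """Apply the slide's interpretation rules to one frame of 'Face Analysis' output.

    gaze_quality   ~0 means attention on the screen, ~1 or above means no attention
    distance_ratio > 1 close to the screen, < 1 away from the screen
    head_roll_deg  > 10 degrees suggests high mobility, < 10 attention (scenario-dependent)
    """
    if gaze_quality >= 1.0 or distance_ratio < 1.0:
        return "no attention / away from screen"
    if head_roll_deg > 10.0:
        return "high mobility"
    return "attention on the screen (scenario-dependent)"

# One hypothetical frame: good tracking quality, close to the screen, small head roll
print(classify_attention(gaze_quality=0.1, distance_ratio=1.2, head_roll_deg=6.0))
```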

  22. Case study (7) The data of the experiment come from the following sources (analysed using SPSS and Excel): • questionnaires, • the SUS tool, • optical data (gaze tracking), and • interviews (voice recording)
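The relationship checks reported on the next slide are the kind of correlation analysis SPSS produces; an equivalent sketch in Python with SciPy (the variable names and values below are placeholders, not the study's data):

```python
from scipy.stats import pearsonr, spearmanr

# Hypothetical per-trainee values exported from the questionnaires and the gaze data
sus_scores   = [72.5, 85.0, 60.0, 90.0, 77.5]
gaze_quality = [0.30, 0.10, 0.70, 0.05, 0.25]   # lower ~ more attention on the screen

# RQ-1: is optical attention related to usability satisfaction?
r, p = pearsonr(sus_scores, gaze_quality)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Spearman rank correlation is often preferred for small, ordinal samples
rho, p_rho = spearmanr(sus_scores, gaze_quality)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3f}")
```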

  23. Case study (8) The results show: • a relationship between the gaze parameter and the users’ usability assessment: the gaze parameter depends on the SUS score, indicating that attention increases with the assessment of the ECDIS software (RQ-1), • a relationship between the SUS score and the training characteristics (total assessment, time schedule), and a strong relationship between ECDIS satisfaction and scenario satisfaction (RQ-2); the scenario operation appears to depend on the software environment (navigation, interface), • high usability for the ECDIS software (questionnaire evaluation, SUS tool) and a high score for the training program evaluation (National Marine Training Centre of Piraeus).

  24. Case study (9) Sample’s structure

  25. Case study (10) ECDIS – Training program Evaluation

  26. Case study (11) Correlations between variables of research tools

  27. Thank you nnik@aegean.gr
