
Selection and Classification for Enlisted Service


Presentation Transcript


  1. Selection and Classification for Enlisted Service David L. Alderton, Ph.D.

  2. Armed Services Vocational Aptitude Battery
  • Used by all services (3-hour P&P, 1.6-hour CAT)
  • Serves both selection and classification duties
  • 10 tests measuring 4 constructs
  • Verbal Ability: Word Knowledge (WK), Paragraph Comprehension (PC), General Science (GS; variance split with Technical)
  • Mathematical Ability: Math Knowledge (MK), Arithmetic Reasoning (AR)
  • Technical Knowledge: Automotive and Shop Information (AS), Electronics Information (EI), Mechanical Comprehension (MC)
  • Clerical Speed and Accuracy (dropped in FY02): Numerical Operations (NO), Coding Speed (CS)
  • Armed Forces Qualification Test (AFQT): WK+PC+MK+AR rescaled to a cumulative distribution (1-99), tied to a national sample of 16-24 year-old non-institutionalized youth
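The AFQT rescaling described above — summing the WK, PC, MK, and AR scores and converting the composite to a 1-99 percentile against a national norming sample — can be sketched as follows. All scores and the norming distribution here are invented for illustration; the operational AFQT uses its own weighting and norming tables.

```python
import bisect

def afqt_percentile(raw_score, norm_sample):
    """Rescale a raw composite score to a 1-99 percentile
    against a national norming sample (illustrative only)."""
    norm_sorted = sorted(norm_sample)
    # Fraction of the norm sample scoring below this raw score
    rank = bisect.bisect_left(norm_sorted, raw_score) / len(norm_sorted)
    # Clamp to the 1-99 reporting range used for AFQT
    return max(1, min(99, round(rank * 100)))

# Hypothetical subtest scores for one applicant
subtests = {"WK": 52, "PC": 48, "MK": 55, "AR": 50}
raw = sum(subtests.values())  # 205

# Toy norming distribution standing in for the 16-24 year-old sample
norms = list(range(120, 280))
print(afqt_percentile(raw, norms))
```

The clamp reflects the 1-99 reporting range: no applicant reports a 0 or 100, however extreme the raw composite.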

  3. Recruit Quality Matrix
  • Congress/DoD set recruit quality standards for selection in terms of school diploma and mental group or category
  • 90% HSDG; 62% CAT I-IIIu
  • High school diploma status defines the columns; controls early attrition
  • AFQT is a measure of intellectual ability; controls school failures, increases human productivity

  4. Cumulative Attrition by Educational Credential
  Source: FY95-98 accessions.

  5. AFQT and Job Performance
  • AFQT is related to school success and to valid and reliable measures of actual job performance (N=9,215)
  Source: pp. 2-4 of Report to the House Committee on Appropriations, Joint-Service Efforts to Link Military Enlistment Standards to Job Performance, Office of the Assistant Secretary of Defense (Force Management & Personnel), April 1992.

  6. Available US Youth Population
  Less than 1/4 of America's 17-21 year-old youth meet Navy enlistment standards and are available for enlistment.
  Source: CNRC

  7. Propensity of Youth to Enlist in Navy

  8. Accessioning Process
  Contact with Recruiter → Selection → Classification → Swear-in/DEP → Ship to RTC
  • Selection: ability screens; moral, financial, and educational screens; ASVAB/CAT; background affidavits; medical exam
  • Classification: job & career info; school guarantee; ship date
  • Swear-in/DEP: swear-in; Delayed Entry Program; ship to RTC

  9. Consequences of Bad S&C
  Extensive civilian literature from industrial/organizational psychology and sociology reveals that …
  • When selection is done poorly: early attrition is high; disciplinary problems are high; fleet/staff shortages occur
  • When classification is done poorly: school attrition is high; school seats are underutilized; job satisfaction is low; morale is low; unit and fleet readiness are low; retention is low

  10. Accession Pipeline: Basis for S&C Research
  • 52,000 recruits enter; losses by stage: RTC -7,000; A/Apprentice school -2,600; fleet at 12 mo -2,500; 24 mo -3,100; 36 mo -1,600; 48 mo -800; 17,000 re-enlist
  • Lose 10,000+ in training; lose 8,000+ from the Fleet
  Composite estimates of cohort losses based on CNA report: Projected Attrition and Reenlistment for First-Term Sailors: FY02 Q1 update (22 Feb 02)
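The cohort figures on this slide can be checked with simple arithmetic:

```python
cohort = 52_000
# Losses by stage (from the slide): RTC, A/Apprentice school,
# then fleet losses at 12, 24, 36, and 48 months
losses = {"RTC": 7_000, "A/App": 2_600,
          "12 mo": 2_500, "24 mo": 3_100, "36 mo": 1_600, "48 mo": 800}

training_loss = losses["RTC"] + losses["A/App"]      # RTC + A-school
fleet_loss = sum(v for k, v in losses.items()
                 if k.endswith("mo"))                # 12-48 month losses
remaining = cohort - sum(losses.values())            # left at 48 months
reenlist = 17_000
print(training_loss, fleet_loss, remaining, reenlist / remaining)
```

The itemized fleet losses sum to exactly 8,000, matching the slide, while the itemized training losses sum to 9,600 — slightly under the "10,000+" headline figure.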

  11. Conclusion: Poor Selection and Classification
  • Data indicate a poorly functioning selection and classification system
  • Poor selection indicators: early attrition is high (RTC); special programs burgeoning to control RTC attrition (HP3, FAST, PACE, CCU); lose 3,700 to pre-existing psychological problems (FY99); lose 2,500 to pre-existing medical problems (FY99)
  • Poor classification indicators: retention is low; setbacks/special programs increasing to control school attrition; reported job satisfaction is low (enlisted 47%; a 24.2% drop between 1997 and 2000); Fleet complains about skilled performance

  12. Operational Problems and Approaches
  • 37% first-term attrition leads to excessive costs, workforce turbulence, and under-manning; increases CNRC goals
  • We need to better select among Navy applicants and better understand attrition
  • Assigning 52,000 adolescents (recruits) into over 100 jobs
  • Assignments, and entire 20/30-year careers, are based on ASVAB scores (a 1.5-hour test), manpower needs, and a 7-10 minute interview with a classifier
  • Obtain greater breadth of information on applicants
  • Utilize more information in making Sailor-job matches; increase Sailors' choice in career decisions
  • Validate decisions against outcomes other than A-School academic attrition

  13. S&C Measures → Navy Outcomes
  • S&C measures, now — Qualification: public background, medical fitness; AFQT; ASVAB (Verbal, Mathematical, Technical, Perceptual Speed)
  • S&C measures, future — Whole Person Assessment: new cognitive tests (spatial ability, multi-tasking, working memory); achievement/knowledge; job skills/psychomotor; personality, interest, and biodata (conscientiousness, emotional stability, leadership, motivation, extroversion)
  • Navy outcomes, now: early attrition; A/C-School completion
  • Navy outcomes, future: school lab grades; school classroom grades; job knowledge; job proficiency; job dependability; organizational adjustment; job satisfaction; promotion rates; reenlistment rates

  14. Vision
  • Selection & Classification today: CAT-ASVAB (1.5-3.0 hours), medical fitness, background check → entrance processing → classify for training success → DEP and boot camp
  • Vision: AFQT (0.5-1.0 hour), medical fitness, background check, Whole Person Assessment (intellectual, personality, interests) → classify for training success, job success and satisfaction, career potential → reduced attrition and improved retention

  15. S&C Research Area Goals
  • Improve the quality and durability of Sailors who are selected and classified
  • Broaden the range of personal characteristics assessed for S&C: ability, personality, interests
  • Increase the range and quality of outcome measures: training, job performance, job satisfaction, career progression, retention
  • Improve the quality of classification decisions: use more information, allow applicants more choice in classification decisions
  • Reduce unwanted attrition: understand attrition processes to mitigate individual and organizational dissatisfiers

  16. R&D Outline
  • Attrition Reduction Technologies (6.2, 6.3, 6.5): 1st Watch; Argus
  • Job-Skill Matching (6.2, 6.3, OM&N): RIDE; JOIN; RIDE DSS; ASVAB validation
  • New Measures (6.2): Screening for Adaptability; non-cognitive measures; aptitude and interest models, usability and contents

  17. Attrition/Retention Research Approaches
  • Traditional approach: focused on attrition; concentrated on parts of the problem; viewed attrition as a selection failure; used existing administrative data files to track and predict retention/attrition behavior; used "exit" survey to measure satisfaction
  • 1st Watch approach: broad, comprehensive view across recruiting, training, and the Fleet; follows a 1-year cohort of recruits (Apr 02 through Mar 03); examines career decisions across the entire first term; NPRST teams with CNRC and NTC Great Lakes to produce unique personnel data

  18. 1st Watch Objective
  Comprehensive questionnaires + unique measures + longitudinal design → understand what affects careers across the first term of enlistment
  • Identify recruits at risk
  • Identify intervention points
  • Assist CNET with interventions
  • Reduce unwanted attrition
  • Increase reenlistment
  • Reduce costs associated with recruiting, selection, and training

  19. Methods
  Use new instruments and existing data with longitudinal and cross-sectional samples
  • Transition points: begin RTC, end RTC, end "A" School, Fleet
  • Surveys: New Sailor Survey, RTC Graduate Survey, "A" School Graduate Survey, Exit Training Survey, Argus Transition Survey
  • Database links: PRIDE, NITRIS, EMF

  20. New Questionnaires and Common Components
  • New Sailor Survey: influence to join Navy; recruiting and DEP; classification; Navy Fit Scale; Stress Coping Scale
  • RTC Graduate Survey: RTC command climate; recruiting and DEP; classification & re-class; Navy Fit Scale; training experiences; problems during training; Navy Commitment Scale
  • A/Apprentice Graduate Survey: SSC command climate; training evaluation; satisfaction with rating; Navy Fit Scale; training experiences; problems during training; Navy Commitment Scale

  21. Unique Measures
  • Navy Commitment Scale to predict attrition/affiliation: based on Allen & Meyer (1997); modified affective commitment, continuance commitment, and values similarity (new)
  • Navy Recruit Stress and Coping Scale to identify recruits with poor coping skills for potential intervention: based on Vitaliano's (1989) 5 core coping strategies
  • Navy Fit Scale to identify recruits "at risk" and develop interventions to retain quality Sailors: based on the E1-E6 evaluation

  22. Navy "Fit" Scale
  • How well do people "fit" into the Navy (P-N fit)?
  • What the Navy wants and needs from recruits and Sailors
  • What recruits and Sailors want and need from the Navy
  • Cutting-edge methodology: uses the evaluation form as a basis; multiple administrations to measure change over time
  • Hypothesis 1: "fit" improves from entry through training
  • Hypothesis 2: "fit" is lowest for those who exit training

  23. Argus Career Decision Milestone System (Mottern, White, & Alderton)
  • Evolved from the Separation/Retention survey to fill a critical Navy need
  • Web based (7 Feb 02 start); completion required by OPNAV instruction
  • Collects data from leavers and stayers at decision points throughout a career: 35,000+ respondents to date
  • Replaces "satisfaction" scales with "influence to stay/leave"
  • Tied to the PerSMART attrition and retention personnel database
  • Evaluates quality of leavers/stayers; evaluates impact of specific events on attrition/retention; assesses the relationship between intentions and behaviors
  • Provides commands with attrition and retention data

  24. Attrition Reduction Technologies: Next Steps
  • Execute cohort study (1st Watch)
  • Rewrite Argus survey; increase utilization
  • Integrate 1st Watch, PerSMART, and Argus data into a longitudinal cohort database

  25. R&D Outline
  • Attrition Reduction Technologies (6.2, 6.3, 6.5): 1st Watch; Argus
  • Job-Skill Matching (6.2, 6.3, OM&N): RIDE; JOIN; RIDE DSS; ASVAB validation
  • New Measures (6.2): Screening for Adaptability; non-cognitive measures; aptitude and interest models, usability and contents

  26. Job Skill Matching
  • Largely a conscription-era enlistment process
  • Navy classification process fails to encourage enlistment or combat attrition
  • Prioritizes short-term recruiting quotas over Sailor-rating match
  • Does not consider job satisfaction/interest as key variables
  • Makes 20/30-year career decisions based on minimal information
  • Antiquated classification algorithm: research done in 1960, implemented in 1978; difficult to maintain, understand, and describe

  27. Three Faces of "RIDE"
  • RIDE classification algorithms: develop and validate new job classification algorithms
  • RIDE DSS and interface: develop a flexible interface and decision support system for use by recruiters, classifiers, and enlisted applicants
  • JOIN interest inventory development: develop an interest inventory that captures the breadth of Navy entry-level jobs and is comprehensible to naïve applicants

  28. RIDE Model: Efficient Resource Allocation
  • Considers First Pass Pipeline Success (FPPS) as the training success measure; FPPS: pass the entire training pipeline with no setbacks
  • Reduces exaggerated "best" test score
  • Developed a plateau relationship between training success and cut score, vice a simple linear relationship
  • Penalizes for "over-qualification" of an applicant (AFQT based) for a given program/rating, to minimize resource "wastage"
  • Increases the number of ratings an applicant is "optimally" qualified for; increases the number of ratings "tied" for the top of the list
  • Produces opportunity for interest- and preference-based vocational guidance
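The plateau idea can be sketched as a payoff function that rises linearly up to the cut score, stays flat across a plateau, and then declines to penalize over-qualification. The shape and parameters below are illustrative, not the operational RIDE function.

```python
def plateau_payoff(score, cut, plateau_width=10, penalty=0.01):
    """Illustrative RIDE-style payoff: linear below the cut score,
    flat across a plateau, then a mild over-qualification penalty."""
    if score < cut:
        return score / cut              # partial credit below the cut
    if score <= cut + plateau_width:
        return 1.0                      # plateau: equally "optimal"
    # Beyond the plateau, payoff declines to discourage "wasting"
    # high-aptitude applicants on low-cut-score ratings
    return max(0.0, 1.0 - penalty * (score - cut - plateau_width))

# Hypothetical applicant scores evaluated against two ratings' cut scores
for cut in (50, 70):
    print(cut, [round(plateau_payoff(s, cut), 2) for s in (40, 55, 90)])
```

The flat plateau is what produces the ties at the top of an applicant's list: several ratings score identically, which opens room for interest- and preference-based guidance to break the tie.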

  29. RIDE Model Studies
  • Original study (Watson & Folchi, 1999; Blanco & Watson, 1999; Blanco, Shedlock & Watson, in review)
  • 1996-1998 data had 18% FPPFs (First Pass Pipeline Failures) ($59.8M)
  • RIDE red-flagged 40% of FY96-98 FPPFs as misclassifications and identified a better job match; potential cost avoidance: 40% × 2,390 person-years = 956 person-years ($23.9M)
  • Most recent study (Zaki, Crookenden, Yeh, & Watson, 2001): RIDE performance surpassed competing assignment algorithms (Shadow Pricing, Efficient Frontier, CLASP, and actual Navy) using FY99-2000 data
  • Most important comparison (FPPS / unassigned): RIDE (no interest component) 85.8% / 799; CLASP (existing Navy algorithm) 79.6% / 3,931

  30. RIDE DSS Features
  • Fast, accurate, web-based (connected or disconnected)
  • Tied to live quota availability
  • Displays jobs, ship dates, enlistment incentives, and required waivers on the same screen
  • Allows filtering by stated preference (a proxy for interest), critical 15 Navy jobs, ship dates, and community

  31. RIDE DSS Studies
  • Usability study (completed 1 Apr 01, NORU, Pensacola, FL): 70% reduction in required system interaction (more time for vocational counseling, less for system operations); 100% user acceptance (fewer repetitive tasks, more information)
  • Pilot study (began 10 Jul 01, MEPS, San Diego, CA): 100% classifier acceptance; 100% utilization (when possible) on Mission Day; 661 of 678 (97%) applicants responded "Yes" to the question "Do you feel you got the job that you wanted?"

  32. RIDE DSS Status
  • 11 Oct 01: CNRC, SITC, IBM, and NPRST adopted the RIDE algorithm and functional characteristics as NRAMS functional requirements
  • 26 Oct 01: CNRC and NPRST met and formally agreed that NRAMS will leverage RIDE to replace the OCEAN, ONBOARD, and CLASP components of PRIDE
  • Control Analyst functions will be included (e.g., centralized maintenance of mental, moral, and physical classification standards and policies)
  • Display more information on incentives, jobs available, and OSGs to emphasize applicant preferences

  33. JOIN Background
  Interest-based job assignment → job satisfaction, retention, job performance
  • Selection and classification decisions based on applicant preferences and interests lead to less "buyer's remorse," greater job satisfaction, greater job performance, and lower turnover (civilian jobs)
  • In addition, the Navy should expect fewer administrative and disciplinary problems, lower first-term attrition, and greater second-term reenlistment

  34. RIDE Used with JOIN
  [Diagram: candidate ratings (MT, CTR, AE, JO, PN, EW, HT, CTT, MM, YN, ET, DC, BT, AT, RP) filtered first by RIDE, then by JOIN]
  • RIDE produces a limited set of ratings a person is qualified for and which have saleable quotas
  • JOIN filters RIDE selections to programs/ratings a person is likely to be satisfied with and interested in
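The two-stage filter above can be sketched as: RIDE first narrows to ratings the applicant is qualified for that have open quotas, then JOIN orders the survivors by interest. All qualification, quota, and interest data below are hypothetical.

```python
# Hypothetical data: ratings the applicant is aptitude-qualified for (RIDE),
# ratings with saleable quotas, and JOIN interest scores
qualified = {"MT", "ET", "HT", "YN", "CTT"}
open_quotas = {"ET", "YN", "CTT", "DC"}
interest = {"ET": 0.9, "YN": 0.3, "CTT": 0.7}

ride_list = qualified & open_quotas          # RIDE output: qualified + quota
join_ranked = sorted(ride_list,
                     key=lambda r: interest.get(r, 0.0),
                     reverse=True)           # JOIN ordering by interest
print(join_ranked)  # → ['ET', 'CTT', 'YN']
```

Note that DC has an open quota but is dropped because the applicant is not qualified for it, while MT and HT are dropped for lack of quotas — exactly the narrowing the slide describes.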

  35. JOIN Development: Jobs & Occupational Interests in the Navy
  • Dismissed civilian and other interest measures
  • Collected hundreds of task statements and descriptions from Enlisted Community Managers on all entry-level jobs
  • Developed a broad descriptive model consisting of: community (e.g., submarine, aviation, surface); work environment (e.g., industrial, office, out-of-doors); content (e.g., customers, engines, media); process (e.g., design, repair, operate)
  • Developed software to collect data in a form appropriate for a naïve military job applicant
  • Data collection to begin in spring, followed by additional modifications to the model and software

  36.-38. [Image-only slides]

  39. RIDE/JOIN Status
  • RIDE algorithm developed, tested, and adopted; simulations continue for validation; live pilot test continues in San Diego
  • RIDE DSS development: usability and live-pilot tests successful; NRAMS integrating critical RIDE components
  • JOIN instrument development: develop dimensional models for all Navy ratings; define Navy job groups/clusters; develop dimensional models for job groups/clusters; interest data collection Jun 02; determine how to integrate with the RIDE model

  40. ASVAB Validation
  • During the drawdown, little money was applied to validation work
  • Revalidation of in-place standards has been done only on demand; in some schools, 8+ years passed between validations
  • Multiple cut scores propagated to control attrition; school validity declined
  • Revalidation was usually initiated by a crisis

  41. ASVAB Validation Study Steps
  • Request from CNP via Enlisted Community Managers
  • Attend Training Task Analysis meeting if a rating merger is involved
  • School visit: obtain curriculum outline and testing plan; observe laboratories and collect data; meet with school officials on curriculum and the nature of academic problems (Academic Review Board)
  • Conduct validation study (based on regression techniques) and receive feedback from ECM and school officials
  • Submit letter report
  • N-132 approves and distributes a letter directing ASVAB changes in Navy systems

  42. Yearly ASVAB Standards Review
  Events precipitating ASVAB standards reviews:
  • New or merged ratings (CTT/EW)
  • Multiple cut scores (NF)
  • High attrition rate or setback rate
  • Critical ratings that are hard to fill (EOD FY02)
  • Ratings with suspected low ASVAB validity (IT)
  • Higher aptitude requirements of an advanced training pipeline
  • Major curriculum revisions
  • Formation of occupation groups

  43. Management Decision Support System
  • Ensures all ratings are reviewed at least every 3 years
  • Comprehensive collection of rating dispositions (from multiple data sources) to flag when ratings need to be reviewed: curriculum revision; last time the ASVAB was validated; training pipeline and sites (to coordinate school visits); attrition rate and setback rate trends over years; merger status
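The flagging logic such a system implies can be sketched as a simple rule set over per-rating records. The field names, thresholds, and example data here are all hypothetical.

```python
from datetime import date

def review_flags(rating, today=date(2002, 6, 1)):
    """Flag a rating for ASVAB standards review; rules illustrative only."""
    flags = []
    if (today - rating["last_validated"]).days > 3 * 365:
        flags.append("no validation in 3 years")   # 3-year review guarantee
    if rating["attrition_rate"] > 0.20:
        flags.append("high attrition")             # threshold is invented
    if rating["curriculum_revised"]:
        flags.append("curriculum revision")
    if rating["merged"]:
        flags.append("rating merger")
    return flags

# Hypothetical rating record
it = {"last_validated": date(1997, 3, 1), "attrition_rate": 0.24,
      "curriculum_revised": False, "merged": False}
print(review_flags(it))  # → ['no validation in 3 years', 'high attrition']
```

Encoding the triggers as data-driven rules is what lets the system guarantee that every rating surfaces for review at least once every three years, rather than only after a crisis.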

  44. Payoff from ASVAB Program
  • Ensures a high level of validity and relevant selection composites (and cut scores)
  • Reduced A-school attrition and setback rates
  • Fully assigned training seats (qualification rates)
  • Better person/job fit

  45. R&D Outline
  • Attrition Reduction Technologies (6.2, 6.3, 6.5): 1st Watch; Argus
  • Job-Skill Matching (6.2, 6.3, OM&N): RIDE; JOIN; RIDE DSS; ASVAB validation
  • New Measures (6.2): Screening for Adaptability; non-cognitive measures; aptitude and interest models, usability and contents

  46. Screening for Adaptability (Alderton, Farmer, NPRST; Larson, Kewley, NHRC)
  • 40% of all RTC attrition is for pre-existing psychological reasons, the largest single category; lose 3,500-4,000 to pre-existing psychological problems
  • The largest category of first-term patient visits is for mental health; 30% of all first-term in-patient hospitalizations are for mental health
  • Should screen for precursor behaviors to maladjustment and for behaviors protective against stress

  47. Adaptability Objectives
  • Produce a short (~30-minute) instrument appropriate for use in selection
  • With adequate statistical sensitivity and specificity
  • Produces a single score on a continuous scale so the "cut score" can be adjusted
  • Not transparent in purpose; resistant to coaching and faking
  • Makes sense to leadership and professionals
  • Avoid past mistakes: the Applicant Screening Profile (ASP) was biodata based and considered "coach-able"; also NAFMET and BEST

  48. Adaptability Conceptual Framework
  • Based on the research literature, interviews with discharged recruits, academic and Navy mental health professionals, and Navy training specialists, develop an instrument that incorporates: personality (negative affect, positive affect); relevant biographical data; motivation; social functioning; adaptability
  • Develop a scale that represents the multidimensional nature of the constructs in a single continuous score
  • Develop a concise and effective instrument suitable for selection

  49. Adaptability Status
  • Designed construct validation study focused on positive affectivity; incorporates negative affectivity, personality, coping styles, childhood experiences, delinquency, and mental health crises
  • Continue support of CNRC/Eli Flyer work with revised ASP

  50. Non-Cognitive Measures (Farmer, Bearden, PRDI)
  • New start
  • Look at practical, proven non-cognitive measures: personality, interest, tacit knowledge, situational judgments
  • Determine their interrelationships and their relationship to aptitude/intelligence
  • Determine what is practical in a military setting
  • Test combinations of measures
  • Develop relevant outcome measures (e.g., behaviorally anchored ratings)
