
MORS Irregular Warfare II Analysis Workshop



Presentation Transcript


  1. NATO UNCLASSIFIED MORS Irregular Warfare II Analysis Workshop Final Outbrief for SOCOM Chair: Renee G. Carlucci, CAA Co-Chair: Don Timian, ATEC 3 - 6 February 2009 Davis Conference Center MacDill AFB, Florida

  2. Agenda • Overview • Tutorial and Plenary Session • Observations • Key Issues and Discussion Items • Gaps • Recommendations

  3. Overview • Title: Irregular Warfare Analysis Workshop • Organization • 3-6 Feb 09, held at SECRET/REL to AUS/CAN/GBR/USA level at MacDill AFB, Tampa, FL • Program chairs: Ms. Renee Carlucci, Mr. Don Timian, and LTC Clark Heidelbaugh • 5 Working Groups: Global Engagement; Stability, Security, Transition, and Reconstruction (SSTR); Information Operations (IO)/Psychological Operations (PSYOP)/Social Sciences; Counterinsurgency (COIN); and Thinking Models About IW • Synthesis Group • Objectives • To frame the toughest IW problems that SOCOM and DoD are facing • To bring special operators, analysts, and problem solvers together to help define IW analysis problems, explore techniques to deal with these problems, identify what has worked and not worked, and determine recommended ways ahead • Context • Changing strategy with greater emphasis on Irregular Warfare (NDS, QDR, IW JOC, etc.) • Continuing emphasis on building partnership capacity (BPC Roadmap, TSC, etc.) • Limited policy, process, tools, and methods for Irregular Warfare activities • Focus on partnering operators with analysts and problem solvers to gain operator perspectives • Scope: • Military dimension of conflict in Phases 0-5 of DoD's campaign planning construct • Tools/methods, algorithms, historical/current data sources, on-going analysis, opportunities for collaboration on future analysis and tool development

  4. Tutorial, Plenary, and Keynote Speakers • Irregular Warfare Joint Operations Concept: Mr. Jeffery (Gus) Dearolph, Deputy Director Internal, SOCOM J10 • Lessons from the Irregular Warfare Methods, Models, Techniques: COL Jeff Appleget, TRAC • Summary of Improving Cooperation Among Nations for Irregular Warfare Analysis Workshop: Dr. Al Sweetser, Director, OSD-PAE SAC • Operational Design: LTC Reb Yancey, SOCOM SORR-J8-Studies • Sponsor's Welcome: Mr. E. B. Vandiver III, FS, Dir, Center for Army Analysis • Keynote Speaker: Mr. William J. A. Miller, Dir, Strategy, Plans, and Policy, SOCOM

  5. Irregular Warfare Joint Operations Concept: Mr. Dearolph • There is continuing friction with the IW definition across Services, agencies, interagency, and among allies • There is a lack of grand strategy and a failure to understand population • Key IW factors are: indirect, enduring, persistent, proactive, population-centric, respect of legitimate sovereignty linked to over-arching strategy • Consists of: • Key missions (e.g., FID, UW, COIN, CT, Stab Ops) • Key activities (e.g., strategic communications, IO, PSYOP, Intel, Counter-intel, support to law enforcement) • IW Military Leadership • JFCOM for General Purpose Forces (GPF) • SOCOM for Special Operations Forces (SOF)

  6. Lessons from the Irregular Warfare Methods, Models, Techniques: COL Jeff Appleget • "IW focus is on the population" • COIN is key when insurgents exert more influence on local populations than the national government • IWMmAWG Study established a 7-element framework • Identified 35 gaps, 34 related to data and social sciences • Analytical Approach • Now: Top-down, Western perspective (DIMEFIL-PMESII) • Soon: Bottom-up employing social sciences expertise • Track strategic-level Methods, Models, Tools (MmTs) • Iterative development of "key data" is central • Overall needs • Create credible, relevant MmTs to address decision maker issues • Make social scientists integral members of the analysis team • Continue community-wide dialogue through IW Working Group

  7. Improving Cooperation Among Nations for Irregular Warfare Analysis Workshop (NPS): Dr. Al Sweetser, Director, OSD-PAE SAC • There is value in having international participants from many different nations • Emphasized importance of a "Whole of Government" approach • Useful to conceptualize the problem as "Complex Adaptive Systems" (e.g., act, react, re-react, …) • Consider a hybrid approach (e.g., wargame – model – wargame)

  8. Systemic Operational Design (SOD): LTC Reb Yancey, SOCOM J8-Studies • IW is a "wicked problem" • Akin to relearning COIN analysis approaches (Vietnam / Iraq) • SOD employs a structured method of inquiry that enables a commander to: • Make sense of a complex situation • Capture understanding • Share the resulting visualization • SOD is a method of inquiry, is based on discourse, and creates a learning system • Requires accepting humility and valuing heresy • Means challenging the information and the boss • To deal with a dynamic complex system, one needs to explore the interactions among the key parts ("hermeneutics")

  9. Keynote Speaker: Mr. William J. A. Miller, SOCOM Dir, Strategy, Plans, & Policy • "IW is about populations" • In analyzing IW issues, a Lanchester view is not useful • "Behave," not kill, our way to victory • Shape vs. exploit, synthesis not analysis, transforming is satisficing whereas solutions are optimizing, presence changes the problem • Be as "un-wrong" as can be in conceptualizing a global perspective on issues • Globalization challenges and threats to the US: Migration, Crime, Extremism • SOCOM Challenges: Be up-stream (leverage), turn down the heat (affect), engage in dialogue with senior decision makers

  10. Working Group Observations • The working groups (WGs) were highly partitioned by their titles and topic areas (tough to find overlap) • WGs employed from 4 to 9 presentations in their sessions, a total of 30 different workshop presentations • WGs ranged in size from 16 to 50 members; the "modeling IW" WG had the highest numbers • WGs recognized that they have more challenges and tasks than they can handle in a three-day workshop • WGs have heart and intellectual energy but are limited by clock time and "soak time" • WGs would like to "sit in" on other working groups (series vs. parallel information meetings)

  11. General Observations • We are still struggling with the exact meaning and breadth of irregular warfare (bounding and characterization) • We are not familiar with the agencies that understand or have jurisdiction for DIMEFIL and PMESII • "Models and Tools" do not equal "computer programs and computer models" • Wargaming with the right players offers a powerful technique for discovery • Graphics in a storyboard approach have a prominent place in IW for displaying and understanding influences • Everyone is talking about data: its definition, its meaning, its form, who is collecting it, processing it, and storing it • No consensus on what information does exist, should exist, or who is or should be responsible; regardless, the complexity of the situation transcends the data • VV&A remains an open issue for IW models and data

  12. WG 1: Global Engagement Findings/Recommendations • WG1: Global Engagement Charter: Provide recommendations on appropriate analytical techniques to prioritize, plan, and assess Theater Security Cooperation activities to assist the COCOMs in addressing the analytical challenges that they currently confront. • Key Questions: • (1) How should objectives and indicators be structured? • (2) How does an assessment division with four to ten people measure and maintain the baseline? • (3) How can you determine the right activities to support partner nations while making the most progress towards desired end-states? • (4) Is it possible to measure cause and effect in a complex system? • (5) What is missing from this process? • Takeaways: • Many of the effects are potentially unquantifiable (and will remain so). The challenge remains informing decision makers given this constraint. • Interagency analytical resources can assist and are essential • Don't just accept objectives or rush to create them. Need to focus on shaping objectives as well as measuring progress. Reframe. • IW analysis will affect traditional analytical paradigms: messy data, cause & effect, no easy "one-size-fits-all" toolset • Effective Security Cooperation exceeds the boundaries of DoD's authorities and capabilities

  13. Assessment Techniques: Baseline, Trends, Forecasting

  14. Activities (14 engagement tools) • Combined/Multinational Education • Combined/Multinational Exercises • Combined/Multinational Experimentation • Combined/Multinational Training • Counter/Non-Proliferation • Counter Narcotics Assistance • Defense & Military Contacts • Defense Support to Public Diplomacy • Facilities & Infrastructure support projects • Humanitarian Assistance • Information Sharing/Intelligence Cooperation • International Armaments Cooperation • Security Assistance • Cross-Cutting Programs

  15. WG1 Findings -- Suggestions • Q1. How should objectives and indicators be written, structured, and prioritized? • Ideally, a comprehensive set of metrics should be identified; where that is not possible, indicators should be MOE rather than MOP. • Beware, decomposition can be endless. "If you can't measure the objective then you have no objective!" • Involvement of the analyst in structuring the specific language used in objectives is essential. It must mean something analytically. Embed an analyst in the strategy division? • SME qualitative indicators: are they valid and consistent between different experts? May not be able to trend the data; there will be limits here. Prefer something that is quantitative, but don't take judgment out of the process. • Consider prioritizing indicators on the basis of which are most important, not which are easiest to collect. (Embedded analyst can assist.) • Don't forget to re-evaluate what indicators you are using. Iraq experience of looking at MOP rather than MOE to assess progress. And reframe the problem at the objective level, reprioritize when necessary; goals should be achievable.

  16. WG1 Findings -- Suggestions • Q2. How should a baseline be established and maintained? • Identify the indicators before looking for data; this allows you to identify gaps in the data you collect. • Have to put effort into thinking about inclusive measures – no laundry list • End states & steady states • Use, wherever possible, existing government or reliable data sources. Be aware of the origin of data where sources may not be reliable. • Be aware of the dangers of active versus passive data collection • Some indicators are useful for forecasting, others not. • There will be some universal measures, such as child mortality rates, which will be good indicators across a range of objectives. • Aggregate diverse data elements into a composite index. Will show trends. • Need correct SMEs to interpret the data. Most data will be messy. • If you measure too often you may affect the system. • If you can't present your data reliably you've failed. • Map backgrounds with cartographic display of trends work well. For stop-light charts, define criteria.
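The aggregation step above ("diverse data elements into a composite index") can be sketched in a few lines. This is an illustrative sketch only, not a workshop product: indicators are rescaled to a common range and then combined with (optionally weighted) averaging so a single trend line emerges. The indicator names and values below are hypothetical.

```python
# Sketch: combining differently-scaled indicators into one composite index
# so that trends can be compared across reporting periods. Illustrative
# only; indicator names and weights are hypothetical assumptions.

def min_max_normalize(values):
    """Rescale a series of raw observations to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.5 for _ in values]  # a flat series carries no trend signal
    return [(v - lo) / (hi - lo) for v in values]

def composite_index(indicator_series, weights=None):
    """Combine several indicator series into one index value per period.

    indicator_series: dict of indicator name -> list of raw values,
                      one value per reporting period (equal lengths).
    weights: optional dict of relative importance; defaults to equal.
    """
    names = list(indicator_series)
    weights = weights or {n: 1.0 for n in names}
    total_w = sum(weights[n] for n in names)
    normalized = {n: min_max_normalize(indicator_series[n]) for n in names}
    periods = len(next(iter(indicator_series.values())))
    return [
        sum(weights[n] * normalized[n][t] for n in names) / total_w
        for t in range(periods)
    ]

# Hypothetical quarterly indicators for one partner nation
# (each already oriented so that higher = better):
series = {
    "child_mortality_inverted": [0.2, 0.3, 0.5, 0.6],
    "security_incidents_inverted": [0.1, 0.1, 0.4, 0.7],
    "sme_governance_score": [2.0, 2.5, 3.0, 3.5],
}
index = composite_index(series)
```

A rising index across the four periods would suggest a positive trend, but, as the slide notes, the right SMEs still have to interpret what is driving it; the composite hides which indicator moved.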

  17. WG1 Findings -- Suggestions • Q3. How should developing partner nations' security forces be evaluated and supported? • Focus on sustainability (institutional change, 15 years) • Trust and confidence • Build the professional military education schoolhouse before going out on the rifle range • Target/create/instill/develop the cadre of professionals • Assessment methods for building security institutions: Defense Resource Management Study Project (DRMS) difficult to implement for under-developed institutions • Comprehensive baseline surveys must be conducted, e.g., U.S. Country Team or SOF site survey. • Consider the host nation's security forces – not just military. Can we do that with other U.S. government institutions? Authorities and treaties are issues. Other allies where required. • Assessment measures must be tailored to each country's unique security requirements, authorities, and situation • Existing U.S. assessment measures may be considered for establishing a baseline or appropriate framework • A negotiation on a suitable role/end-state for each partner nation's forces • Leverage the capacity of other allies to help build regional capacity • You don't necessarily need a U.S. level of performance to be successful

  18. WG1 Findings -- Suggestions • Q4. How would you begin to address analyzing cause and effect? • Can't easily get to cause and effect. Is measuring effect enough for the COCOMs to make good decisions? Without cause and effect, how do we build models? • Need to be realistic about the level of perfection that can be achieved; "better than a coin toss" may be an appropriate standard • Make more structured use of SMEs • Use techniques to add scientific rigor to SME contributions: pair-wise comparison, gaming, structured interviews, role-playing, value-focused thinking. SME selection remains important; encourage diversity of opinion (groupware) • Try to think through to potential second- & third-order effects • Other techniques that may be valuable: historical analysis, electronic markets, risk-consequence management • Near-real-time data required for insights on causal relationships • Modeling needs to be issue specific, at least initially. Need to be able to look under the hood (no black boxes; we need insights, not just answers) • Need to understand the lag between action and response in the system: system dynamics • What is the ideal refresh rate for indicators and reframing objectives? It may be different from one indicator or objective to the next.
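Pair-wise comparison, one of the structured elicitation techniques listed above, can be made concrete with a short sketch. This is an illustrative example, not a workshop deliverable: SMEs judge criteria two at a time, and relative weights are recovered from the resulting reciprocal matrix using the row geometric mean common in AHP-style methods. The criteria and judgment values are hypothetical.

```python
# Sketch: deriving relative weights from SME pair-wise comparisons
# (AHP-style row geometric mean). Illustrative only; the criteria and
# judgment values below are hypothetical assumptions.
import math

def pairwise_weights(matrix):
    """Given a reciprocal comparison matrix, where matrix[i][j] states how
    many times more important criterion i is than criterion j, return
    normalized weights via the row geometric mean."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical SME judgments over three indicators:
# governance is judged 3x as important as economy and 5x as important as
# infrastructure; economy is 2x as important as infrastructure.
comparisons = [
    [1.0, 3.0, 5.0],   # governance vs (governance, economy, infrastructure)
    [1 / 3, 1.0, 2.0], # economy
    [1 / 5, 1 / 2, 1.0],  # infrastructure
]
weights = pairwise_weights(comparisons)
```

The payoff of the structured form is auditability: the weights can be traced back to individual judgments, and inconsistent judgments (e.g., A > B > C but C > A) become visible for discussion rather than hiding inside a single holistic score.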

  19. WG1 Findings -- Suggestions • Q5. What is missing from the process? • Consider the link between the indicators required for the baseline and measuring the effects of activities. Is there a common set? • Activity identification is immediately resource constrained; need to identify the unconstrained requirement to estimate risk. Where in the process do we do the risk evaluation? • Policy incentives to encourage regional development (NATO was a strong incentive for development) • Stronger links between COCOMs and OSD/PA&E and Policy • Understanding resource constraints earlier in the process will assist with assessing IPL requests and creating new authorities, policies, and funding vehicles. • Design new engagement tools to meet regional security challenges • Potential misalignment of assessment resources to assessment requirements – continue to prioritize objectives and indicators.

  20. WG 2: Stability Operations Findings/Recommendations • WG2: Stability Operations Charter: Identify SO challenges and problem areas to be solved and identify analytical methods that might help solve those areas • Critical Insights: • No single method, model, or simulation will provide a complete answer, but many can provide results to help inform decisions in one or more areas. • Several of the tools can be used immediately, and many under development have promise. • Identification of metrics is absolutely critical. • Identification and collection of relevant data is difficult but must be done. • Key Takeaways: • Though Stability Operations is only a part of Irregular Warfare, it still presents a large problem space • Challenge areas presented by different agencies had some common threads: determination of demand/requirements, prioritization of efforts/risk management, determination/use of metrics, attaining a "whole of government" approach • Many challenge areas are not adequately addressed by current analytical methods, models, and techniques • Many promising methods, models, and techniques are in development

  21.–28. Working Group # 2 Challenge Areas (slide tables not captured in the transcript)

  29.–32. Working Group # 2 Methods, Models, Simulations (slide tables not captured in the transcript)

  33. Working Group # 2 – Session 2 Assessment, (A) = Available Tool (slide table not captured in the transcript)

  34. WG2 Findings & Suggestions • Findings: • Even though everyone agrees that Stability Operations requires whole of government, non-government, coalition, and host nation/public participation, most of our methods, models, and techniques do not account for all of them • It appears that many of the challenge areas are indirect results of an absence of overarching strategies and goals • It is hard to understand how some tools, methods, and models work without common terms of reference – the same is true for data • Suggestions: • Develop common terms of reference for understanding how tools, methods, and models work and for describing data • Ensure future collaboration efforts continue and expand to include the entire SO community-of-interest

  35. WG 3: IO/PSYOP/Social Sciences Findings/Recommendations • WG3 Charter: Improve the foundations of information operations/PSYOP analysis; identify existing analytic capabilities and shortfalls; explore the application of quantitative and qualitative methods for improving analytical capabilities; evaluate and recommend concrete applications. • Key Takeaways: • A coherent taxonomy and lexicon of IO is required: analysts and operators must use the same set of definitions • Models, methods, and tools must provide mechanisms for learning and understanding of the problem, not prediction • Coordinate PSYOP across related combined, joint, and interagency arenas • Develop robust case studies which capture a full problem set to greatly benefit exercises, education, and training • Non-kinetic assessment (MOP, MOE) must be in the initial plan • Key gaps in PSYOP capabilities must be resolved by other means (traditional social sciences, ORSA approaches may assist): red teaming, evolutionary development of M&S, enhanced wargaming (Phase 0), human terrain and media analysis

  36. Analytic Design Issues • How do we appropriately choose models, methods, and tools for OD in PSYOP? • Generic tools that can be fine-tuned to the situation through social discourse (like the MPICE project – Measuring Progress in Conflict Environments – which provides a list of MOEs for an organization's SMEs to choose from and tailor to the specific situation) • Develop different solutions that you can test • Know the TYPE of your problem • Test and compare using the same data sets • Get a conformal standardized data set • What disciplines should be on the team? How do we choose the right ones and access them? • Analytic ability/skills regardless of field • Open-minded and able to work across disciplines • Familiar with both the military and the OD process • Have both field and background analysis capabilities

  37. Analytic Design Issues • What is the appropriate approach to measure effectiveness? What else needs to be measured? • Step 1: Know the intent of the campaign or the conditions to be changed • Step 2: Then you can set measures up front and constantly refine them over time (iteratively) • How should we study outcomes of our actions? • COORDINATE – form a friendly network of interservice, interagency, government, and private partners • Tailor to sub-groups and integrate • Do in steps, e.g., how much closer did I get to the goal? (e.g., goal of 50% positive polling – track trends from the beginning) • Give your partners the collection requirements so they can collaborate • Don't rely on a single measure (e.g., not just polling) • There should be different measures for different timeframes – short/medium/long • Short: single behavior events (e.g., vote, obey curfew) • Medium: trends in behavior (e.g., calling a reporting hotline) • Longer term: underlying attitudes (must understand what attitudes underlie your objectives and then what behaviors reflect these attitudes in order to measure them) • Address both good and bad outcomes • Cannot measure attitudes directly (polling can help but is not entirely reliable)

  38. Analytic Design Issues • Gap: need to fund longer-term studies on what kinds of observable behaviors reflect the attitudes we are likely to seek (e.g., what behaviors underlie acceptance of a "market democracy"?) • Further issue: giving people something positive, something to say "yes" to – something which reflects their self-interests and values. This approach might be more effective (can sponsor studies to determine) but is also more likely to provide the types of objectives which lend themselves to observable/measurable behaviors.

  39. WG 4: Counterinsurgency Findings/Recommendations • WG4: Counterinsurgency Charter: Explore various analytical tools and methods for use in planning and conducting counterinsurgency • Findings/Recommendations: • COIN analytical techniques applicable to general purpose forces (GPF) are equally applicable to special operations forces (SOF) • Strengths and weaknesses of COIN analytical support for GPF are the same for SOF • Tools include: deployed analysts, Human-in-the-Loop (HITL) computer-supported wargaming, COIN M&S not mature (caution on ability to model human behavior) • Recommend USSOCOM develop a structure to provide analytical support to COIN forces • Recommend USSOCOM consider interdisciplinary teams • Recommend SOF training/education/familiarization with benefits of analytical support

  40. Present – COIN Execution • COIN analytical techniques applicable to general purpose forces (GPF) are equally applicable to special operations forces (SOF): IED analysis, polling, social network analysis, ISR network analysis, assessment analysis, trend analysis, criminal activity profiling, RIO analysis, etc. • Strengths and weaknesses of COIN analytical support for GPF are the same for SOF • Assessing influence on population • Problem area – at times data obtained from untrained host-nation collectors

  41. Present – COIN Planning • Human-in-the-Loop (HITL) computer-supported wargaming • Adequate way to provide insights now • Federations of specialized simulations • Wargame Integration toolkits • Must use caution; not mature enough for some contexts • Models and Simulation • Warm and fuzzy – not! • Emerging but still in its infancy

  42. Present – COIN Planning • Substantial efforts ongoing • M&S as well as Non M&S • Data is problematic across the board • Context specificity • Strategic, operational, tactical • Difficult to separate analytical implications between levels of war (“Strategic Corporal”) • Realize need for a conceptual framework for understanding and integrating causality across all levels of analysis • Iterative process/dialogue

  43. Future – COIN Execution • Recommend USSOCOM develop a structure to provide analytical support to COIN forces • Established during planning – every operation is different • Diverse operating environments – varying footprints • Reachback analytical support • Support through GPF, when GPF are available • Recommend SOF training/education/familiarization with benefits of analytical support

  44. Future – COIN Planning • Recommend USSOCOM consider interdisciplinary teams • Centralized • Decentralized • Hybrid • Recommend USSOCOM look into a conceptual analytical framework to provide analytical support to USSOCOM COIN planning • Mr. Miller’s Trinity (crime, migration, extremism) • Left of boom • Forecast next hot spot • Correlation, not causality

  45. WG 5: Thinking Models Findings/Recommendations • WG5 Charter: Frame the context of the IW problem properly, break down IW operations into their natural components, and investigate the subject through discourse and the application of systems thinking. • Findings: • Many ways to see/represent IW – different languages/logic • Lack of common terms/understanding about IW • IW analysis at the strategic/operational/tactical levels may require different cognitive models/techniques/representations • Modeling is difficult – must learn to think differently • Focus on uncovering indirect opportunities • Need tools to improve research capabilities that enhance thought and shared understanding • Need decision makers to shape/provide guidance: • frame the problem • visualization – make the whiteboard a "group thinking pad" • acquire a depth of understanding • The Operational Design process: • requires continuous learning • provides insight, not answers • expect some risks • identifies what we know and don't know about the problem

  46. Gaps • There is a gap between our analytical capability and our commanders' operational needs • The repository of the IW "body of knowledge" has not been clearly identified (IW online library) • There is a relational, supportive, and authority gap between the military and "the interagencies" on IW • We do not understand interagency lines of communication • We don't understand how to balance government capacity for "restoration of services," security, or economic development • We do not know the modeling requirements for IW analysis • Many do not know about IW Community Hubs, potential data sources, or samples of IW activities available from Joint Data Support

  47. Key Issues & Discussion Items • Our current metrics don't capture the qualitative aspects of conflict that commanders need • We have voids in our data and very little cause-and-effect data (e.g., temporal effects require years/decades of observations) • There is no "owner" of a common lexicon • We lack sufficient analysts/SMEs with DIMEFIL (Diplomatic, Informational, Military, Economic, Financial, Intelligence, Law Enforcement) experience • Identifying the differences between "indicators" and "effects" and understanding that some effects are not quantifiable (e.g., measuring persuasion and influence) • We have not retained our history of IW; how do we bring it back? We need to leverage that operational experience and those earlier insights • There are different levels of IW that require very different tools

  48. Recommendations • Identify, create, and sustain credible IW data • It will require iteration to decide on the data needed and to characterize it (e.g., metadata; pedigree) • Develop a lexicon of key terms • Current definitions are not acceptable to the interagency or coalition partners • Continue the dialogue on MmTs to support IW analyses • This workshop represents a significant step forward • More dialogue is needed with whole-of-government participation • MORS provides a forum to help organize the needed information • Create a common template to compare and contrast key IW models and tools • Continue to support efforts to identify key gaps and priorities to guide future actions • MORS and sponsors assist in bringing the various IW Communities of Interest (COI) together, e.g.: • IW Working Group • Human, Social, Cultural Behavior (HSCB) modeling • MORS Social Science Community of Practice (COP) • Support Service initiatives to put Operations Research Analysts on SOF operational staffs • Invite more allies and the interagency to these meetings

  49. Questions?
