
Presentation Transcript


  1. Experimental Research in Assurance & Auditing Auditing Section Doctoral Consortium 2006 Auditing Section Midyear Conference January 2006 Linda McDaniel University of Kentucky

  2. Acknowledgements Many colleagues, but particularly • Jane Kennedy • Bill Kinney • Laureen Maines • Mark Peecher

  3. Experimental Research Session: Objectives • Review strengths and weaknesses of experimental research • Discuss how to capitalize on these strengths (i.e., summarize elements of good experimental design) • Illustrate with an example arising from SOX regulations

  4. Experimental Research is … Systematic, controlled, empirical, critical investigation of phenomena, guided by theory and hypotheses about the presumed relations among such phenomena (Kerlinger), characterized by • active manipulation of variables of interest to generate new data • the random assignment of participants to specified conditions
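To make the second characteristic concrete, here is a minimal sketch of balanced random assignment; the participant labels, condition names, and seed are all illustrative, not from the presentation:

```python
import random

def assign_conditions(participants, conditions, seed=2006):
    """Randomly assign each participant to one condition, keeping cell sizes balanced."""
    rng = random.Random(seed)   # fixed seed makes the assignment reproducible
    pool = list(participants)
    rng.shuffle(pool)           # randomize order before dealing into cells
    return {p: conditions[i % len(conditions)] for i, p in enumerate(pool)}

# Illustrative usage with hypothetical labels
assignment = assign_conditions(
    participants=["P01", "P02", "P03", "P04", "P05", "P06"],
    conditions=["treatment", "control"],
)
print(assignment)
```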

  5. Comparative Advantages • Ability to test causal relations, not just associations • Manipulate variables of interest • Internal validity • Control/randomize effects of other variables • Disentangle variables confounded in natural setting

  6. Comparative Advantages • Timeliness: no need to wait on the real world to create data; see McDaniel and Hand (CAR, 1996) • Ex ante research is possible • Conditions that do not exist in natural settings can be created in the lab • Gaynor, McDaniel, and Neal (TAR, forthcoming) • Hirst and Hopkins (JAR, 1998)

  7. Comparative Advantages • Examination of sub-judgments (determinants of decisions) and processes • Kadous, Kennedy, and Peecher (TAR, 2003) • Hoffman, Joe, and Moser (AOS, 2003) • Maines and McDaniel (TAR, 2000) Thus, experiments can answer how, when, and why important features of the accounting process and environment influence behavior as well as decisions

  8. Relative Disadvantages • External validity • Task is an abstraction from the real world • Variables manipulated at discrete levels • Participants may not be representative • Small sample size • Limited access to participants for real-world, complex auditing/accounting issues • Reduced ability to replicate • No second chances (without significant costs)
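Small samples make statistical power a first-order design concern, so an a priori power calculation is worth running before collecting data. A rough sketch using statsmodels; the assumed effect size (Cohen's d = 0.5) is illustrative, not from the presentation:

```python
# Rough a priori power calculation for a two-cell, between-participants design.
from statsmodels.stats.power import TTestIndPower

n_per_cell = TTestIndPower().solve_power(
    effect_size=0.5,  # assumed standardized mean difference (Cohen's d)
    alpha=0.05,       # Type I error rate
    power=0.80,       # desired probability of detecting the effect
)
print(f"Participants needed per cell: {n_per_cell:.0f}")  # roughly 64
```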

  9. Designing an Experiment • After the researcher identifies an interesting, relevant question that calls for an experiment … (i.e., after writing the Kinney 3 paragraphs) • … he/she must develop an effective (good) research design, i.e., one that supports causal inferences • Use theory to guide predictions: “It is the theory that decides what can be observed.” (Albert Einstein) • Minimize threats to construct, internal, and statistical validity

  10. Libby et al. (2002) Predictive Validity Framework [Diagram: the “Libby boxes.” Conceptual level: Concept X (independent) linked to Concept Y (dependent) by link 1, theory. Links 2 and 3 map each concept to its operational definition (construct validity). Link 4 connects Operational Definition X to Operational Definition Y (statistical validity). Link 5: prior-influence & contemporaneous factors (alternative explanations; the Vs and Zs) bearing on internal validity.]

  11. Designing an Experiment • What variables will you manipulate? • Number & levels of independent variables • Interactions? [Predictive Validity Framework diagram repeated; Libby et al. (2002)]
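One way to keep these choices concrete is to enumerate the cells the manipulated factors create. A minimal sketch for a 2x2 between-participants design, using factor levels loosely based on the example developed later in the deck (the labels are placeholders):

```python
from itertools import product

# Enumerate the cells of a hypothetical 2x2 factorial design.
nas_effect = ["improves AQ", "no effect on AQ"]   # levels of IV 1
disclosure = ["required", "not required"]         # levels of IV 2

cells = list(product(nas_effect, disclosure))
for i, cell in enumerate(cells, start=1):
    print(i, cell)
# Fully crossing the factors is what makes the interaction testable;
# each added factor multiplies the cell count (and participants needed).
```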

  12. Designing an Experiment • What variables will you control? (internal validity) Account for Vs and Zs (see Kinney (TAR, 1986)) by • Random assignment of participants • Holding variables constant by design/selection (within-participant design; matching on Vs) • Measuring covariates / statistically removing their effects (e.g., covariate analysis, regression)
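A hedged sketch of the third approach, measuring a covariate and statistically removing its effect in a regression (ANCOVA-style); all data are simulated and the variable names are hypothetical:

```python
# ANCOVA-style control: include a measured covariate in the regression
# so the treatment effect is estimated holding the covariate constant.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 80
experience = rng.normal(10, 3, n)                       # measured covariate (a V)
condition = np.repeat(["control", "treatment"], n // 2)
judgment = (0.3 * experience
            + (condition == "treatment") * 1.0          # true treatment effect
            + rng.normal(0, 1, n))
df = pd.DataFrame({"judgment": judgment, "condition": condition,
                   "experience": experience})

fit = smf.ols("judgment ~ C(condition) + experience", data=df).fit()
print(fit.params)  # treatment coefficient should be near 1.0
```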

  13. Other Necessary Design Choices • Professional participants? • Incentives? • Within- versus between-participants design? See Libby, Bloomfield, and Nelson (AOS, 2002) for a discussion of each

  14. Professional Participants? • Theory should dictate this choice • Libby and Kinney (TAR, 2000) • Maines and McDaniel (TAR, 2000) See also Libby and Luft (AOS, 1993) • Professionals are a limited resource • Libby, Bloomfield & Nelson (AOS, 2002) • Professionals exhibit stronger selection bias relative to non-professional groups • Peecher & Solomon (IJA, 2001)

  15. Incentives? • Why? “…no skin in the game…” • When? • Camerer & Hogarth (JRU, 1999)

  16. Within- versus Between-Participants? • Enhanced statistical power as participants serve as their own control; see Schepanski, Tubbs, and Grimlund (JAL, 1992) • Increased salience of treatment effects • Vulnerability to carry-over effects • Requires proper statistical analysis
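A minimal simulation (all numbers assumed) of why the within-participants design gains power: stable individual differences cancel in the paired comparison but inflate the error term in the between-participants one:

```python
# Simulated illustration: participants serve as their own control,
# so stable individual differences drop out of the paired test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
participant_effect = rng.normal(0, 2, 30)              # stable individual differences
cond_a = participant_effect + rng.normal(0.0, 1, 30)   # responses in condition A
cond_b = participant_effect + rng.normal(0.5, 1, 30)   # condition B, true shift of 0.5

print(stats.ttest_rel(cond_a, cond_b))   # within: participant variance cancels
print(stats.ttest_ind(cond_a, cond_b))   # between: that variance inflates error
```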

  17. Turning Observations into a Researchable Question Do the new SOX regulations related to NAS result in improved audit quality? Why is this interesting or important? How can we examine it?

  18. Real World Problem and Regulatory Actions • After corporate abuses, SEC seeks to ban all auditor-provided NAS • Concerns about auditor independence, audit quality, and investor confidence • Conceding certain NAS improve audit quality, SEC limits NAS auditors can provide to clients but requires • ACs to pre-approve services after considering auditor independence and audit quality • Registrants to disclose AC pre-approvals and fees paid to auditor (by category)

  19. An Example The Effects of Joint Provision and Disclosures of Non-audit Services (NAS) on Audit Committee (AC) Decisions and Investor Preferences

  20. Theory • Pre-approval process makes ACs directly accountable to 3rd parties for auditor independence and audit quality • Disclosures (of pre-approvals and audit fees) make ACs publicly accountable to investors for perceived independence • Anecdotal reports suggest ACs are avoiding allowable NAS

  21. Predictions / Research Hypotheses • ACs will be • more likely to recommend joint provision when the NAS improves audit quality (AQ) • less likely to recommend joint provision when public disclosures are required • The disclosure effect holds even when ACs believe joint provision improves AQ

  22. Predictive Validity Framework [Diagram: Libby boxes applied to the example. Conceptual independent variable: accountability (the NAS/AQ relation & required public disclosures); conceptual dependent variable: ACs’ pre-approval decisions. Operational independent variable: type of NAS and type of company; operational dependent variable: joint provision recommendation. Vs and Zs: experience with NAS approval; beliefs about effects of NAS on auditor independence and synergies with the audit; audit experience, etc.]

  23. Operational Independent Variables • Type of Service: Effect of NAS on Audit Quality • Risk Management Services • Joint provision improves audit quality • Human Resource Services • Joint provision has no effect on audit quality

  24. Measured Independent Variable • Measured Independent Variable: • Belief about NAS and audit quality relation • See Libby et al. (AOS, 2002) for when this approach is justified and implications for interpretation

  25. Operational Independent Variables • Type of Company: Disclosure Requirement • Publicly-traded company • Company is required to make public disclosures • Privately-held company • Company is not required to make public disclosures

  26. Operational Dependent Variable and Controls • Measured Dependent Variable: • Joint provision recommendation • Reasons for and against firm selection • Manipulation Checks / Control Variables: • Audit quality manipulation check • NAS quality by provider • Effect of joint provision on auditor objectivity • Demographic information
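A hedged sketch of analyzing the audit-quality manipulation check; the 7-point scale, the cell sizes, and all responses are hypothetical:

```python
# Manipulation check: did participants in the "joint provision improves AQ"
# condition actually rate the audit-quality effect higher?
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
check_improves_aq = rng.integers(4, 8, size=25)   # ratings, risk-management cell
check_no_effect = rng.integers(1, 5, size=25)     # ratings, human-resource cell

t, p = stats.ttest_ind(check_improves_aq, check_no_effect)
print(f"t = {t:.2f}, p = {p:.3g}")  # a reliable difference suggests the manipulation took
```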

  27. Participants & Other Choices • Participants were Corporate Directors attending a KPMG Audit Committee Institute Roundtable • No monetary incentives • Between-participants design

  28. Some Lessons Learned • Work on projects that really interest you and for which you have a comparative advantage! • Good experimental papers require a lot of up-front time and effort. This pays off! • Always: • Write Kinney 3 paragraphs / have others review! • Prepare Libby boxes • Pilot test (as many times as necessary) • Share with colleagues often throughout the process
