
Session One – Advancement Data: Metrics and Communication


Presentation Transcript


  1. Session One – Advancement Data: Metrics and Communication Your data may be complete and thoroughly clean; your metrics may be perfectly designed to measure what matters; yet without a thoughtful approach to getting that information into the right hands at the right time with the right context, your efforts may still fall short. This session will provide an organizational framework and proven techniques for success.

  2. Advancement Data: Metrics and Communication Lisette Clem ‘85 ‘92MBA Director of Advancement Services Bryant University Smithfield, RI

  3. Agenda • Define “internal constituents” • Why share information? • What advancement data/metrics do we share? • When do we share? • How is the information shared (in what format)?

  4. Internal Constituents • Outside of the Advancement division (be sure to include those folks with whom you share external constituents!): • Controller’s Office • President’s Office • Other organizational divisions to whom donations are being directed

  5. Internal Constituents • Within the Advancement division: • Alumni/Constituent Relations • Development • Marketing/Communications • Within Advancement Services: • aka “the Cool group” • aka “Team Awesome”

  6. TODAY’S FOCUS: Using the power of information sharing to engage and inform our divisional constituents (and get them to pay attention!)

  7. Our Goal: To CONSISTENTLY provide a Proactive vs. Reactive approach to information sharing and analysis, focusing on both detailed reporting (for Annual Fund managers) and high-level summaries (for VPs) to accommodate everyone’s needs* *In this case study: without the benefit of a Business Intelligence system or built-in dashboards (it’s coming!)…

  8. WHY should we proactively share information/data? (What does it matter?)

  9. Why? • Communicating and analyzing the data’s usefulness justifies renewed investments in technology • Fewer data requests to our Report Writing staff; enables multi-tasking • Enhanced stewardship (for soft credit gifts) • More effective prospect management (air of friendly competition at monthly PM meetings) • Less chaos prior to Trustees’ meetings (!)

  10. WHAT do we share? • General Development Performance Reporting: • Gift and Pledge Processing (daily transaction reports) • Fiscal Year Status • Campaign Reporting • Prospect Management • Annual Giving • Fiscal Year performance and Trend data • Alumni (Constituent) Participation Rate • Data Mining model performance • Definition/Results • Strategy Recommendations • Event Management • Budget Reporting • Expenses/Revenue • Return on Investment • Alumni Engagement Tracking

  11. Campaign Reporting

  12. Prospect Management

  13. Prospect Management: Key areas of reporting • Prospect Pool • Visit Reports • Tickler Reports (to support Moves Management) • Results Reports • Pledges and Gifts • Planned and Pending Solicitations • Ongoing Program Management

  14. Visit Report: Visits in FY14 vs. FY13 (as of X date)

  15. Summary Results Report – Part I

  16. Summary Results Report – Part II

  17. Ongoing Program Management: The “RED” Report

  18. Annual Giving: Fiscal Year performance and Trend data

  19. Fiscal Year Annual Giving

  20. Fiscal Year Annual Giving

  21. Fiscal Year Annual Giving: FY14 vs. FY15 New Pledges, Monthly Comparison (Excludes Planned Gifts), as of 5/31/15

  22. Constituent Participation Rate • The Politics of Participation • One size does NOT fit all • CASE, VSE, and US News each have standards for their comparisons • These standards may or may not be useful, or even accurate, for your purposes • When submitting data to them, make every attempt to understand their standards, to preserve the value of benchmarking with other institutions

  23. Alumni Participation Rate (at Bryant University) • In both our external and our internal alumni participation rates, senior giving donors are counted as alumni donors, as anyone who attends Bryant for 2+ semesters has the option of calling themselves an alumnus/ae. The entire senior class is included in the denominator. • Our “lost” alumni percentage is approximately 5.5%, down from approximately 10% four years ago. ALL NOT-LOST Alumni are considered “of record” per CASE standards, regardless of giving history. • In our internal donor count, all soft credit donors ARE included; however, approx. half of these are alum/alum spouses, which CAN be counted per CASE. (Counting the remainder of our soft credit donors accounts for approximately +0.4% of our internal participation rate.) • Our internal alumni participation rate excludes those on a “Do Not Solicit (DNS)” code (approx. 650 DNS alums of our 36,500 alumni of record). • Including ALL degree holders vs. only including UG degree holders decreases our alumni participation rate by just 0.2%.
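The counting rules above can be sketched as a small calculation. The alumni-of-record figure (36,500) and the DNS count (650) come from the slide; the donor counts below are hypothetical placeholders, not Bryant's actual numbers.

```python
# Minimal sketch of the internal alumni participation rate described above.
# 36,500 alumni of record and 650 DNS-coded alums are from the slide;
# the donor counts are hypothetical.

alumni_of_record = 36_500    # all not-lost alumni, per CASE standards
dns_alumni = 650             # "Do Not Solicit" alums, excluded from the internal rate

hard_credit_donors = 6_000   # hypothetical
soft_credit_donors = 300     # hypothetical; includes countable alum/alum-spouse credits

denominator = alumni_of_record - dns_alumni
numerator = hard_credit_donors + soft_credit_donors

participation_rate = numerator / denominator
print(f"Internal alumni participation rate: {participation_rate:.1%}")
```

Small definitional choices (soft credits, DNS exclusions, which degree holders count) move the rate by a few tenths of a percent, which is exactly why the slide stresses documenting the rules behind each published number.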

  24. Data Mining: Definitions/Results and Strategy Recommendations

  25. Current Models • Affinity Insight Acquisition Scoring Model: The scores from this model can be used to identify those never donors most likely to be responsive to annual fund appeals. • Predictive Affinity Retention Scoring Model: The scores from this model can be used to identify and retain those alums who have given in the past but have not made a gift in the current fiscal year. • Predictive Affinity Donor Scoring Model: The scores from this model can be used to identify alums most likely to make “leadership-level” annual gifts, as well as new major giving prospects.

  26. FY12 Results – Acquisition Model (Recall that each score band represents 5% of our alumni population – twenty equal bands – so by chance alone any four scores would account for 20% of results.)
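The 5%-per-score scheme above can be illustrated by ranking a population on a model probability and cutting it into 20 equal bands. The probabilities here are random stand-ins; in practice they would come from the acquisition, retention, or donor scoring models.

```python
# Illustrative sketch of assigning scores 1-20, each covering 5% of alumni.
# Model probabilities are random placeholders for this example.

import random

random.seed(42)
population = [{"id": i, "p_give": random.random()} for i in range(1000)]

# Rank by model probability, then assign scores 1 (lowest) through 20 (highest).
population.sort(key=lambda alum: alum["p_give"])
band_size = len(population) // 20
for rank, alum in enumerate(population):
    alum["score"] = min(rank // band_size + 1, 20)

# Any four bands hold 20% of the population; results above 20% show model lift.
top_four = [alum for alum in population if alum["score"] >= 17]
print(f"Scores 17-20 cover {len(top_four) / len(population):.0%} of the population")
```

This is why a band's response rate above its 5% population share (or above 20% for a four-band segment such as 15-20 or 17-20) is evidence the model is discriminating rather than guessing.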

  27. FY12 Results – Donor Scoring Model

  28. Data Mining Strategy: Recommendations • Acquisition Model: • “Acquisition Campaign” – New Donor Drive • Targeted DM to scores 15-20 – “flyer” format (not letter!) • Telefund focus on scores 15-20 • Targeted e-solicit messages to scores 15-20 • Retention Model: • “Retention Campaign” – Save a Donor • Intense Telefund and E-Solicit focus to scores 15-20 • Follow up with personal outreach to all those not yet renewed by 5/31/13 – Firm Goal is 100% renewal for scores 15-20 • Donor Scoring Model: • Leadership Giving Campaign – Raising Sights • Assign scores 17-20 to Bryant Fund Leadership Giving officer • Personal Outreach

  29. Data Mining: Coming Soon!! • Predictive Affinity Discovery Scoring Model: This model will focus on those alums who are most likely to accept a Discovery Visit invitation. The model will be built from alums who have been asked to accept a Discovery Visit. The focus of the model will be on alumni database variables that discriminate between those alums who accepted a Discovery Visit and those who didn’t. The result of the modeling process will be a score for each reachable alum in the database.

  30. Event Management

  31. Budget Reporting: Expenses/Revenue and Return on Investment (ROI)

  32. Cost per dollar raised: Revenue basis

  33. Benchmarking: ROI
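The two budget metrics named on the preceding slides reduce to a pair of ratios. The dollar figures below are hypothetical placeholders, not Bryant's actuals.

```python
# Hedged sketch of the cost-per-dollar-raised and ROI metrics named above.
# Both dollar amounts are hypothetical.

expenses = 1_200_000   # hypothetical annual advancement expenses
revenue = 9_600_000    # hypothetical funds raised in the same fiscal year

cost_per_dollar_raised = expenses / revenue   # revenue basis
roi = revenue / expenses                      # dollars raised per dollar spent

print(f"Cost per dollar raised: ${cost_per_dollar_raised:.2f}")
print(f"ROI: ${roi:.2f} raised per $1.00 spent")
```

Because the two ratios are reciprocals, either one supports the benchmarking comparison; the revenue basis is simply the form most peer surveys request.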

  34. Alumni Engagement Tracking

  35. WHEN do we share? “when it happens” Daily Weekly Monthly Quarterly Annually

  36. HOW do we share the information? Paper Email (Internal) Web site Regular Meeting Common drive Other? (Soon: Dashboard!)

  37. Other Useful Examples: • Advancement Services Information, Reference and Forms Guide • Advancement Services Information and Resources Web page (live quick links) • Advancement Services Annual Report

  38.  Fails  • TOO MANY prospect management reports – negative impact on perceived relevance • Don’t save copies of everything (we maxed out our server allotment!) • Per-person edits and customization will be requested – saying no is OK! (unless it’s the VP)

  39. Other suggestions/ideas, management tips, best practices -- what works for you to keep colleagues, constituents and staff informed?

  40. Thank you! Have a great day!  Lisette Clem, Bryant University lclem@bryant.edu 401-232-6805
