
Presentation Transcript


  1. Six Sigma Project: IRS-Library. Project Title: Maximization of Electronic Databases Usage. Team Leader: Sharon E. Henry / Lei P. Correo. Members: Daisy Boro, Dorothy Geronimo, Gina San Buenaventura

  2. PROJECT DESCRIPTION DEFINE • PROBLEM STATEMENT • Recent developments in IT and users’ demand for electronic sources of information have resulted in an increased number of available online resources. This development has changed the information preferences of ADB staff and has triggered the growth of the Library’s subscriptions to electronic databases. • The Library is faced with the challenge of ensuring that these resources are being utilized, backed by accurate usage statistics, in order to justify their renewal. • Three to four people are handling the monitoring, and we wish to have one focal person consolidate the statistics reports. • The current monitoring method is simply to add the figures given by the suppliers, and we need to determine whether these are the data we should be monitoring. • The terms used differ from one supplier to another, so we need to standardize them to be consistent and clear on meanings.

  3. DEFINE • GOAL STATEMENT • The Library aims to: • Identify e-resources that are highly used and least used • Ensure that we are effectively managing these resources • Standardize the terms used in the measurement and monitoring of these resources • Establish standards for recording usage statistics so that electronic resources can be compared • Determine which databases should be renewed or cancelled.

  4. DEFINE BUSINESS CASE The Library exists to serve the information needs of ADB and its staff. A number of online subscriptions have been added to its collection to give clients easy and fast access to the vital information they need in their day-to-day work. These resources are expensive, so the Library needs to be careful in choosing what to add to the collection, what to remove, and what to maintain. New resources become available every year, and the Library needs to keep up with its clients’ increasing demand for more options while balancing that demand against the Library’s available resources.

  5. High Level Process Map DEFINE

  6. DEFINE Detailed Process Map

  7. DEFINE High-level process flow: Acquire databases → Promote databases → Make databases available in IRS website & Library catalog → Monitor usage

  8. DEFINE Stratification Matrix (output: document access) • Does type of access matter? On-site / Password / IP • Will alerts help? With or without alert • Does location matter? HQ / RM / RO • Does type of user make a difference? Staff / Consultants / Dependents / Researchers • Does type of eResource matter? General Reference / Aggregators / Single Publisher / Specialized Subject • Is cost a factor? Single / Password / Simultaneous / Enterprise-wide

  9. MEASURE

  10. Legend: Green = new; Red = not current

  11. MEASURE Data Collection Form (legend: No statistics; Red = below $1,000)

  12. MEASURE 14 titles, amounting to $298,373.68 or 80.65% of the budget, have data; the 7 titles that constitute 80% of our budget have been monitored.

  13. MEASURE The Pareto chart shows that around 80% of our budget was spent on 6 of our electronic subscriptions.
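
As an illustration of the calculation behind a Pareto chart, the short Python sketch below sorts titles by cost and prints each title's cumulative share of the total budget. The subscription names and amounts are made up for illustration, not the Library's actual figures.

    # Pareto analysis sketch -- illustrative figures only, not the Library's data.
    costs = {
        "Database A": 95000.00,
        "Database B": 72000.00,
        "Database C": 48000.00,
        "Database D": 35000.00,
        "Database E": 28000.00,
        "Database F": 20000.00,
        "Database G": 7000.00,
    }

    total = sum(costs.values())
    running = 0.0
    # Sort from most to least expensive, then accumulate each title's share of the budget.
    for title, cost in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
        running += cost
        print(f"{title:12s} {cost:>10,.2f}   cumulative {running / total:6.1%}")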

  14. Data Collection MEASURE *Data for 2004 were included in the EIU report. **Data are based on the number of users who signed the log sheet. Note: Reported figures do not match the totals of the available data on record, which means there have been some inconsistencies / errors in reporting.

  15. MEASURE Critical to Quality

  16. MEASURE Data for Project Y

  17. MEASURE Total cost of current eResource subscriptions = $369,948.17. 12 titles have not been monitored because usage report statistics are unavailable. QUICK WIN 1: Assign somebody to coordinate closely with the suppliers to provide reports, or coordinate with OIST to see if they can help; this eliminated duplication of work. QUICK WIN 2: Define what data we can do without and what data we must have; now we can compare outputs for similar services and determine which will best address our clients’ needs.

  18. ANALYZE QUICK WIN 3: We have now identified a focal person to ensure that we have the figures on record and that data are regularly provided by the supplier, where such reporting is already established. Further investigation showed that, since some data represent the number of users (Bloomberg & CEIC), we have not really been monitoring usage data. We also discovered that, since the terminal is not always visible to the Reference staff on duty, we cannot guarantee that the figures (number of users) are correct. On top of that, even Librarians who use the database do not consistently sign the log sheet when in a hurry or attending to multiple clients. *Based on the available data, we cannot determine what was measured: users, usage, or access. *Since we have not previously defined our data-gathering policies / procedures, how can we say we have the right kind of data? *If we have difficulty with data gathering and are not sure of the reliability of the available data, can we aim to MAXIMIZE usage, or should we concentrate on our data-gathering capability and establish the reliability of the data first?

  19. ANALYZE Prioritized list of X’s (cause-and-effect diagram on usage statistics; branches: Users, Environment / Others, Processes, Supplier / Services; factors include awareness, lack of training, orientation / briefing, various access points, Internet connection, internal or external, videoconference, desktop access, user friendliness, reliability / credibility, alerts / current awareness, various features, reference services, statistics availability, and fast server). Conclusion: establish consistency and reliability of data first.

  20. Data Collection Form ANALYZE FACTIVA REPORT This is just the data for the “Free Text Search” category; similar data are given for “News Page”, “Other”, “Track”, “Track Delivery”, and “Quotes”. The total figures for number of items constitute the Factiva reported statistics.

  21. Data Collection Form ANALYZE OXWEB REPORT Bloomberg – data are based on the number of names on the log sheet. Emerald – statistics generated online are classified according to TOCs viewed, abstracts viewed, full-text downloads, full-text turn-aways, and user sessions.

  22. IMPROVE Proposed Solution Based on the presentation of statistics provided by the suppliers, it appears that the available data differ from one supplier to another. We needed to identify similarities and establish standards. A data collection template was designed to capture the available data, whenever possible:
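
The template itself appears only as an image in the slides. Purely as an illustration of the idea, a standardized monthly usage record could look something like the Python sketch below; the field names and file name are assumptions, not the Library's actual template.

    # Illustrative sketch of a standardized usage record -- field names are
    # assumptions for illustration, not the Library's actual template.
    import csv

    FIELDS = [
        "database",              # e.g. Factiva, EIU, Bloomberg, Emerald
        "supplier",
        "month",                 # e.g. 2005-01
        "user_sessions",         # sessions reported by the supplier
        "searches",
        "full_text_downloads",
        "turn_aways",            # requests denied because of licence limits
        "source",                # "supplier report" or "library log sheet"
    ]

    with open("usage_statistics.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerow({
            "database": "Emerald", "supplier": "Emerald", "month": "2005-01",
            "user_sessions": 42, "searches": 130, "full_text_downloads": 57,
            "turn_aways": 0, "source": "supplier report",
        })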

  23. IMPROVE With the newly designed template, we tried to gather data starting January 2005. Most suppliers were able to give us the data we needed, but not all, and we needed to make clarifications first to ensure we were referring to the same type of access. The next slide shows a sample data record using the template created.

  24. IMPROVE Piloted Solution OXWEB EIU Points to consider: *If some data are missing for most of the databases, are they still worth monitoring? *Are these the data we want / need? *Can most suppliers provide us with the data, or can we monitor the data ourselves?

  25. Modified Template IMPROVE

  26. IMPROVE

  27. IMPROVE ANALYZE

  28. IMPROVE

  29. IMPROVE

  30. IMPROVE • With the usage statistics defined and established, the next step is to determine what we need to do to ensure that future transactions / subscriptions can be managed / monitored effectively. • Points to consider: • What can we control? • What is essential / important? • Who should be responsible?

  31. SOLUTIONS MATRIX IMPROVE

  32. CONTROL Detailed Process Map with critical points (annotation: “Not needed if monitoring is effective”)

  33. CONTROL Improved Process Map: Subscribe to service → Request usage report from supplier → Record / link in OPAC / website → Announce to concerned dept / office → Statistics available? (Y: incorporate in the monitoring sheet; N: identify other means for monitoring) → Demo? (Y: conduct demo / tutorial) → Monitor use / usage statistics

  34. CONTROL Improved Process Map (monitoring usage statistics): Statistics available from supplier? (Yes: determine what type of data is available and set the LSL; No: monitor by the Library in coordination with users, determine reasonable use, and set the LSL) → Above LSL? (Yes: renew; No: cancel)
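
The renew-or-cancel branch of this process map can be read as a simple decision rule. The Python sketch below is one way to express it; the function name, parameter names, and example numbers are illustrative assumptions, not part of the original process documentation.

    # Illustrative translation of the renew / cancel decision in the improved
    # process map; names and figures are assumptions.
    def review_subscription(average_usage, lsl, supplier_reports_statistics):
        if not supplier_reports_statistics:
            # No supplier report: the Library monitors usage itself, in
            # coordination with users, before setting a reasonable-use LSL.
            print("Monitor in-house and in coordination with users; then set LSL.")
        if average_usage >= lsl:
            return "renew"
        return "cancel"

    # Example with made-up figures.
    print(review_subscription(average_usage=180, lsl=132, supplier_reports_statistics=True))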

  35. CONTROL *Although the process appears longer, with the new steps in place we now: 1. have better control over our subscriptions 2. know what suppliers can or cannot provide 3. are aware of critical areas in our process 4. have concrete evidence to show the usefulness / relevance of our subscriptions 5. are more focused on what we need in order to effectively evaluate our resources

  36. CONTROL LESSONS LEARNED / BEST PRACTICES: 1. standardization of necessary data is important 2. not all e-resources are the same and not all suppliers can provide requested data 3. work with clients from the start so they are aware of Library limitations and requirements 4. ensure that Library staff are aware of the different resources available 5. orient staff on the importance of statistics as solid support to different activities

  37. CONTROL BENEFITS: 1. We were able to save about a quarter of an employee’s time after identifying a focal person to monitor usage statistics 2. We have standardized our statistics, so we now know what we are monitoring and what to ask from suppliers 3. We can determine the return on investment for our subscriptions 4. Once the project is completed we will be able to determine useful resources, understand clients’ needs better, and provide better services 5. We were able to establish the process / procedure for determining which resources should be renewed or cancelled

  38. CONTROL Cause-and-effect diagram revisited (same branches and factors as slide 19: Users, Environment / Others, Processes, Supplier / Services).

  39. Revised CTQ CONTROL Since the data gathered in 2004 were not based on an identified type of access, we need to revise the Lower Specification Limits (LSL) for the resources based on relevant and reliable figures gathered in 2005. We reviewed the available figures from January to May 2005 and decided to use the lowest monthly figure plus a 10% increase as our LSL.
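
A small worked example of this rule, with made-up monthly figures (not any actual database's data): take the lowest reliable monthly value, add 10% to get the LSL, and later express an observed average as a percentage above that limit, as in the usage charts that follow.

    # LSL rule described above, with illustrative figures only.
    monthly_usage_jan_to_may = [120, 150, 135, 128, 160]

    lsl = min(monthly_usage_jan_to_may) * 1.10      # lowest figure plus 10%
    print(f"Lower specification limit: {lsl:.1f}")  # 132.0

    # Comparing a later observed average against the limit.
    average = 180
    pct_above = (average - lsl) / lsl * 100
    print(f"Average {average} is {pct_above:.0f}% above the set limit.")  # 36%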

  40. Revised CTQ CONTROL

  41. Usage Chart CONTROL Usage Data for Jan-Sep 2005

  42. Usage Chart CONTROL Bloomberg’s average for the 9 months is 15, which is 44% more than the set limit.

  43. Usage Chart CONTROL The usage average is 781, which is 56% more than the set limit. The data show that usage is lower during the first quarter of the year. We may consider monitoring this next year to determine whether this is a trend.

  44. Usage Chart CONTROL The average is 14,885, which is about 77% higher than the set lower limit. Although there were 2 instances of usage below the limit, we noted that these occurred when EIU was having technical problems and we could not access the site for a few days.

  45. Usage Chart CONTROL The average for Jan-Sep is 93, which is 33% above the set limit.

  46. Usage Chart CONTROL The average usage for the past 9 months is 14,236, which is 91% higher than the limit.

  47. Usage Chart CONTROL The average (Jan-Sep) is 843, which is 58% above the set limit.

  48. Closing CONTROL Maximization of usage to decrease cost per use and increase ROI: 1. Close monitoring of some resources is necessary to determine whether performance tends to fall below the specified limit and to see whether there is a trend in use for certain months / quarters. 2. Detailed analysis is recommended for some resources, such as EIU, which has various sub-parts. 3. The process should be replicated for all other resources to ensure that the Library maintains only the highly used resources and that we get good value for our money.
