
The Use & Abuse of Usage Measures





Presentation Transcript


  1. The Use & Abuse of Usage Measures Ian Bannerman Managing Director, Journals

  2. The Use & Abuse of Usage Measures • COUNTER and the Usage Factor • Consistency, credibility, compatibility • Usage as an indicator of quality • “The Observer Effect” • Recommendations

  3. COUNTER and the Usage Factor • Launched in March 2002, COUNTER is an international initiative serving librarians, publishers and intermediaries by setting standards that facilitate the recording and reporting of online usage statistics in a consistent, credible and compatible way

  4. Usage Factor: • J. Bollen & H. Van de Sompel (2006) Usage Impact Factor: the effects of sample characteristics on usage-based impact metrics. arXiv:cs.DL/0610154 v2 • Oct 06 - COUNTER director Peter Shepherd interviewed 7 authors, 9 librarians, 13 publishers • March 07 – web survey of 155 librarians, 1,400 authors • June 07 – UKSG report published www.uksg.org • April 08 – Invitation to tender “to investigate and test the feasibility of developing a new metric, the Journal Usage Factor, based on COUNTER usage data”

  5. Usage Factor: • Thomson Scientific Impact Factor: (Total cites in 2007 for items published during 2005/6) ÷ (Total items published during 2005/6) • Usage Factor: (Total usage over period “x” of items published during period “y”) ÷ (Total items published online during period “y”)
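The two ratios on this slide can be sketched directly. This is a minimal illustration of the arithmetic only; the function names and the input figures are hypothetical, not taken from any real journal or COUNTER report.

```python
def impact_factor(cites_2007_to_2005_6_items, items_published_2005_6):
    """Thomson Scientific Impact Factor: total cites in 2007 for items
    published during 2005/6, over total items published during 2005/6."""
    return cites_2007_to_2005_6_items / items_published_2005_6

def usage_factor(total_usage_period_x, items_published_online_period_y):
    """Usage Factor: total usage over period "x" of items published during
    period "y", over total items published online during period "y"."""
    return total_usage_period_x / items_published_online_period_y

# Illustrative figures only: 1,200 cites to 400 items gives IF = 3.0;
# 50,000 downloads of those 400 items gives UF = 125.0.
print(impact_factor(1200, 400))
print(usage_factor(50000, 400))
```

Note that the two metrics share a denominator shape but not a numerator: one counts citations, the other counts download events, which is exactly why (as later slides argue) they need not move together.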

  6. Lies, damned lies and [usage] statistics • Implicit Assumptions: • That usage data is consistent, credible and compatible • That the Usage Factor would be a meaningful indicator of something (Utility? Readership? Quality? Value?)

  7. Consistent, credible, compatible • COUNTER: guidelines on filtering for robots and “pre-fetching” are in draft (release 3) • “By the end of 2007, bepress predicts that, without filtering, one out of every two logged downloads from academic sites will be made by machine or mistake” bepress press release • “I’ve successfully downloaded my own article thousands of times without setting off any alarms” – Phil Davis, Cornell (Lib-license) • Most known robots won’t get past access control on subscribed content, but it’s the unknown ones that distort the numbers
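The filtering this slide refers to can be illustrated with a toy log filter: drop requests from known robots by user-agent, and collapse repeated clicks on the same item within a short window. The log format, robot list and 30-second threshold here are assumptions for the sketch, not the COUNTER specification; and, as the slide notes, a user-agent list by construction only catches known robots.

```python
from datetime import datetime, timedelta

KNOWN_ROBOTS = ("googlebot", "slurp", "bingbot")   # illustrative list only
DOUBLE_CLICK_WINDOW = timedelta(seconds=30)        # assumed threshold

def filter_usage(events):
    """events: iterable of (timestamp, user_id, item_id, user_agent) tuples,
    sorted by timestamp. Returns only the events that would be counted."""
    counted = []
    last_click = {}  # (user_id, item_id) -> timestamp of last counted click
    for ts, user, item, agent in events:
        if any(bot in agent.lower() for bot in KNOWN_ROBOTS):
            continue  # known robot: never counted
        key = (user, item)
        prev = last_click.get(key)
        if prev is not None and ts - prev < DOUBLE_CLICK_WINDOW:
            continue  # double-click: the pair counts once
        last_click[key] = ts
        counted.append((ts, user, item, agent))
    return counted

events = [
    (datetime(2007, 4, 25, 10, 0, 0), "u1", "a1", "Mozilla/5.0"),
    (datetime(2007, 4, 25, 10, 0, 5), "u1", "a1", "Mozilla/5.0"),  # repeat
    (datetime(2007, 4, 25, 10, 1, 0), "u2", "a1", "Googlebot/2.1"),
]
```

Running `filter_usage(events)` keeps only the first event: the second is a double-click and the third a known robot. An unknown robot with a browser-like user-agent would sail straight through, which is the distortion the slide warns about.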

  8. Consistent, credible, compatible Downloaded 6,372 times on 25/4/07 by a Russian institution

  9. Consistent, credible, compatible Every article from vol 5-7 downloaded ~58 times by a Korean institution

  10. Consistent, credible, compatible Downloaded 1,183 times
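Anomalies like the ones on the three slides above (thousands of downloads of one article by one institution in a day) can often be caught with a crude screen: flag any per-institution daily count far above the typical figure. The data shape and the ×50 threshold are illustrative assumptions, not a COUNTER rule.

```python
from statistics import median

def flag_anomalies(daily_counts, factor=50):
    """daily_counts: dict mapping (institution, article, date) -> downloads.
    Returns the keys whose count exceeds `factor` times the median count."""
    typical = median(daily_counts.values())
    return [key for key, n in daily_counts.items() if n > factor * typical]

counts = {
    ("inst-a", "article-1", "2007-04-25"): 6372,  # the outlier
    ("inst-b", "article-2", "2007-04-25"): 3,
    ("inst-c", "article-3", "2007-04-25"): 5,
    ("inst-d", "article-4", "2007-04-25"): 4,
}
```

Here the median daily count is 4.5, so only the 6,372-download entry is flagged. A screen this simple says nothing about intent; it only tells you which figures deserve a closer look before they feed into any usage metric.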

  11. A meaningful indicator? • Davis & Price (2006) eJournal interface can influence usage statistics: implications for libraries, publishers, and Project COUNTER. JASIST v57 n9,1243-1248

  12. A meaningful indicator? • Reading or just viewing: the “power browsing” form of information seeking: skimming and bouncing along the information surface. CIBER • “In particular, we found a general negative correlation between the CSU IF [California State Usage Impact Factor] and the ISI IF, which indicates usage over the entire CSU community is inversely related to general citation impact” Usage Impact Factor: the effects of sample characteristics on usage-based impact metrics, Johan Bollen and Herbert Van de Sompel, October 2006 arXiv:cs.DL/0610154v2

  13. Lies, damned lies and [usage] statistics • “Any economy that is based on non-transparency, blind-trust, and few (if any) consequences for unethical behaviour is wide open to abuse. When cancellation decisions ride solely on measures for article downloads, librarians should approach these figures with a healthy degree of caution.” – Phil Davis, Cornell (Lib-license)

  14. Lies, damned lies and [usage] statistics • Google's share price fell ~4% on 26th Feb, 2008, following the release of comScore data that showed a 7% decline in clicks • “…the evidence suggests that the softness in Google’s paid click metrics is primarily a result of Google’s own quality initiatives that result in a reduction in the number of paid listings.”

  15. Observer Effect • Refers to changes that the act of observing will make on the phenomenon being observed • “In 1955 it did not occur to me that “impact” would become so controversial” – Eugene Garfield, ISI

  16. Potential Observer Effect: Impact Factor • Self-citing the journal in other articles and editorials • Alerting authors to content they “should” cite • Seeking out prolific, high quality authors – who will self-cite • Building editorial boards that will attract citations • Publishing themed issues with prestigious Guest Editors • Publishing the most citable papers early in the year • Keeping review times short so citations don’t miss the 2-year impact factor window • Targeting topical areas rather than long-term studies • Publishing review articles • Publishing news, letters, obituaries, book reviews, editorials (that get cited without counting as citable articles)

  17. Potential Observer Effect: Impact Factor • “The Number That’s Devouring Science.” Richard Monastersky, Chronicle of Higher Education, October 2005 • “Academia is being held to ransom by these citations” - Alan Nevill, Biostatistics, Univ. Wolverhampton • “It is easy to catch attention when one describes a previously unknown gene or protein… follow-up studies, to uncover the true functions of the molecules or sometimes to challenge the initial analysis, are typically more difficult to publish” - Yu-Li Wang, Physiology, Univ. Massachusetts • “The impact factor may be a pox on the land because of the abuse of that number” - Robert H Austin, Physics, Princeton

  18. Potential Observer Effect: Usage Factor • Getting your friends, your dog and your mother to download articles… or writing a bot to do it for them • Leaving usage data unfiltered… or worse • Publishing for students, not for researchers • Sexing-up titles and key-words • Putting the HTML in the way of the PDF • Using the abstract to tease, not to inform • Stopping printed journals • Encouraging online coursepacks, discouraging printed ones • Blogging it, tagging it, posting it • Broadcasting metadata but keeping articles where they are counted – not in OA repositories!

  19. Potential Observer Effect • Impact Factor: Not all attempts to improve impact factor are “bad”; they leave an audit trail in the literature; the act of citing is usually meaningful; citation requires significant investment of time, effort and reputation • Usage Factor: Would attempts to improve usage factor be “bad”? They are largely untraceable; the act of downloading is often meaningless; downloading requires little investment and is practically anonymous

  20. The Use and Abuse of Usage Measures Recommendations: • Extreme caution in (over) interpreting usage data • Consider whether the available data can address the question • Compare like with like • Further research into the factors that influence article downloads • Improved guidelines on detection, blocking and filtering of robots and other aberrant use • Careful attention to the Observer Effect when developing usage-based metrics

  21. Ian Bannerman Managing Director, Journals ian.bannerman@informa.com
