
WISER: Bibliometrics II The Black Art of Citation Rankings


Presentation Transcript


  1. WISER: Bibliometrics II: The Black Art of Citation Rankings. Angela Carritt, Juliet Ralph, November 2011. These slides are available on http://www.bodleian.ox.ac.uk/services/training/wiser/presentations

  2. Overview of Session: What are bibliometrics? Why bother? Problems. Bibliometric measures for… a researcher, an institution, a country, a journal, using Web of Science, Scopus, and other analytical tools.

  3. What are bibliometrics? Citation analysis

  4. Why bother? • Benchmarking of departments and research groups • Grant applications • Recruitment of individuals • Where to publish

  5. Bibliometrics & the REF: Citation data will not be as important as initially anticipated. Some panels will use citation data: Panel A – Life Sciences (1); some parts of Panel B – earth and environmental sciences, chemistry, physics and computer science (1); some parts of Panel C – some areas of geography, environmental studies and archaeology but not all, and economics/econometrics (1). Expert review will be the primary way of measuring impact even where citation data is used. The REF team will provide citation data (from Scopus). More information: HEFCE, REF 02.2011, Assessment framework and guidance on submissions (www.hefce.ac.uk/research/ref/pubs/2011/02_11/). (1) Information from Gibney, Elizabeth, ‘REF panels to differ on impact and citation use’, 29-07-2011, www.ResearchResearch.com, http://tinyurl.com/6bv69u2

  6. http://www.slideshare.net/guest633b30/bibliometrics-and-scientometrics-1065282

  7. The h-index: to quantify an individual’s research output. Aims to measure both productivity and impact. Your h-index is the largest number h such that h of your papers have each been cited at least h times, e.g. an h-index of 5 means you have 5 papers which have each been cited at least 5 times.
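As an illustration only (not part of the original slides), here is a minimal Python sketch of that definition; the citation counts are invented.

```python
def h_index(citation_counts):
    """Largest h such that h papers have been cited at least h times each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for nine papers
papers = [25, 8, 5, 3, 3, 2, 1, 0, 0]
print(h_index(papers))  # 3 -> at least 3 papers have 3+ citations, but fewer than 4 papers have 4+
```

Web of Science, Scopus and Publish or Perish all apply this same definition; they differ only in which citations they count, which is why the slides below show different h-index values for the same author.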

  8. Your h-index in Web of Science: Do an author search to get a list of their papers. Sort by Times Cited. Click on Create Citation Report to analyse the batch of papers.

  9. Calculation of h-index: analysing 208 articles in WoS. h-index = 53.

  10. 53 articles cited 53 times or more

  11. Citation tracking & analysis in SCOPUS • Scopus covers 18,000 journals & conference proceedings • Science, Medicine, Social Sciences & Humanities • Each record for a paper shows the number of times it has been cited in Scopus since 1996 • Similar analytical tools to Web of Science • www.scopus.com

  12. Your h-index in Scopus Do an author search. Tick author name. Click on View Citation Overview.

  13. h-index = 21. Why? Because it is based only on papers in Scopus published after 1995.

  14. Lower still if self-citations are excluded: h-index = 17.

  15. Your h-index and Google Scholar www.harzing.com

  16. Calculate it with Publish or Perish • Free download. • Uses citation data from Google Scholar.

  17. Bibliometrics for an institution • In Scopus, the Affiliation search allows you to search by institution name; ‘university of oxford’ retrieves the same results as ‘oxford university’.

  18. Overview of institution

  19. Publications by a department: Search the Address field. You can search by postcode or by the name of the department/college. NB some authors use their department postcode, others use the main University one (OX1 2JD).

  20. Research Committee recommendation: (1) All authors should use ‘Oxford University’ or ‘University of Oxford’ in their publication address, to ensure that the publication can be captured by citation databases. (2) Authors may also cite either their department or their college.

  21. Oxford & the REF

  22. Symplectic Elements • Oxford’s tool for gathering references and submitting them for the REF • Automatically searches databases such as Web of Science & Scopus • Is your department using it? • For more information go to www.admin.ox.ac.uk/pras/research/symplectic/ or contact symplectic@admin.ox.ac.uk

  23. Bibliometrics for journals • Bibliometrics can be used to measure the influence of academic journals • They can help you decide where to publish

  24. Bibliometrics & journals: Tools • ISI Journal Citation Reports (JCR) (ISI Web of Science) • SCImago Journal Ranking (SJR) and SNIP (Elsevier Scopus) • Eigenfactor (sponsored by the Bergstrom Lab in the Department of Biology at the University of Washington)

  25. Journal Citation Reports (JCR): Based on citation data from Web of Science. Covers more than 5,900 journals in science and technology and more than 1,700 journals in the social sciences.

  26. JCR on Web of Knowledge

  27. JCR on Web of Knowledge

  28. Immediacy Index - Measures how quickly articles are cited. Calculated as: no. of citations to articles published this year ÷ no. of articles published this year. JCR Impact Factor - The number of times the “average” article published in the previous 2 (or 5) years was cited this year. Calculated as: no. of citations this year to articles published in the last 2 (or 5) years ÷ no. of articles published in the same period. Cited Half-Life - How many years you have to go back to account for 50% of the citations to the journal, e.g. 50% of citations were to articles published in the last 3.5 years; the rest cited earlier articles.
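Written out as formulas (using 2011 purely as an illustrative JCR year, not a figure from the slides), the two ratios above are:

```latex
\[
\text{Impact Factor}_{2011} =
  \frac{\text{citations received in 2011 by items published in 2009--2010}}
       {\text{citable items published in 2009--2010}}
\]
\[
\text{Immediacy Index}_{2011} =
  \frac{\text{citations received in 2011 by items published in 2011}}
       {\text{items published in 2011}}
\]
```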

  29. Detailed view

  30. Detailed view continued

  31. Detailed view continued: Citations TO the journal by year of cited article (e.g. 333 of this year’s citations to Biological Reviews were to articles published in 2005).

  32. Detailed view continued: Citations FROM Biological Reviews (to other journals, including self-citations) by year of cited article, e.g. 334 citations from Biological Reviews were to articles published in 2007.

  33. Type of articles included

  34. Eigenfactor Metrics http://www.eigenfactor.org/ • Take into account the prestige of citing sources • Use “Google-style” algorithms • Attempt to measure how often the average researcher would encounter the journal • Two metrics: Eigenfactor, which increases with the size of the journal, and Article Influence, which takes into account the number of articles published and is more comparable to the JCR impact factor. (Image: Google’s PageRank, from http://en.wikipedia.org/wiki/PageRank)
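To make the “Google-style” idea concrete, the sketch below runs a basic PageRank-type iteration on a made-up journal-to-journal citation matrix. It is only a toy illustration of prestige-weighted citation counting: the real Eigenfactor calculation uses a five-year citation window, discards journal self-citations, and weights the teleport term by article counts, none of which is modelled here.

```python
import numpy as np

# Toy citation matrix: C[i, j] = citations from journal j to journal i.
# Journal names and counts are invented for illustration.
journals = ["Journal A", "Journal B", "Journal C"]
C = np.array([[0., 30., 10.],
              [20., 0., 40.],
              [5., 15., 0.]])

# Column-normalise so each citing journal hands out one unit of influence.
P = C / C.sum(axis=0)

# PageRank-style power iteration with a damping factor.
d = 0.85
n = len(journals)
v = np.full(n, 1.0 / n)
for _ in range(100):
    v = d * (P @ v) + (1 - d) / n

for name, score in zip(journals, v / v.sum()):
    print(f"{name}: {score:.3f}")
```

The point of the iteration is that a journal scores highly not simply by being cited often, but by being cited by journals that are themselves highly ranked, which is the property both Eigenfactor and SJR exploit.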

  35. Eigenfactor

  36. Eigenfactor

  37. Eigenfactor

  38. SJR – an alternative impact factor • http://www.scimagojr.com/ • SCImago Journal Rank, developed by Elsevier in partnership with Spanish academics • Citation data from Scopus • Covers 50% more journals than ISI Web of Science • Weights citations according to the status of the citing journal • Updated every 2 months • More info at http://www.info.sciverse.com/scopus/scopus-in-detail/tools/journalanalyzer/

  39. SCImago and the SJR

  40. SCImago Journal Ranking

  41. SCImago journal search • Orange line = average number of citations (i.e. the same measure as the JCR) • Purple line = SJR

  42. Comparing journals

  43. Measuring impact for a country

  44. SNIP – also by Elsevier • Source Normalized Impact per Paper • Weights citation counts according to the total number of citations in the subject area • SNIP aims to account for differences in citation potential and topicality across research fields • ‘Citation potential’ (citation frequency) is higher in the life sciences than in maths, engineering or the social sciences, and higher in basic science than in applied or clinical journals • Scopus is again the data source • http://www.journalmetrics.com/
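Roughly, and as a sketch of the idea rather than the exact published implementation, SNIP divides a journal’s citations per paper by the citation potential of its field:

```latex
\[
\text{SNIP} =
  \frac{\text{raw impact per paper (citations per paper received by the journal)}}
       {\text{relative citation potential of the journal's subject field}}
\]
```

So a journal in a low-citation field such as mathematics is not automatically penalised relative to one in the life sciences.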

  45. SNIP (and SJR)

  46. SJR: use the tabs to see different metrics, choose journals here, and use the slider to select different time periods.

  47. SJR, SNIP… • SJR – takes into account the prestige of the citing journal • SNIP – contextual impact; weights citations according to the total number of citations in the subject area • The comparison also shows the % of review articles, the total number of citations, the number of articles published, and the % of articles not cited at all.

  48. Favourably reviewed in Nature: SJR & SNIP are freely available (not dependent on a subscription to Scopus) and a worthy challenger to ISI.

  49. Journal Impact Factors: Problems. Use with caution… results are skewed by many factors: • Size • Frequency / time of publication • Type of content – review articles are more heavily cited than original research • Journals that are not indexed by WoS / Scopus are disadvantaged • Non-English-language journals are disadvantaged • Problems when journals change names • Results are not comparable across disciplines (and some journals are classified in the wrong discipline) • Journal impact factors should NEVER be used to assess the impact of individual researchers, groups, etc.
