LIR Annual Seminar 2006
Needles in Haystacks: Metasearching and Management of Miscellaneous Material
Ronan Kennedy; NUI, Galway
• Quick introduction
• NUI, Galway experience
• MetaLib in action
• Metasearch workflow
• Google Scholar
• Information Literacy
• Points to consider
Introduction
• Also known as: multisearching, cross-searching, broadcast searching, integrated searching
• Metasearching is a process in which a user submits a query to numerous information sources simultaneously. Each of these resources has its own search engine; the metasearch system transmits the query to each search engine and directs it to perform the actual search. [Tamer Sadeh, Ex Libris]
• A metasearch system 'piggy-backs' on several other database search engines to search them at the same time. The results are merged and displayed as a single list.
Introduction
• Z39.50 protocol
• Approved by NISO in 1988
• "Z39.50 makes it possible for a user in one system to search and retrieve information from other computer systems (that have also implemented Z39.50) without knowing the search syntax that is used by those other systems" http://www.loc.gov/z3950
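To make the protocol less abstract, here is a minimal sketch of a Z39.50 search from Python. It assumes the PyZ3950 library and its ZOOM-style interface; the host, port, database name and query below are illustrative placeholders (the Library of Congress public server), not NUI, Galway's actual configuration.

```python
# Minimal sketch of a Z39.50 search, assuming the PyZ3950 library's
# ZOOM-style interface. Host, port, database and query are placeholders.
from PyZ3950 import zoom

# Connect to a Z39.50 target (here, the Library of Congress public server).
conn = zoom.Connection('z3950.loc.gov', 7090)
conn.databaseName = 'VOYAGER'
conn.preferredRecordSyntax = 'USMARC'

# The same query can be sent to any compliant target; the target's own
# search engine performs the actual search, as the LoC definition describes.
query = zoom.Query('CCL', 'ti="information literacy"')
results = conn.search(query)

print(len(results), 'hits')
for i in range(min(5, len(results))):
    print(results[i])   # raw MARC records; a real client would parse these

conn.close()
```

The point the sketch illustrates is the one quoted above: the searcher never sees the remote system's native search syntax, only the Z39.50 query they send to it.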
The NUI, Galway Experience
• MetaLib from Ex Libris
• Used in conjunction with the SFX link resolver
• Went 'quietly' live in February 2005
• Subject groupings to help usability
• 45 out of a possible 137 resources are cross-searchable
Behind the scenes - metasearch
• The user's search syntax is translated and the Z39.50 protocol is applied
• The query is sent in parallel to each resource (Resources A–E), each through its own interface (Interfaces A–E)
• Results are gathered by the metasearch engine and displayed as a single grouped list
[Slide diagram: one query fanning out to Interfaces/Resources A–E, with results flowing back to the metasearch engine]
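The diagram reduces to a fan-out/gather loop. Here is a small, purely illustrative sketch of that loop; the target list and the `translate_query`/`search_target` helpers are hypothetical stand-ins, not MetaLib internals.

```python
# Illustrative fan-out/gather loop behind a metasearch: translate the query
# for each target, search all targets concurrently, then merge the results.
# TARGETS, translate_query and search_target are hypothetical stand-ins.
from concurrent.futures import ThreadPoolExecutor

TARGETS = ['Resource A', 'Resource B', 'Resource C', 'Resource D', 'Resource E']

def translate_query(query, target):
    """Rewrite the user's query into the syntax this target expects."""
    return query  # a real system maps fields, operators and character sets

def search_target(query, target):
    """Send the translated query to one target (e.g. over Z39.50) and return hits."""
    translated = translate_query(query, target)
    # ... the network call to the target's own search engine would go here ...
    return [{'source': target, 'title': f'"{translated}" result from {target}'}]

def metasearch(query):
    with ThreadPoolExecutor(max_workers=len(TARGETS)) as pool:
        result_lists = pool.map(lambda t: search_target(query, t), TARGETS)
    # Merge everything into one list; a real engine would also group,
    # de-duplicate and rank before display.
    return [hit for hits in result_lists for hit in hits]

if __name__ == '__main__':
    for hit in metasearch('information literacy'):
        print(hit['source'], '-', hit['title'])
```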
With a metasearch engine & a link resolver…
• Users don't have to know where to go to begin a search
• Users don't have to know which resources to search – subject categories guide them
• Users can save a great deal of time searching
• Users can compare results from several resources simultaneously
• Users don't have to know where to go to find the full text of an article
• The link resolver saves time by 'building a bridge' to the articles (see the sketch below)
• In short, detailed knowledge of individual Library resources is no longer a prerequisite
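How the 'bridge' is built: link resolvers such as SFX work by receiving citation metadata as an OpenURL and redirecting the user to a copy the library actually licenses. A minimal sketch follows; the resolver base URL is a placeholder, not NUI, Galway's real SFX address, and the citation is just an example.

```python
# How a link resolver 'builds a bridge': the source database passes citation
# metadata to the resolver as an OpenURL; the resolver checks the library's
# holdings and redirects the user to an appropriate full-text copy.
from urllib.parse import urlencode

RESOLVER_BASE = 'https://sfx.example.edu/sfx_local'   # placeholder base URL

def build_openurl(metadata):
    """Build a simple OpenURL 0.1-style query string from citation metadata."""
    return RESOLVER_BASE + '?' + urlencode(metadata)

citation = {
    'genre': 'article',
    'issn': '0028-0836',       # journal identifier
    'date': '2005',
    'volume': '438',
    'spage': '900',
    'atitle': 'Internet encyclopaedias go head to head',
}

print(build_openurl(citation))
# The resolver looks up the library's subscriptions for this ISSN and year
# and sends the user on to the full text, wherever it is licensed.
```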
Stumbling Blocks…
• Google Scholar
• Information Literacy
Google Scholar
• With a unique ranking process, unparalleled hardware resources, sophisticated crawling techniques, and access to published materials, Google Scholar is being positioned as an essential resource [Tamer Sadeh, Ex Libris]
• Google Scholar is already proving that a harvested, centralized search approach is more useful [the digital librarian blog]
• If Google Scholar becomes a better provider of scholarly articles and information than a typical university library, then we're going to struggle to justify not only our budgets but our role in the academic process [the digital librarian blog]
Google Scholar is good because…
• it is easy to use
• it has a branded look and feel
• it has an excellent search algorithm
• it is extremely fast
• there are no authentication issues
• it covers a broad range of resources
… but keep in mind …
• At what level does it make most sense for resources to be aggregated for more effective use? Think of two poles – the fractured response available to a library user, and Google. [lorcan dempsey's weblog]
• Roy Tennant: it's a good principle to centralise when possible, as Google Scholar does – but that ignores the need to segregate when necessary too. To take WorldCat as an example, a user may only wish to see items they can actually get their hands on. Google has no plans to offer any criteria to segregate.
• A central argument against Google Scholar: finding good material is as much about what you don't search as what you do
… and it's bad because
• it does not disclose information about its content, or define 'scholarly'
• some big publishers, e.g. Elsevier, ACS and Emerald, are not yet included
• updates are not frequent enough
• material that Google Scholar incorporates from a publisher does not always provide complete coverage
• Google Scholar 'plays fast and loose' with hits (Péter Jacsó). (Google Scholar arranges results by relevance, taking into account the number of times an item has been cited in the scholarly literature – a toy illustration follows below.)
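Google does not publish its ranking formula, so the following is purely an illustrative toy, not Google Scholar's method: a score that blends keyword matching with a citation-count boost, to show what "taking citations into account" can mean in principle. All field names and weights are invented for the example.

```python
# Purely illustrative: a toy relevance score mixing keyword matches with a
# damped citation count. NOT Google Scholar's (undisclosed) algorithm.
import math

def toy_score(record, query_terms):
    text = (record['title'] + ' ' + record['abstract']).lower()
    term_hits = sum(text.count(term.lower()) for term in query_terms)
    citation_boost = math.log1p(record['citations'])   # damp very large counts
    return term_hits + 0.5 * citation_boost             # invented weighting

records = [
    {'title': 'Metasearching in academic libraries',
     'abstract': 'metasearch usability study', 'citations': 12},
    {'title': 'Federated search systems',
     'abstract': 'metasearch evaluation', 'citations': 350},
]

for r in sorted(records, key=lambda r: toy_score(r, ['metasearch']), reverse=True):
    print(round(toy_score(r, ['metasearch']), 2), r['title'])
```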
Relevance – what is defined as scholarly?
• A search for the planet Mercury can surface Freddie Mercury; a search for Venus can surface Venus Williams
Google Scholar vs. dedicated metasearch engines? No contest!
But who really cares?
• Roy Tennant: "Only librarians like to search; everyone else likes to find". Google Scholar has taught us, quite powerfully, that the user just wants a search box. Arguments as to whether or not this is 'best' for the user are moot – it doesn't matter if it's the best if nobody uses it ['Federated Searching: Put It in Its Place'; Miller, Todd; LibraryJournal.com; 15/04/2004]
• What students want: to save time, fewer choices, easy-to-use Google-like interfaces, good-enough results, full text ['Brick and Click Libraries Symposium'; Cox, Christopher; Library Hi Tech News; Vol 22, Issue 10]
Metasearching & Information Literacy
• To be able to search so many different resources, a metasearch engine has to sacrifice something: advanced searching (e.g. proximity operators) and value-added features (e.g. saved searches)
• Some claim that libraries should take advantage of metasearching to serve patrons content with a 'good enough' answer. 'Good enough' pretty much sums up metasearching quality, but should our schools and colleges be satisfied with mediocrity? Do we pride ourselves on producing good-enough library researchers and good-enough graduates? ['Do We Want or Need Metasearching?'; Frost, William; Library Journal; 01/04/2004]
Negative impacts of metasearching on Information Literacy
• The value-added features of a database are lost – alerts, saved searches, proximity operators etc.
• No more critical evaluation of databases – just click them all and hope for the best
• Users become less aware of our resources: "students do not know what resources we have, have little interest in learning about alternative titles … feel overwhelmed … lack the time and inclination to learn more" ['Metasearching in Boston College Libraries'; Tallent, Ed; New Library World; Vol 105, No 1196]
• What we have to work against: are we sacrificing learning skills for a convenient search interface? Does metasearching advocate Google-type thinking? Does it discourage advanced information literacy skills?
Issues for training
• Should libraries focus on training users to use native interfaces more effectively, or use portals to increase overall usage?
• More effective searching – fewer results, but of higher quality – means less risk of information saturation
• Can we separate users' needs – undergraduates, postgraduates, researchers? Metasearching will have a different impact on each group, with different training implications
• Is a separate strategy needed for non-compliant resources?
Information Literacy Standards
• ACRL Information Literacy Competency Standards for Higher Education (http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm)
• determine the extent of information needed
• access that information effectively and efficiently
• evaluate information and its sources critically
['Information Literacy for the Real World'; Terrell, John; Lifelong Learning Conference, CQU; 2004; http://lifelonglearning.cqu.edu.au]
Information Literacy Standards
• ACRL Standard 1 – determine the extent of information needed (amount and type)
• Grouping several database types together makes it difficult to distinguish between those types and increases the threat of information overload
• Results may highlight more material types than were originally anticipated – journal articles, standards, proceedings etc.
Information Literacy Standards
• ACRL Standard 2 – access that information effectively and efficiently
• Metasearch engines sacrifice the ability to do complex searching: users cannot do advanced searches to the same extent as on native interfaces
• Efficient access to the information itself is taken care of, and is aided by link resolvers
Information Literacy Standards
• ACRL Standard 3 – evaluate information and its sources critically
• Some native interfaces allow clustering of results
• Having results from multiple resources listed together does allow for critical evaluation of sources
Advantages of metasearch systems
• Can search multiple databases concurrently (time and ease of use)
• Related databases can be grouped by subject
• Only one interface – less learning required
• Automatic removal of duplicates (allegedly)
• A locally controlled and branded system that enables librarians to control levels of access
• For librarians: increased usage of expensive or obscure databases
Disadvantages of metasearch systems
• Compliance: only 45 out of 137 resources in NUI, Galway are cross-searchable
• Metasearching is costly – hardware, maintenance, training
• Information Literacy implications
• Problems with relevance ranking
• De-duplication technology is not perfect (see the sketch below)
• Slow
• Risk of too many results – evaluating results becomes more difficult
• No database-specific search features – applying limits etc.
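Why de-duplication is imperfect: vendors describe the same article with slightly different metadata, so simple matching rules miss some duplicates. The sketch below uses a deliberately naive normalised-title rule to show the failure mode; it is an illustration, not how any particular metasearch product works.

```python
# Naive de-duplication by normalised title, to show why 'automatic removal
# of duplicates' is only approximate: small metadata differences between
# vendors (punctuation, wording, subtitles) defeat simple matching.
import re

def normalise(title):
    """Lowercase and strip punctuation; deliberately simplistic."""
    return re.sub(r'[^a-z0-9 ]', '', title.lower()).strip()

def dedupe(records):
    seen, unique = set(), []
    for rec in records:
        key = normalise(rec['title'])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {'source': 'Resource A', 'title': 'Metasearching and information literacy'},
    {'source': 'Resource B', 'title': 'Metasearching and Information Literacy.'},  # caught
    {'source': 'Resource C', 'title': 'Metasearching & information literacy'},     # missed: '&' vs 'and'
]

print(len(dedupe(records)), 'records after de-duplication')  # prints 2, not 1
```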
What does the future hold?
• Technological developments: proximity operators, thesauri, alerts etc.
• More compliant vendors
• Improved speed
In conclusion - for the moment…
• Metasearching will not help experts find information with greater precision
• It's great for the average user – speed and simplicity
• For information experts it's an excellent starting point for evaluation and source identification
• The Librarian's role? Accommodate everyone's needs!
• … who knows what future metasearching developments will hold?