The Evaluation of the Academic Library Web Sites Evaluation Checklist



  1. The Evaluation of the Academic Library Web Sites Evaluation Checklist Ms Sasipimol Prapinpongsakorn Srinakharinwirot University Bangkok, Thailand Email: sasipimol@swu.ac.th

  2. Education Background & Current Position

  3. Background of research problem In Thailand, although most libraries already have their own web sites, there are no standardized rules for evaluating them and no specifications for web site content or presentation (Saman Loyfar, 2001: 2). Moreover, since the objectives of each library web site differ, there are significant differences in the content and information standards of these web sites, for example insufficient information, outdated information, and complex information organization that is difficult to access. All of this confuses users.

  4. Background of research problem In 2004, Chimplee Wimonturm, a Srinakharinwirot University graduate student in Library and Information Science, constructed an evaluation model for academic library web sites (Chimplee Wimonturm, 2004). However, that research was left incomplete because the model was never tested in an authentic situation.

  5. Research Objective  To evaluate and develop the academic library web sites evaluation checklist. Expected Benefits  To obtain a new, revised evaluation checklist of accepted quality that libraries can use as criteria for improving and developing their web sites.  To help academic libraries, and other libraries that do not yet have their own web sites, plan and design a web site that will be most beneficial to their users.

  6. Research Methodology

  7. Website Evaluation Checklist’s Evaluating Process Diagram

  8. Research Methodology This research focuses on evaluating Chimplee Wimonturm's (2004) academic library web site evaluation checklist. First Step: An assessment of the content validity of the evaluation checklist. 1. Examiners. Three library web site content specialists were chosen by purposive sampling as the evaluators. 2. Evaluation tool for assessing content validity. A content validity checklist form was designed. The content checklist is divided into 8 main categories (a total of 75 sub-categories).

  9. Research Methodology (1) library's general information; (2) information service; (3) searching service; (4) links; (5) user education service; (6) navigating system design; (7) content structure design; and (8) presentation. 3. Data collecting. The results were used to calculate an IOC value for each item (a sketch of this calculation is given below). The criterion was that items with an IOC value above 0.5 were considered to have acceptable content validity, while items with an IOC value below 0.5 were adjusted and edited for appropriateness (Puangrat Thaweratana, 2000: 116-117; Office of the National Research Council of Thailand, 2004: 130).
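
  A minimal sketch in Python of the IOC screening described above, assuming the conventional +1 (congruent) / 0 (unsure) / -1 (incongruent) expert rating scale; the item names and ratings below are invented for illustration and are not data from the study.

      # IOC (Index of Item-Objective Congruence) of one item = mean of the
      # three specialists' ratings (+1 congruent, 0 unsure, -1 incongruent).
      def ioc(ratings):
          return sum(ratings) / len(ratings)

      # Hypothetical ratings from the three content specialists.
      items = {
          "library_general_information": [1, 1, 1],   # IOC = 1.00
          "searching_service":           [1, 0, 1],   # IOC = 0.67
          "presentation":                [1, -1, 0],  # IOC = 0.00
      }

      for name, ratings in items.items():
          value = ioc(ratings)
          verdict = "acceptable" if value > 0.5 else "adjust and edit"
          print(f"{name}: IOC = {value:.2f} ({verdict})")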

  10. Research Methodology 4. Evaluation checklist improvement. Incongruent items on Chimplee's checklist that could not be assessed for "existence" or "nonexistence" were taken out, and the language, phrasing, and content categories were edited. In the revised evaluation checklist, the content was reorganized into 5 main categories (a total of 46 sub-categories): (1) library's general information; (2) service information; (3) searching service; (4) user education service; and (5) links. The score weighting calculation, the scoring criteria, the evaluation criteria in the score form, and the evaluation manual were also changed.

  11. Research Methodology Second Step: An assessment of the face validity of the revised evaluation checklist. 1. Examiners. Three library experts were chosen by purposive sampling as the examiners. 2. Tools for assessing face validity. These are the revised evaluation checklist, the revised manual, and the new evaluation score form. 3. Data collecting. The examiners assessed the revised evaluation checklist for face validity. It was then revised again before being put to the test by the librarians in the third step.

  12. Research Methodology Third Step: Revised evaluation checklist usability testing. 1. Two sample groups.  Three librarians with library web site design experience and at least 10 years of working experience were chosen by purposive sampling as the evaluators.  Two academic library web sites with the richest service content and the widest variety of presentation techniques, according to Nawaporn Chonwanich's research (2003), were chosen by purposive sampling: the Ramkhamhaeng University library web site and the Khon Kaen University Academic Resources Center web site. 2. Tools. The revised evaluation checklist, the revised manual, and the new score form.

  13. Research Methodology 3. Data collecting. The three chosen librarians tried out the revised evaluation checklist:  they began by studying the manual,  assessed the two chosen web sites against the categories on the revised evaluation checklist, recording the "existence" or "nonexistence" on each web site of the characteristics in the list, and  filled in the score form and tallied the scores for the summary (a hypothetical tally is sketched below).
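
  A rough sketch of the tallying step above; the sub-category names, weights, and existence marks are hypothetical, since the actual score form and weighting scheme come from the revised manual, which is not reproduced here.

      # Hypothetical "existence"/"nonexistence" tally for one web site.
      weights = {
          "library_hours": 1.0,
          "opac_link": 2.0,
          "email_reference_service": 1.5,
      }
      # True = the characteristic exists on the evaluated web site.
      marks = {
          "library_hours": True,
          "opac_link": True,
          "email_reference_service": False,
      }

      score = sum(weights[item] for item, exists in marks.items() if exists)
      maximum = sum(weights.values())
      print(f"Total score: {score:.1f} / {maximum:.1f}")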

  14. Research Methodology 4. Evaluation model reliability assessment. The reliability of the revised evaluation checklist was assessed with the equivalence method, which determines the inter-rater reliability of the three librarians' answers by comparing them. Wherever their answers on an item were consistent, the agreement was marked and counted, and each pair of raters was then analyzed for reliability (a sketch of this calculation follows below). If the assessment result of every pair is above 0.70, the reliability of the revised checklist is considered acceptable (Office of the National Research Council of Thailand, 2004: 137).
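
  A minimal sketch of the pairwise agreement check described above; the three answer vectors are invented, with 1/0 standing for "existence"/"nonexistence".

      from itertools import combinations

      def agreement(a, b):
          # Proportion of items on which two raters gave the same answer.
          return sum(x == y for x, y in zip(a, b)) / len(a)

      # Hypothetical answers from the three librarians on the same items.
      answers = {
          "librarian_1": [1, 1, 0, 1, 1, 0, 1, 1],
          "librarian_2": [1, 1, 0, 1, 0, 0, 1, 1],
          "librarian_3": [1, 1, 0, 1, 1, 0, 1, 0],
      }

      for a, b in combinations(answers, 2):
          value = agreement(answers[a], answers[b])
          status = "acceptable" if value >= 0.70 else "below criterion"
          print(f"{a} vs {b}: {value:.2f} ({status})")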

  15. Research Methodology Fourth Step: Interviews for suggestions and comments on the revised evaluation checklist testing. 1. Sample group. The three librarians who tested the revised checklist in the third step. 2. Tool. A semi-structured interview covering: 2.1 the content of the questions across the 5 categories: library's general information, service information, searching service, user education service, and links; 2.2 the evaluation checklist manual's instructions, evaluation criteria, evaluation method, and evaluation period; 2.3 general comments on the checklist and suggestions.

  16. Research Methodology 3. Data collecting. The interviews were recorded. 4. Data processing and analysis. The recorded interviews were transcribed, summarized, and presented as descriptive data. Fifth Step: Summary of the revised evaluation checklist. The results from the checklist testing and the interviews were processed and summarized.

  17. Summary and Discussion

  18. Summary and Discussion The analysis summary of the revised evaluation checklist, after adjustment and quality assessment, is as follows: 1. The revised evaluation checklist's reliability is acceptable. According to the results, the reliability value of the adjusted and tested checklist is 0.70 for every pair, which satisfies the criterion. It can therefore be concluded that this checklist is reliable, authoritative, and practical. The likely reasons are:

  19. Summary and Discussion 1.1 The checklist went through several systematic assessments: the content validity assessment, two rounds of content accuracy assessment by experts, and the web usability testing. This process is similar to Raward's (2001) development of an evaluation checklist for academic library web sites, in which the checklist content was also assessed by experts before testing.

  20. Summary and Discussion 1.2 The evaluators, the three librarians, gave similar answers. This could be because they are all responsible for library web site design and have similar experience with library web sites, so they were able to understand and follow every step. This is similar to the research of Chao (2002), Clausen (1999), and Maquignaz and Miller (2002), which also aimed to develop criteria for evaluating academic library web sites.

  21. Summary and Discussion 2. The librarians' comments and suggestions after testing the revised checklist. 2.1 General comments  The revised checklist is practical to use.  The main categories in the checklist are inclusive and complete. The sub-categories are quite complete and consistent with the main categories. The arrangement of the main categories and sub-categories is consistent. The phrasing of the main categories and sub-categories is clear.

  22. Summary and Discussion However, there is a problem with the clarity of some items. This issue is consistent with the findings of other studies: Maquignaz and Miller (2002), Raward (2001), Detlor and Lewis (2006), Crowley et al. (2002), McGillis and Toms (2001), Wright (2004), and McMullen (2001: 11).

  23. Summary and Discussion 2.2 The comments on the sub-categories of the revised checklist are as follows: 2.2.1 On library's general information, the Thai and English names of the libraries should be in separate items. 2.2.2 On service information, there are 3 points.  The statement of library service policies and standards is not necessary.  The Thai and English current content services should be in separate items.  An English title for the e-mail-based reference service should also be added. 2.2.3 On user education service, an item on how to write citations and bibliographies should be added.

  24. Summary and Discussion 2.2.4 On links, there are 3 points.  The link to other libraries' OPACs should clearly state whether it leads only to the OPAC pages of those libraries or to the Office of the Higher Education Commission's Union Catalog.  The link to the "Help Menu" should clearly indicate that it is the library web site's "Help Menu", not the browser's.  For links to other online references such as dictionaries, encyclopedias, and directories, it should be clearly indicated whether they are subscription-based or free online references.

  25. Summary and Discussion 2.3 The librarians' comments on the revised evaluation checklist. 2.3.1 The instructions in the checklist manual are clear, comprehensible, and easy to follow. 2.3.2 The scoring system and evaluation method are suitable; however, the evaluation standard should be higher. 2.3.3 The evaluating process does not take long, because the tools are clear and easy to understand. 2.3.4 The revised checklist is practical even for librarians who are not web librarians.

  26. Summary and Discussion 2.4 General suggestions. 2.4.1 For convenience, the revised checklist, which includes an evaluation checklist, a manual, and an evaluation score form, should be provided as a computer file. 2.4.2 For convenience and faster calculation, the evaluation score form should be an Excel file with the formulas already filled in and links to the checklist (a sketch of such a form follows below). 2.4.3 The revised checklist should be developed further so that it can also be used to assess the content quality of the web site.
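
  One possible realisation of suggestion 2.4.2, using the Python openpyxl package to write a small Excel score form with the formulas already filled in; the column layout, items, and weights are assumptions, not the study's actual form.

      # Requires the openpyxl package (pip install openpyxl).
      from openpyxl import Workbook

      wb = Workbook()
      ws = wb.active
      ws.title = "Score form"
      ws.append(["Item", "Weight", "Exists (1/0)", "Score"])

      items = [("Library hours", 1, 1), ("OPAC link", 2, 1), ("E-mail reference", 1.5, 0)]
      for row, (name, weight, exists) in enumerate(items, start=2):
          ws.append([name, weight, exists])
          ws[f"D{row}"] = f"=B{row}*C{row}"      # per-item score formula

      last_row = len(items) + 1
      ws[f"D{last_row + 1}"] = f"=SUM(D2:D{last_row})"  # total score, computed by Excel
      wb.save("score_form.xlsx")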

  27. Research Limitations  The items on this revised evaluation checklist emphasize assessing the main content and services that library web sites should currently present rather than their design techniques.  Some of the items may not be valid for assessing future library web sites, because technology advances constantly; some items may become outdated and need to be adjusted to suit the situation in which they are used.

  28. Suggestions for Improvement of the Evaluation Checklist The unclear wording of some checklist items should be revised so that evaluators can give accurate scores. The evaluation standard should be raised.

  29. Suggestions for Future Research There should be a study of evaluation criteria for other aspects of web site design, such as design techniques, so that every area involved in library web site design can be assessed; such criteria would also help in developing a complete library web site. An evaluation method that emphasizes the quality of the content, rather than the mere "existence" or "nonexistence" of items on the web site (a quantitative approach), should also be developed.

  30. Examples: • Manual • Evaluation checklist for Academic Library Websites • Score form

  31. Thank you for attending
