4. Ongoing Evaluation and Access

Many resources take at least one academic year to become embedded in the curriculum or research process. Usage data and user feedback may not be positive from the outset, and a resource should not be cancelled after the first year unless there are compelling budgetary reasons for doing so. Sometimes there is also a significant lag between purchase and the establishment of access, which depresses first-year usage. Given the development cycles of most electronic products and services, the first twelve months after release tend to bring significant changes and upgrades to both the product and the service. [1] The concept of a "soft launch" or "soft roll-out" has become common even in libraries. [2] The best evaluation of a product or service happens within a three- to five-year time frame; the arc of usage and user behavior is not fully realized until the third year of activity for any given resource or service. Evaluating user behavior and usage data is important in building up a detailed picture of the appropriateness of the resource over time, and is invaluable when it comes time to review the resource in the future.

Despite all good-faith efforts, activation and establishment of access to electronic resources is sometimes overlooked or missed. Part of this stage should therefore be spent double-checking that access is available for all purchased resources and, where a collection of resources has been purchased, that the collection still comprises the same titles and make-up as the product initially purchased. Patron-driven ebook packages require more frequent management, and there are other considerations regarding A&I and full-text databases.

Types of Evaluation

There are various measures to consider when evaluating electronic resources and how they are used locally. As electronic resources grow to consume the majority of library collections budgets, determining which evaluation strategy best captures the usage profile at your institution is key to creating a successful evaluation model. Presently, most electronic databases and journals can be evaluated using COUNTER-based statistics. [3]

However, COUNTER data is just one mechanism for evaluating electronic resources. Journal publishers like to promote ISI Impact Factors to demonstrate content relevance. [4, 5] Another measure that provides citation-related data is the Eigenfactor score. [6] The UKSG [7] and COUNTER are working together on the Journal Usage Factor (JUF) project, which is assessing how online journal usage statistics might form the basis of a new measure of journal impact and quality. [8]

Lastly, many libraries also choose to aggregate web page statistics, discovery tool statistics, OpenURL usage, and integrated library system usage to add to the use evaluation of any given title or resource. This aggregated evaluation approach is explained in more depth in Annual Review. For the evaluation to be most beneficial to your institution, the librarians must first agree on which data points they would like to use to evaluate electronic usage, and then set consistent methods of collecting and reporting these figures from one year to the next. One way to determine the criteria for evaluating your electronic resources is the balanced scorecard approach, which allows various factors to contribute to the evaluation of your electronic resource collection. [9]
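Once the data points are agreed, the per-title arithmetic behind a usage evaluation is straightforward to automate. The sketch below computes annual cost-per-use from a simplified CSV layout; the column names, titles, and cost figures are all illustrative, not the official COUNTER report format, which comes from the provider's COUNTER/SUSHI service.

```python
import csv
import io

def cost_per_use(report_csv, subscription_costs):
    """Compute annual cost-per-use from a simplified COUNTER-style
    usage report. Column names and layout are illustrative only."""
    usage = {}
    for row in csv.DictReader(io.StringIO(report_csv)):
        title = row["Title"]
        usage[title] = usage.get(title, 0) + int(row["Total_Item_Requests"])
    # None signals a subscribed title with no recorded use.
    return {
        title: round(cost / usage[title], 2) if usage.get(title) else None
        for title, cost in subscription_costs.items()
    }

# Illustrative data only.
report = """Title,Total_Item_Requests
Journal of Examples,420
Journal of Examples,180
Annals of Samples,75
"""
costs = {"Journal of Examples": 1200.00, "Annals of Samples": 900.00}
print(cost_per_use(report, costs))
```

Collecting the same figures the same way each year, as recommended above, is what makes these numbers comparable across renewal cycles.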

Checking the implementation

Many electronic resource librarians set up review periods to check access to resources on a schedule. With new purchases, it is best to check the established access points from your institution about a month after purchase to ensure that access is working correctly from web pages and the library OPAC. Part of this evaluation should include checking the remote authentication process as well as the links. If an institution has purchased an electronic resource management system, a tickler can be established to remind staff to perform this access check. Depending on the resource or package purchased, once access is confirmed to be fully set up, a monthly, quarterly, half-yearly, or annual review of the resource should occur to make sure that the content has remained the same and all of the access points are working correctly.
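The link and remote-authentication portion of such a check can be partly automated. A minimal sketch, assuming a hypothetical EZproxy-style prefix (`https://proxy.example.edu/login?url=`); a real check would also fetch each URL and confirm an HTTP 200 response and the expected landing page.

```python
from urllib.parse import urlparse

# Hypothetical EZproxy-style remote-authentication prefix.
PROXY_PREFIX = "https://proxy.example.edu/login?url="

def check_access_points(records, proxy_prefix=PROXY_PREFIX):
    """Flag records whose URLs are missing the remote-authentication
    (proxy) prefix or whose proxied target is malformed. Fetching each
    URL to confirm the page loads would be the follow-up step."""
    problems = []
    for title, url in records:
        if not url.startswith(proxy_prefix):
            problems.append((title, "missing proxy prefix"))
            continue
        target = url[len(proxy_prefix):]
        if urlparse(target).scheme not in ("http", "https"):
            problems.append((title, "malformed target URL"))
    return problems

# Illustrative records as they might appear in web pages or the OPAC.
records = [
    ("Good Resource", PROXY_PREFIX + "https://platform.example.com/db"),
    ("Unproxied Resource", "https://platform.example.com/db"),
]
print(check_access_points(records))
```

Running a script like this on the tickler schedule described above turns the monthly or quarterly review into a short exception report rather than a manual click-through.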

Ask your users

In addition to gathering data from the sources listed above, it is vital for any library to ask its users what their electronic resource needs are and whether they feel those needs are being met by the electronic resources provided. This information gathering can occur in a highly structured way, using an evaluation tool such as LibQUAL [10] or a standard set of survey questions distributed each quarter or semester of classes, or more informally, through an open-ended comments section on the library's web pages or tracking mechanisms for access problems and issues faced by end-users.

Again, the librarians at any given library should agree on which strategy to use to gather information from users, and make sure a consistent process is followed at each evaluation period to ensure coherent reporting of the feedback.

Changes to coverage of resources or platform migration

For A&I databases, a yearly or twice-yearly check is normally sufficient to ensure that access is working as it should and that the platform still fully supports the functionality of the content. Databases are bought and sold and do move from one supplier to the next; this routine check is a good way to catch such changes.

Sometimes an A&I database is available from more than one provider, with or without a full-text component. Part of this evaluation stage should include looking at the other platforms available to make sure the resource is being used to best advantage. It may be that moving an A&I database to another platform results in more direct linking to purchased full text, or that a more robust controlled vocabulary is employed. There are also times when an A&I or full-text database has moved from one provider to another and the move has shifted either the focus of the content or the available access to it. The annual review can catch these more subtle changes and perhaps land a resource on the review list described in Cancellation and Replacement Review.

Journal titles move fairly regularly between hosting services, as well as from one publisher to another. An initiative begun by the UKSG to set guidelines for journals moving from one place to another has made some headway in getting publishers and providers to announce these changes in advance of the move. This protocol has become known as Transfer. [11] However, not all publishers and platform providers follow the Transfer guidelines, which means that spot-checking journal titles from any given publisher is a worthwhile endeavor for an electronic resource management team.

Journal package coverage can be checked on a twice-yearly or quarterly basis, depending on the package purchased, to catch any content coverage changes that might have occurred. This can be done in coordination with reports from your OpenURL provider that capture coverage/holdings data changes. The most common check of packaged content usually occurs at the renewal cycle, to verify which titles should and should not be part of the package. The major subscription agents have created package support services, and package title verification is a good reason to enlist a subscription agent, especially if you have multiple packages that renew at roughly the same time. By having staff selectively check titles in various journal packages, confirming the coverage/holdings can be done in a routine manner. [12]
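The title verification itself amounts to diffing two lists. A minimal sketch, with illustrative titles and coverage strings, comparing the list you believe you purchased against what the provider or your OpenURL knowledge base currently reports:

```python
def compare_holdings(purchased, reported):
    """Diff an entitlement list against currently reported holdings.
    Both arguments map title -> coverage string; the titles and
    coverage values below are illustrative only."""
    missing = sorted(set(purchased) - set(reported))
    added = sorted(set(reported) - set(purchased))
    coverage_changed = sorted(
        t for t in set(purchased) & set(reported) if purchased[t] != reported[t]
    )
    return {"missing": missing, "added": added,
            "coverage_changed": coverage_changed}

purchased = {"Journal A": "1995-", "Journal B": "2000-", "Journal C": "1988-"}
reported  = {"Journal A": "1995-", "Journal C": "2001-", "Journal D": "2010-"}
print(compare_holdings(purchased, reported))
```

In practice the "reported" side would come from a provider title list or a KBART-style export from the knowledge base, and the exception report (titles missing, added, or with changed coverage) is what staff then check by hand.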

Track downtime/availability

Downtime can be checked or evaluated in a number of ways. One way is to save email alerts from the producer announcing scheduled downtime or maintenance and tally these up annually. If possible, it is wise for electronic resource managers to set up some form of troubleshooting mechanism, whether through email, an electronic resource management tool, or a software application. This allows an annual accounting of downtime or significant service interruptions for any given journal package, platform, or provider. It is extremely important for electronic resource managers to report these findings back to the provider, especially at the renewal period. Although rare, it may be possible to request discounts or receive other forms of compensation, such as free months of access.
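If the saved alerts are recorded in a structured form, the annual tally is simple to automate. A minimal sketch, with a hypothetical alert structure, producing per-provider incident counts and total hours for the renewal conversation:

```python
from collections import Counter

def tally_downtime(alerts):
    """Aggregate saved downtime/maintenance notices into per-provider
    incident counts and total hours. The alert record structure is a
    hypothetical one; adapt it to however your team logs outages."""
    counts = Counter()
    hours = Counter()
    for alert in alerts:
        counts[alert["provider"]] += 1
        hours[alert["provider"]] += alert["hours"]
    return {p: {"incidents": counts[p], "hours": hours[p]} for p in counts}

# Illustrative log entries, one per saved alert or reported outage.
alerts = [
    {"provider": "Platform X", "hours": 4, "scheduled": True},
    {"provider": "Platform X", "hours": 2, "scheduled": False},
    {"provider": "Platform Y", "hours": 1, "scheduled": True},
]
print(tally_downtime(alerts))
```

Separating scheduled maintenance from unplanned outages (the `scheduled` flag above) is worth preserving in the log, since only the latter is usually grounds for requesting compensation.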

With any purchase of a journal collection or package of electronic access, there is a strong likelihood that titles have moved from one year to the next. However, most journal publishers allow a two- to three-month grace period at the beginning of every year before terminating access. Therefore, it is best to establish a routine check of your journal package access in April or May of any given year, not in January or February as was routine for print subscriptions.

For patron-driven ebook plans, content is normally added and removed on a monthly or quarterly basis through the record loads performed in the OPAC. It is wise to spot-check URLs when the record loads occur to ensure that proxy scripts are running correctly and that access from these records represents the established profile of titles.

Other ebook packages may also update on a monthly or quarterly basis, and again it is best to coordinate the access check with any record loading, to ensure that the records loaded represent the content purchased and match the knowledge base held by your OpenURL resolver.

Communication with the vendor

The electronic resource manager should keep a file or dossier on each resource provider that includes all pertinent correspondence, notifications of routine maintenance, and specific troubleshooting problems that have arisen. If your electronic resource management tool has a place to capture this information, through a notation system or file upload, it can be stored there as well. At each renewal, an overview of performance and issues that have arisen during the year should be shared with the vendor or provider. Specific feedback from your end-users may help with future developments and changes to improve the product or service offered. Many vendors and electronic resource service providers have user groups and user group meetings, either as part of major conferences or as stand-alone events. If you are using these services, it is highly recommended that you join the user group and become involved in committees of interest, since this is the best way for librarians to partner with service suppliers in defining the direction of tool development and to provide much-needed feedback on the user experience. This information can also be used when negotiating the cost for the next fiscal cycle, or as part of the overall review of a product or service for retention.

References

  1. Abrams, Stephen. "Product Development Life Cycle." Weblog entry. Stephen's Lighthouse, 14 December 2010. Accessed 26 August 2011.
  2. Smith, S.E. "What is a Soft Launch?" Weblog entry. WiseGEEK, last modified 25 April 2011. Accessed 26 August 2011.
  3. COUNTER Statistics
  4. Impact Factor. In Wikipedia. Retrieved 31 August 2011.
  5. Cross, J. "Impact Factors: the basics." In: E-Resources Management Handbook. UKSG. ISBN 978-0-9552448-0-3.
  6. Eigenfactor
  7. UKSG
  8. Journal Usage Factor
  9. Bielavitz, Tom. “The Balanced Scorecard: A Systemic Model for Evaluation and Assessment of Learning Outcomes?” Evidence Based Library and Information Practice, 2010, 5 (2) 35-46.
  10. LibQUAL
  11. Transfer
  12. Collins, Maria and Murray, William T. SEESAU: University of Georgia’s Electronic Journal Verification System. Serials Review, 2009, 35 (2) 80-87.

Other documents

  1. Workflow: publisher changes platform. Created by Allison Larkins, University of Huddersfield
  2. Workflow: renewal of an e-journal subscription. Created by Allison Larkins, University of Huddersfield

Go to other sections

  1. Investigating New Content for Purchase/Addition
  2. Acquiring New Content
  3. Implementation
  4. Ongoing Evaluation and Access
  5. Annual Review
  6. Cancellation and Replacement Review