In my last column, we left off with how Project COUNTER (Counting Online Usage of Networked Electronic Resources) is transforming the usage statistics landscape, making usage reports a simple and effective tool for comparing and contrasting patron behavior and preferences in databases across the board. A great weakness in legal databases, however, is the lack of COUNTER-compliant reporting, and sometimes the lack of any reporting at all, leaving us with a big question: how can I take the statistics available to me, regardless of compliance, and transform them into a viable assessment tool, especially in regard to the subscription cancellation decisions that currently haunt the landscape of law librarianship?
There are obvious difficulties in choosing whether to cancel or keep databases when usage data is not standardized. Unless reports are COUNTER-compliant, there is no way of knowing if “Searches” on one report is equivalent to “Searches” on another report, much less how to compare “Searches” to completely dissimilar terms. What about “Visitors”? Are these the total number of separate accesses, the number of “unique” visitors, or something else entirely? One of the first steps during and after gathering your usage reports is to identify the definitions of particular words. If you cannot find definitions for the terms on your report somewhere in an FAQ or a Help link on the database site or administrative portal, then reach out to customer service or your own school’s representative and find these definitions.
But an even more practical step is gathering the reports to begin with. Some databases have usage reports available in an administrative portal, others have a portal dedicated specifically to usage reports, others have an interface built into the database itself, and still others require you to contact your representatives directly. The best options are those that let you create custom reports, or at least run standardized reports yourself on demand, from both off site and on site; contacting representatives presents its own unique challenges. Any time your representative changes, either your interim or replacement representative will need to figure out how to complete your request, and this may produce a set of entirely different reports. If you have a particular way you are accustomed to seeing the data, send that to your new representative when making requests. This tactic can also be useful with long-standing representatives if you only ask for this data annually and want to ensure consistency.
While your institution can run monthly statistics and create a variety of visually appealing comparison charts, at the very least you should gather and compile available usage statistics for each database at least a month before renewal. When COUNTER reports are available, some of the most popular and useful reports include Journal Report 1, which contains successful full-text article requests by month and journal, and Database Report 3, which contains searches and sessions by month and service. COUNTER 4 compliant sites offer Platform Report 1 instead of Database Report 3, containing total searches, result clicks, and record views by month and platform. Turnaways are useful when you’re assessing potential ways to grow your collection, as they demonstrate what content your patrons are trying to access that is unavailable under your current subscription. When COUNTER reports are not available, you’ll have to familiarize yourself with each database’s unique capabilities and assess each individually.
Here are some of the major players in law schools and how to access their reports. I cannot claim that this is a completely accurate list, as it has grown, morphed, and changed over the years to reflect database capabilities. Despite all of my efforts to stay current, I’m always learning things I didn’t already know could be done or my representatives weren’t aware could be done. If you have anything to add or information on other databases that would be of interest to law librarians, please send it to firstname.lastname@example.org and I’ll add an update to my next column, as well as my own personal records!
- Bureau of National Affairs (BNA) – BNA requires contacting the representative directly, and reported usage is broken down by quarter, including email alerts, visits, and page views. The cumulative quarterly report makes it challenging to add BNA into any sort of comprehensive monthly database usage report. Bloomberg BNA does not have usage reporting capabilities at this time.
- Center for Computer-Assisted Legal Instruction (CALI) – With CALI, you must contact your representative directly, and you can obtain the total number of lessons accessed by month.
- Commerce Clearing House (CCH) – CCH has changed over the years. When I first started my usage statistics journey eight years ago, reports came through your representative. Then, there was a separate interface where, after requesting access, you could create your own reports. At this point, CCH again requires you to request reports through your representative. Representatives can create a variety of reports, so I usually send our preferred report when I send in a request; this report indicates page hits, users, days, and devices, all separated by month.
- Chronicle of Higher Education (CHE) – CHE has a link for reporting (http://chronicle.com/campuswide/reports/, requires login), and it gives you an HTML summary of page views, searches, and visitors by month.
- Chicago Manual of Style – You can request these statistics directly at email@example.com or access the report yourself at http://www.chicagomanualofstyle.org/reports/index.epl (requires login). The linked report shows successful title requests by month.
- eLibrary – This database has a variety of options available at http://elibrary.bigchalk.com/reports (requires login), such as database activity and document usage, as well as a handful of reports in COUNTER format. It’s important to note that eLibrary is COUNTER-compliant to the 2003 code of practice, not the current code, meaning that even comparisons with a current COUNTER-compliant report from another database are flawed. eLibrary also allows you to schedule regular delivery of specific reports directly to your email inbox.
- HeinOnline – You can request usage statistics directly from firstname.lastname@example.org, and they are also delivered automatically in quarterly installments. Statistics reported include hits, articles, page views, visits, and searches by month. Unfortunately, HeinOnline does not separate statistics by library within the database, making it impossible to use these reports to support cancellation decisions for individual libraries within the subscription. These reports also provide a good example of the difficulties of matching reported terms with COUNTER-compliant terms for comparison across databases. A natural inclination would be to match Page Views with Record Views, due to similarity in language, but “Page Views” counts each and every page view. If a user reads three pages of an article, it counts as three page views. However, these Page Views are from the same article, creating a single “Article” count, which makes “Articles” more consistent with the concept of “Record View” under COUNTER 4 compliance.
- JSTOR – Historical JSTOR usage reports, running through 2009, are available at http://stats.jstor.org/. Access newer usage statistics directly from your individual JSTOR login. In order to be set up as an administrator of the system and have the “Usage Statistics” feature available to you, contact JSTOR support at email@example.com. COUNTER 4 reports are available from January 1, 2015, and earlier reports are compliant with COUNTER 3 standards. In addition to the COUNTER reports, you can run a variety of other custom reports, and you can schedule your reports for automatic delivery to your email inbox.
- Gale (includes LegalTrac, Making of Modern Law, United States Supreme Court Records & Briefs) – Reports are available at http://admin.galegroup.com/ (requires login). Be sure to click on “Location” instead of “Institution” to get your full range of admin features, including reporting. Reports are COUNTER 4 compliant, and additional reports are available as well. Gale also links to helpful resources, definitions of the reports, and more; and you can schedule automatic reports.
- ProQuest – You can access usage reports and schedule reports to run automatically at http://admin.proquest.com (requires login). Some reports are currently COUNTER 4 compliant, while others are COUNTER 3 compliant, and you have a variety of other reporting options, including report frequency and format. ProQuest also has an informative LibGuide on gathering usage information for ProQuest libraries (http://proquest.libguides.com/gis_usage). One tricky facet of these reports at my institution is that certain data is broken up by library. While the reports count regular searches and federated searches consistently across collections, result clicks and record views are specific to the portion of the database clicked and viewed. In cumulative reporting, we total these values for the entire database.
- LLMC Digital – Statistics are available at http://admin.llmcdigital.org/public/usagereport.aspx, and you can choose a date range for your report. The reports break down activity by events such as Advanced Search, Citation Search, Download, IP Login, and Volume Search. Definitions of these activities are available at http://admin.llmcdigital.org/public/Client_Usage_Report_Definitions_new.pdf.
- Oxford Products, such as the Oxford English Dictionary and Oxford Scholarship Online – Reports for usage of online resources are available at https://subscriberservices.sams.oup.com (requires login), and https://subscriberservices.sams.oup.com/report/counter.html (requires login) links you to COUNTER 4 reports for usage dating back to January 2013. Access usage reports specific to journals through http://access.oxfordjournals.org/oup/login/local.do.
- LexisNexis & Westlaw products – These reports create a special challenge for law schools because they bill student accounts and law firm accounts differently, and it is rare for law schools to debate cancelling these products in their entirety. Students typically need to be familiar with both products, as they don’t know which platforms their future employer may have available to them. In addition to providing access to our students, my institution also provides legal resources to the public through a public patron account at Thomson Reuters. For this account, our representative gathers total usage annually, patrons’ monthly usage, and warning screens for content outside the plan. We also request usage statistics of specific academic products, such as the West Study Aids collection, by emailing our West Academic representative directly. These reports contain document views by month, unique visitors by month, top functionality, unique students accessing or favoriting, and the top five document views by series. For Lexis products, I contact our representative for reports on document views, searches, alerts, Lexis.com links, printouts, and Shepard’s. It’s important to note that usually the representative is getting these statistics from someone else at the company, so it’s a slow process. Contact them far enough in advance of your deadline to avoid being stuck in a crunch without the data.
At my institution, we not only use these reports internally, but we also compare our statistics to those of the other schools within the consortium. We have a Best Practices group for Electronic Services, and one of our deliverables is a quarterly report for our consortium’s Library Best Practices team, comparing usage of shared databases such as HeinOnline, Gale, and ProQuest across the databases and across the consortium. We also include data on our schools’ FTE (full-time enrollment) and the costs of databases, which may be drastically different for each school and should be accounted for in summary statistics. A bonus of sharing data, either formally or informally across schools, is tracking usage trends and marketing practices. Certain schools may have higher usage of specific products due to marketing initiatives or training provided through the library, and the group can discuss best practices on promotion. Members should house and maintain shared spreadsheets on a system such as SharePoint, Google Docs, or Dropbox, where all members using the report can drop in their data at their own convenience.
For those libraries offering federated searching through the catalog, please note that certain reports do not separate out federated searching from their statistical reports. These searches may inflate hits and accesses, so pay close attention to metrics available that indicate patron usage of the information. Again, the key to using reports successfully is identifying what available activity metrics are best at reporting the true value of the database to your patrons.
Web Access Management Reports and similar reports, available to you through your integrated library system, usually track every page clicked. Keep in mind that clunky or difficult-to-navigate databases, and even users working in the database who are not particularly savvy at searching, can easily inflate these statistics. Despite their problems, Web Access reports are more useful than having no metrics at all, and they provide unique supplemental data to traditional reporting, especially if you can track specifics about the patron types accessing information.
Database costs are a simple way of turning the numbers you obtain in these reports into measures of value. Take the amount you pay for the database each year and divide it by the number of searches, giving you a cost per search metric to compare across databases. Again, when you’re trying to compare usage without COUNTER compliance on both sides, do your best to define equivalents in terms and be transparent and mindful about the assumption of equivalents when weighing out your numbers.
If no similar equivalents exist in usage reports between two databases you wish to compare, take a look at the statistics available and isolate which one is the best measure of value. Take this factor and calculate “cost per x.” Supplement this value comparison with qualitative data, such as direct surveys of patrons and experience testing from staff and faculty.
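The cost-per-metric arithmetic described above can be sketched in a few lines of code. This is a minimal illustration, not part of the original column; the database names, costs, and usage counts below are entirely hypothetical.

```python
# A minimal sketch of the "cost per x" calculation: annual cost divided by
# a chosen usage metric (searches here, but any comparable metric works).
# All names and figures are hypothetical examples.

annual_usage = {
    "Database A": {"cost": 12000, "searches": 4800},
    "Database B": {"cost": 5500, "searches": 1100},
}

def cost_per_metric(cost, count):
    """Return annual cost divided by the chosen usage count."""
    if count == 0:
        return float("inf")  # no recorded use: flag for closer review
    return cost / count

for name, data in annual_usage.items():
    cps = cost_per_metric(data["cost"], data["searches"])
    print(f"{name}: ${cps:.2f} per search")
```

In this hypothetical, the cheaper database actually costs more per search, which is exactly the kind of comparison that raw subscription prices hide; just remember that the two “searches” columns are only comparable if you have verified the vendors define the term the same way.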
Finally, take a look at training materials and database support available. Perhaps you could cancel an expensive database with middling usage and use it as an opportunity to market a similar, lower priced product, thus increasing its usage and value. At that point, any increase in your usage and value statistics supports the value of the database itself, as well as the value of your efforts.
You can then use these increases in usage statistics to prove the value of the library staff to leadership. For example, after subscribing to West Study Aids at my institution, we did a marketing campaign including regular blog posts highlighting specific subjects, digital signage throughout the school, promotional bookmarks in printed study aids, a dedicated LibGuide, and more. Within the first semester, ninety-nine percent of our students had logged in to the Study Aids, giving us powerful statistics demonstrating the value of the librarians and staff that assisted in the promotion, especially when compared to usage of students at other schools who did not engage in similar promotions.
Overall, the legal usage statistics world is behind the curve in many ways. However, we can still transform the usage data we do have available to us as law libraries into an effective tool for assessing and proving the value of our expenditures, our resources, and sometimes even our own activities.
Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the June 2015 issue and is reprinted here with permission.