Tag Archives: Ashley Moye

Database Usage Statistics and Challenges of Determining the Value of Electronic Expenditures in the Legal Realm

In my last column, we left off with how Project Counting Online Usage of Networked Electronic Resources (COUNTER) is transforming the usage statistics landscape, making usage reports a simple and effective tool for comparing and contrasting patron behavior and preferences in databases across the board. A great weakness in legal databases, however, is the lack of COUNTER-compliant reporting, and sometimes the lack of any reporting at all, leaving us with a big question: how can I take the statistics available to me, regardless of compliance, and transform them into a viable assessment tool, especially with regard to the subscription cancellation decisions that currently haunt the landscape of law librarianship?

There are obvious difficulties in choosing whether to cancel or keep databases when usage data is not standardized. Unless reports are COUNTER-compliant, there is no way of knowing if “Searches” on one report is equivalent to “Searches” on another report, much less how to compare “Searches” to completely dissimilar terms. What about “Visitors”? Are these the total number of separate accesses, the number of “unique” visitors, or something else entirely? One of the first steps during and after gathering your usage reports is to identify how each report defines its terms. If you cannot find definitions for the terms on your report in an FAQ or a Help link on the database site or administrative portal, reach out to customer service or your own school’s representative and track them down.

But an even more practical step is gathering the reports to begin with. Some databases have usage reports available in an administrative portal, others have a portal dedicated specifically to usage reports, others have an interface built into the database itself, and still others require you to contact your representatives directly. The best options are those that let you create custom reports, or at least run standardized reports yourself on demand, from both on and off site; contacting representatives presents its own brand of unique challenges. Any time your representative changes, either your interim or replacement representative will need to figure out how to complete your request, and this may produce a set of entirely different reports. If you have a particular way you are accustomed to seeing the data, send that to your new representative when making requests. This tactic can also be useful with longtime representatives if you only ask for this data annually and want to ensure consistency.

Your institution may run monthly statistics and create a variety of visually appealing comparison charts, but at the very least you should gather and compile available usage statistics for each database at least a month before renewal. When COUNTER reports are available, some of the most popular and useful reports include Journal Report 1, which contains successful full-text article requests by month and journal, and Database Report 3, which contains searches and sessions by month and service. COUNTER 4 compliant sites offer Platform Report 1 instead of Database Report 3, containing total searches, result clicks, and record views by month and platform. Turnaways are useful when you’re assessing potential ways to grow your collection, as they demonstrate what content your patrons are trying to access that is unavailable with your current subscription. When COUNTER reports are not available, you’ll have to familiarize yourself with each database’s unique capabilities and assess each one individually.
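Once the reports are in hand, rolling them up into a single annual comparison can be as simple as a short script. Here is a minimal sketch, assuming each vendor’s report has already been exported to a CSV with a “Database” column followed by one numeric column per month; the folder name and column layout are hypothetical, not any vendor’s actual export format.

```python
"""Combine monthly usage exports into a single annual summary.

A minimal sketch only: it assumes each vendor's report has been saved as
a CSV with a 'Database' column followed by one numeric column per month
(a hypothetical layout, not any vendor's actual export).
"""
import csv
from pathlib import Path


def annual_totals(report_path: Path) -> dict:
    """Return {database name: sum of all monthly columns} for one report."""
    totals = {}
    with report_path.open(newline="") as f:
        for row in csv.DictReader(f):
            name = row.pop("Database")
            # Sum every remaining column that parses as a whole number (the months).
            totals[name] = sum(int(v) for v in row.values() if v.strip().isdigit())
    return totals


if __name__ == "__main__":
    combined = {}
    for path in Path("usage_reports").glob("*.csv"):  # hypothetical folder of exports
        combined.update(annual_totals(path))
    for database, total in sorted(combined.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{database}: {total} usage events for the year")
```

Even a rough roll-up like this makes it easier to spot which subscriptions deserve a closer look before renewal season.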

Here are some of the major players in law schools and how to access their reports. I cannot claim that this is a completely accurate list, as it has grown, morphed, and changed over the years to reflect database capabilities. Despite all of my efforts to stay current, I’m always learning about capabilities I didn’t know existed or that my representatives weren’t aware of. If you have anything to add or information on other databases that would be of interest to law librarians, please send it to amoye@charlottelaw.edu and I’ll add an update to my next column, as well as my own personal records!

  • Bureau of National Affairs (BNA) – BNA requires contacting the representative directly, and reported usage is broken down by quarter, including email alerts, visits, and page views. The cumulative quarterly report makes it challenging to add BNA into any sort of comprehensive monthly database usage report. Bloomberg BNA does not have usage reporting capabilities at this time.
  • Center for Computer-Assisted Legal Instruction (CALI) – With CALI, you must contact your representative directly, and you can obtain the total number of lessons accessed by month.
  • Commerce Clearing House (CCH) – CCH has changed over the years. When I first started my usage statistics journey eight years ago, reports came through your representative. Then, there was a separate interface where, after requesting access, you could create your own reports. At this point, CCH again requires you to request reports through your representative. Representatives can create a variety of reports, so I usually send our preferred report when I send in a request; this report indicates page hits, users, days, and devices, all separated by month.
  • Chronicle of Higher Education (CHE) – CHE has a link for reporting (http://chronicle.com/campuswide/reports/, requires login), and it gives you an HTML summary of page views, searches, and visitors by month.
  • Chicago Manual of Style – You can request these statistics directly at cmoshelpdesk@press.uchicago.edu or access the report yourself at http://www.chicagomanualofstyle.org/reports/index.epl (requires login). The linked report shows successful title requests by month.
  • eLibrary – This database has a variety of options available at http://elibrary.bigchalk.com/reports (requires login), such as database activity and document usage, as well as a handful of reports in COUNTER format. It’s important to note that eLibrary is COUNTER-compliant to the 2003 code of practice, not the current code, meaning that even comparisons with a current COUNTER-compliant report from another database are flawed. eLibrary also allows you to schedule regular delivery of specific reports directly to your email inbox.
  • HeinOnline – You can request usage statistics directly from techsupport@wshein.com, and they are also delivered automatically in quarterly installments. Statistics reported include hits, articles, page views, visits, and searches by month. Unfortunately, HeinOnline does not separate by library within their statistics, making it impossible to support cancellation decisions within the database using these reports. These reports also provide a good example of the difficulties of matching reported terms with COUNTER-compliant terms for comparison across databases. A natural inclination would be to match Page Views with Record Views, due to similarity in language, but “Page Views” counts each and every page view. If a user reads three pages of an article, it counts as three page views. However, these Page Views are from the same article, creating a single “Article” count, which makes “Articles” more consistent with the concept of “Record View” under COUNTER 4 compliance.
  • JSTOR – Historical JSTOR usage reports, running through 2009, are available at http://stats.jstor.org/. Access newer usage statistics directly from your individual JSTOR login. In order to be set up as an administrator of the system and have the “Usage Statistics” feature available to you, contact JSTOR support at support@jstor.org. COUNTER 4 reports are available from January 1, 2015, and earlier reports are compliant with COUNTER 3 standards. In addition to the COUNTER reports, you can run a variety of other custom reports, and you can schedule your reports for automatic delivery to your email inbox.
  • Gale (includes LegalTrac, Making of Modern Law, United States Supreme Court Records & Briefs) – Reports are available at http://admin.galegroup.com/ (requires login). Be sure to click on “Location” instead of “Institution” to get your full range of admin features, including reporting. Reports are COUNTER 4 compliant, and additional reports are available as well. Gale also links to helpful resources, definitions of the reports, and more; and you can schedule automatic reports.
  • ProQuest – You can access usage reports and schedule reports to run automatically at http://admin.proquest.com (requires login). Some reports are currently COUNTER 4 compliant, while others are COUNTER 3 compliant, and you have a variety of other reporting options, including frequency and format. ProQuest also has an informative LibGuide on gathering usage information for ProQuest libraries (http://proquest.libguides.com/gis_usage). One tricky facet of these reports at my institution is that certain data is broken up by library. While the number of regular searches and federated searches is reported consistently across collections, result clicks and record views are specific to the portion of the database clicked and viewed. In cumulative reporting, we total these values for the entire database.
  • LLMC Digital – Statistics are available at http://admin.llmcdigital.org/public/usagereport.aspx, and you can choose a date range for your report. The reports break down activity by events such as Advanced Search, Citation Search, Download, IP Login, and Volume Search. Definitions of these activities are available at http://admin.llmcdigital.org/public/Client_Usage_Report_Definitions_new.pdf.
  • Oxford Products, such as the Oxford English Dictionary and Oxford Scholarship Online – Reports for usage of online resources are available at https://subscriberservices.sams.oup.com (requires login), and https://subscriberservices.sams.oup.com/report/counter.html (requires login) links you to COUNTER 4 reports for usage dating back to January 2013. Access usage reports specific to journals through http://access.oxfordjournals.org/oup/login/local.do.
  • LexisNexis & Westlaw products – These reports create a special challenge for law schools because they bill student accounts and law firm accounts differently, and it is rare for law schools to debate cancelling these products in their entirety. Students typically need to be familiar with both products, as they don’t know which platforms their future employer may have available to them. In addition to providing access to our students, my institution also provides legal resources to the public through a public patron account at Thomson Reuters. For this account, our representative gathers total usage annually, patrons’ monthly usage, and warning screens for content outside the plan. We also request usage statistics of specific academic products, such as the West Study Aids collection, by emailing our West Academic representative directly. These reports contain document views by month, unique visitors by month, top functionality, unique students accessing or favoriting, and the top five document views by series. For Lexis products, I contact our representative for reports on document views, searches, alerts, Lexis.com links, printouts, and Shepard’s. It’s important to note that usually the representative is getting these statistics from someone else at the company, so it’s a slow process. Contact them far enough in advance of your deadline to avoid being stuck in a crunch without the data.

At my institution, we not only use these reports internally, but we also compare our statistics to the other schools within the consortium. We have a Best Practices group for Electronic Services, and one of our deliverables is a quarterly report for our consortium’s Library Best Practices team, comparing usage of shared databases such as HeinOnline, Gale, and ProQuest across databases and across the consortium. We also include data on our schools’ FTE (full-time enrollment) and the costs of databases, which may be drastically different for each school and should be accounted for in summary statistics. A bonus of sharing data, either formally or informally across schools, is tracking usage trends and marketing practices. Certain schools may have higher usage on specific products due to marketing initiatives or training provided through the library, and the group can discuss best practices on promotion. Members should house and maintain shared spreadsheets on a system such as SharePoint, Google Docs, or Dropbox, where all members using the report can drop in their data at their own convenience.
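Because enrollment varies so widely from school to school, raw counts can mislead; normalizing by FTE puts everyone on the same footing. The snippet below is an illustration only, with invented school names, FTE figures, and search counts.

```python
# A hedged illustration of normalizing shared-database usage by school size;
# the school names, FTE figures, and search counts below are invented.
schools = {
    "School A": {"fte": 600, "searches": 9000},
    "School B": {"fte": 300, "searches": 6000},
}

for name, data in schools.items():
    per_fte = data["searches"] / data["fte"]
    print(f"{name}: {per_fte:.1f} searches per FTE")

# School A: 15.0 searches per FTE
# School B: 20.0 searches per FTE
```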

For those libraries offering federated searching through the catalog, please note that certain reports do not separate out federated searching from their statistical reports. These searches may inflate hits and accesses, so pay close attention to metrics available that indicate patron usage of the information. Again, the key to using reports successfully is identifying what available activity metrics are best at reporting the true value of the database to your patrons.

Web Access Management Reports and similar reports, available to you through your integrated library system, usually track every page clicked. Keep in mind that clunky or difficult-to-navigate databases, and even users who are not particularly savvy at searching, can easily inflate these statistics. Despite their problems, Web Access reports are more useful than not having any metrics at all, and they provide unique supplemental data to traditional reporting, especially if you can track specifics about the patron types accessing information.

Database costs are a simple way of turning the numbers you obtain in these reports into measures of value. Take the amount you pay for the database each year and divide it by the number of searches, giving you a cost per search metric to compare across databases. Again, when you’re trying to compare usage without COUNTER compliance on both sides, do your best to define equivalents in terms and be transparent and mindful about the assumption of equivalents when weighing out your numbers.
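As a quick illustration of the arithmetic, here is a minimal sketch; the database names, costs, and search counts are made up for the example.

```python
# Cost per search, using made-up costs and search counts for illustration only.
annual_cost = {"Database A": 12000.00, "Database B": 4500.00}
annual_searches = {"Database A": 8000, "Database B": 1500}

for db in annual_cost:
    cost_per_search = annual_cost[db] / annual_searches[db]
    print(f"{db}: ${cost_per_search:.2f} per search")

# Database A: $1.50 per search
# Database B: $3.00 per search
```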

If no similar equivalents exist in usage reports between two databases you wish to compare, take a look at the statistics available and isolate which one is the best measure of value. Take this factor and calculate “cost per x.” Supplement this value comparison with qualitative data, such as direct surveys of patrons and experience testing from staff and faculty.

Finally, take a look at training materials and database support available. Perhaps you could cancel an expensive database with middling usage and use it as an opportunity to market a similar, lower priced product, thus increasing its usage and value. At that point, any increase in your usage and value statistics supports the value of the database itself, as well as the value of your efforts.

You can then use these increases in usage statistics to prove the value of the library staff to leadership. For example, after subscribing to West Study Aids at my institution, we did a marketing campaign including regular blog posts highlighting specific subjects, digital signage throughout the school, promotional bookmarks in printed study aids, a dedicated LibGuide, and more. Within the first semester, ninety-nine percent of our students had logged in to the Study Aids, giving us powerful statistics demonstrating the value of the librarians and staff that assisted in the promotion, especially when compared to usage of students at other schools who did not engage in similar promotions.

Overall, the legal usage statistics world is behind the curve in many ways. However, we can still transform the usage data we do have available to us as law libraries into an effective tool for assessing and proving the value of our expenditures, our resources, and sometimes even our own activities.

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the June 2015 issue and is reprinted here with permission.


Database Usage Statistics and Challenges of Determining the Value of Electronic Expenditures in the Legal Realm

Since the shift to the digital age, libraries must manage not only their traditional print collection, but also their ever-increasing electronic offerings. Electronic resources are appealing options, as they generally require less time to initially process, maintain their currency with little effort, allow you to enhance your collection free of physical space constraints, and often allow multiple patrons to use them at the same time. However, print materials are not without their own unique merits. I’m steering clear of the soapbox on the print versus electronic debate, though. Instead, I’m going to talk about another part of the puzzle, one which is remarkably challenging in the legal realm—finding a way to balance the costs and benefits of our expenditures on electronic offerings by leveraging available usage data. This is a task that becomes even more difficult as budget lines shrink and cancellations are necessary.

Legal database and print material costs are sky-high, and both keep getting higher. Additionally, legal materials require a standard of currency, regardless of whether they are in an academic, public, or private setting. Once you’ve decided to invest in a particular resource, keeping it up-to-date is rarely an optional fee, if you want your original investment in the materials to maintain its value to your patrons. To add insult to injury, if the updates jump in price, you have little recourse—you must simply find the money somewhere or cancel the material and watch it begin to lose value. I’ve seen cases where it’s far cheaper to buy a brand new up-to-date set of materials every few years than to pay the update costs continually, assuming, of course, that slightly out-of-date print materials can meet the needs of your patron base.

Obviously, electronic versions of legal resources don’t require updates like loose-leaf releases, advance sheets, pocket parts, and supplements. While you’re still responsible for the cost of maintaining access to the resource and are beholden to price increases there as well, the staff time it takes to check in the materials, process them, and update them in a timely manner can help push any cost-benefit analysis in favor of the electronic resource. Cheaper and faster and more up-to-date is often the mantra of electronic resources.

There are a few things to beware of, though, when debating investing in electronic resources over print resources or even when choosing which electronic resources to purchase. Databases can be full of bells and whistles that seem exciting to your library staff, but which your patrons may not use. There could be a cheaper option with an interface that appears limited to you, but which easily meets your patrons’ needs. Databases have practically unlimited cloud storage, so it’s easy to pad title counts that may impress you, but which your patrons never even notice. Oftentimes, materials are available on multiple database platforms or through open access, freely available online. Isolating specific user actions within databases and tracking the total amount of activity in a consistent manner can help us be sure that our investments are worthwhile and the cost-benefit analysis is accurate.

It’s not as easy as it seems, though. In the print world, you can measure the ‘usage’ of a book by tracking check-outs and developing a simple method to approximate in-house usage for materials which cannot be checked out, such as asking patrons not to re-shelve books. This allows librarians to formally track data on books pulled off the shelf, and it provides valuable informal usage data through observation of patron behavior.

However, patrons can use electronic materials outside the physical space of the library, where it is impossible to gather informal information through mere observation, and the two basic ideas of check-outs and in-house-usage tracking are replaced with a cacophony of terms, such as page views, record clicks, downloads, hits, article views, users, sessions, searches, and more. Each database defines its own terms and its own “usage statistics,” and databases aren’t required to provide you with any statistics at all. Some may offer no way of quantifying your specific patron group’s user behavior, and you’re left to fill in the blanks as best you can with surveys, user experience testing, and ILS tools such as Web Access Management systems. These measure access, not necessarily activity, and thus have their own set of limitations.

Enter Project Counting Online Usage of Networked Electronic Resources (COUNTER). COUNTER is an initiative which focuses on setting consistent, credible, and compatible standards for both recording and reporting online usage statistics for online journals, databases, books, and reference works. COUNTER has also worked with the National Information Standards Organization (NISO) on the Standardized Usage Statistics Harvesting Initiative (SUSHI), a protocol that allows tools to automatically retrieve and consolidate usage statistics across vendors. COUNTER-compliant databases use the same definitions for their metrics, count things in the same way, and report them in identical formats with consistent report names. I always like saying it gives librarians a chance to put two databases next to each other and compare apples to apples, instead of apples to yellow. COUNTER reports also give you a chance to create consistent return-on-investment measures, such as cost per search or cost per session, which allows you to compare the value of these databases to your patrons across the board.
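For vendors that support it, SUSHI means a script (or, more commonly, an ERM or usage-consolidation tool) can pull reports on a schedule instead of a person logging in to each portal. The sketch below assumes a REST-style endpoint that returns JSON; the base URL, report path, and credentials are entirely hypothetical, and the original NISO protocol is SOAP-based, so treat this as an idea of the workflow rather than any vendor’s actual API.

```python
"""Fetch a usage report from a hypothetical REST-style SUSHI endpoint.

Everything below the import is invented for illustration: the base URL,
report path, and credential parameters are not a real vendor's API, and
the original NISO SUSHI protocol is SOAP-based rather than REST/JSON.
"""
import requests  # third-party HTTP library: pip install requests

BASE_URL = "https://usage.example-vendor.com/sushi"  # hypothetical endpoint
params = {
    "customer_id": "YOUR_CUSTOMER_ID",   # hypothetical credentials
    "requestor_id": "YOUR_REQUESTOR_ID",
    "begin_date": "2015-01-01",
    "end_date": "2015-06-30",
}

response = requests.get(f"{BASE_URL}/reports/db1", params=params, timeout=30)
response.raise_for_status()
report = response.json()  # the structure depends entirely on the vendor
print(report)
```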

Without a doubt, Project COUNTER is a stroke of brilliance. However, vendors are not required to be COUNTER compliant any more than they’re required to provide you with usage statistics. And in the legal realm, where a few big players run most of the show and monopolize your budget lines, only a few vendors offer COUNTER reports. This factor makes navigating the world of gathering and comparing usage data across platforms especially challenging. While usage data may not be the only thing that you use when deciding whether to cancel or keep an electronic resource, it can be an incredibly valuable tool. The challenge is finding ways to effectively quantify our return on investment without the luxury of COUNTER-compliant reports.

In my next column, I’ll offer some practical advice on delving into usage data within the legal field. As my law library life has been solely in the academic realm, I would welcome any advice from those outside of academia to help flesh out my own tips and tricks. If you have some practicalities to share, whether private, public, or even academic, please email me at amoye@charlottelaw.edu and I’ll include you in my next column!

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the March 2015 issue and is reprinted here with permission.


A Stocking Full of Metric Goodies

It’s been a while since my last column, and my bag of ideas has found itself overflowing with useful links on a myriad of topics. Seeing as how it is the holiday season, and I couldn’t choose a single topic any more than I can choose a single dessert at the holiday table, I decided to celebrate the spirit of giving with a mash-up approach, a veritable stocking stuffed full of metric-centric resources and commentary.

A stocking usually has something eye-catching and especially relevant sticking out of the top. So, let’s get this party started with a goodie bag of altmetrics, an alternative approach that measures the impact of scholarly research outside the ever-so-traditional confines of citation-based metrics. In today’s online world, measuring scholarly footprints is no longer as simple as tracking formal citations. Not only have social media and the increasingly pervasive information superhighway introduced new channels for broadcasting and sharing works, but they’ve also ushered in the world of open access and an ability to immediately both access and share scholarly publications. So, for example, how do we accurately capture the impact and value of work that may not only be formally cited, but may also have been shared through social media channels at length? Enter altmetrics, the buzzword that, although making its rounds since 2010, has recently skyrocketed in popularity. If you haven’t noticed, even familiar faces are starting to climb on the bandwagon, such as EBSCO, which recently partnered with Plum Analytics.

Katie Brown, our library director here at Charlotte Law, introduced me to the concept when she was first hired. She also presented at the annual meetings of both the Center for Computer-Assisted Legal Instruction (CALI) and American Association of Law Libraries (AALL) this year, demonstrating some of the new altmetrics tools for assessing and tracking scholarly impact. Even the latest issue of the AALL Spectrum featured an article on altmetrics, which did a stellar job of providing an accessible overview of the field, highlighting various tools you can use for measurements, and clearly explaining a variety of reasons to consider utilizing altmetrics in your own sphere. If your curiosity is piqued and you want to learn more, here are a few more resources I’ve found:

  • The Altmetric Bookmarklet can be installed on Chrome, Firefox or Safari, and allows you to find article-level metrics for any paper you’re reading on your computer.
  • Publish or Perish is a free software program that retrieves and analyzes academic citations and raw data from both Google Scholar and Microsoft Academic Search, providing metrics that allow you to present your research data in its best light.
  • Altmetrics for Librarians and Institutions: Parts 1, 2, & 3 give a comprehensive overview of altmetrics in plain language. Part 1 covers the basics; Part 2 is specifically geared towards librarians, focusing on ways altmetrics can assist with selection management and illustrate collection value, providing you with real-time stats and increasing the value of the guidance you give to your own research community; Part 3 focuses on using altmetrics in decision making within a greater academic context.
  • The April/May 2013 ASIS&T Bulletin features a special section on altmetrics with a variety of more in-depth and specialized papers that demonstrate the broadening scope of altmetrics scholarship.
  • Scholarship: Beyond the Paper by Jason Priem is a somewhat technical piece, written by the founding father of altmetrics, but is definitely an interesting take on the future of altmetrics and the possibility that the traditional peer-reviewed journal and article approach is transitioning into an all-new scholarly communication system.

So let’s dig a little deeper into our stocking now, shall we, and see what the smaller gifts underneath may be?

David Lee King, one of my all-time favorite librarian bloggers, recently did a whole series of posts on “Analytics for Social Media.” King discusses social media analytics that his library tracks, starting with activities analytics, then moving on to audience, engagement, and referral metrics, and finishing up with the grand master of social media analytics – return on investment (ROI). What I enjoy is that he not only explains what metrics he collects and exactly how he counts them, but he also succinctly explains WHY he counts them. As he keeps his posts tastefully brief, I’ll let his words speak for themselves.

Did you miss the webinar from the Philadelphia Chapter of the Special Libraries Association (SLA) on “Leveraging User Data for Strategic Decisions”?  Never fear – the recording is available online. This webinar provides great examples of how two libraries are gathering user data to help make business decisions and to improve learning environments for patrons.

This blast-from-the-past blog post is a great reminder that you shouldn’t just be counting – you should be COLLECTING. Mary Ellen Bates has three, and only three, great questions to ask patrons after every job:

  1. Did we meet your information need?
  2. Would you like us to do additional work on this or set up an alert?
  3. How was this information useful to you?

Bates suggests that while not all patrons will respond, the testimonial information you get from these questions is invaluable in telling the true story of your library and its net worth. Include the best stories in reports to management and link them to any new initiatives you’re exploring. Bates also suggests, if applicable, developing a multiplier to represent how much time your research saves other employees, so instead of reporting the number of research hours you’ve done, you can report the value of the time you’ve saved. Basic counts then become a clear measure of impact on the bottom line.
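In code form, the multiplier idea is just a couple of lines; the hours, multiplier, and hourly value below are invented purely for illustration.

```python
# Illustrative only: the hours, multiplier, and hourly value are made up.
research_hours = 40          # hours the library spent on research requests this quarter
time_saved_multiplier = 3    # estimate: each research hour saves patrons roughly 3 hours
average_hourly_value = 150   # dollar value of a patron hour (e.g., a billing rate)

hours_saved = research_hours * time_saved_multiplier
value_of_time_saved = hours_saved * average_hourly_value
print(f"{hours_saved} patron hours saved, worth about ${value_of_time_saved:,}")

# Output: 120 patron hours saved, worth about $18,000
```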

And last but not least, as Santa supposedly covers the globe in a single night, here’s a little international flair to bring us home again. Did you know that the International Federation of Library Associations and Institutions (IFLA) has a Statistics & Evaluation Section, and even an e-Metrics Special Interest Group? Did you know that this section created a Library Statistics Manifesto in August of 2008 to serve as a certified document about the importance of library statistics? And, more importantly, did you know that they keep an up-to-date bibliography on the “Impact and Outcome” of libraries, including resources on impacts on information literacy, academic success, society, electronic services and more? The IFLA Section even joined forces a few years ago with some other major players to develop and test a new set of statistics that could be used by libraries worldwide. Collecting these statistics regularly on a national basis could provide reliable and internationally comparable data of both library services and library use.

And that brings us to the end of our stocking. At least it wasn’t an orange stuffed in the toe, right? Happy holidays to you all, and as always, feel free to send any questions or topics you’d like for me to cover in the future to amoye@charlottelaw.edu. Nothing brightens my day quite like hearing from a reader.

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the December 2014 issue and is reprinted here with permission.


OMGMetrics

When I first proposed beginning a column on metrics, it seemed like a common sense notion. In fact, the proposal practically wrote itself. Library metrics are the hottest of topics, as we’re simultaneously a service industry and an industry whose value to patrons and communities is difficult to quantify. As a result, our necks are traditionally among the first on the chopping block during cuts, and our staff and supporters are constantly fighting for more allocated resources. Qualitative anecdotes don’t defend our worth effectively in this business-savvy, metrics-driven world, nor do they assure that we’re maximizing value for our patrons in our expenditure choices.

As a true librarian at heart, once the column was approved, I started my research. Often when beginning research, you cast your first net with extreme caution, prepared to be buried under a towering mound of inaccurate or inapplicable results. Surprisingly, despite the importance and value of library metrics, I discovered they aren’t touched on with near the frequency you’d expect. Why this phenomenon? I have some ideas.

Let’s face it. Librarians are rarely math-centric. I learned this as an MLIS student with an undergraduate degree in actuarial science. While like majors could bond over their commonalities, I always felt a little lost – who needs a math librarian? Further into my library school career, I was swept up into Technical Services librarianship when I came in for a part-time reference desk job interview at my legal resources professor’s workplace and the Technical Services Director saw math featured prominently on my resume. She immediately usurped my reference interview and stole me away to the land of backlogs of Westlaw and Lexis bills, much to my delight. In retrospect, I don’t even remember interviewing formally. You say, “statistics,” and librarians’ ears perk up. You say, “I like numbers,” and their eyes light up. Then, they hand you a stack of papers covered with numbers and run before you can hand it back.

Yes, people who have bad memories from their math classes growing up are often squeamish around things number-related. While I understand that fear completely, library metrics are completely different. Hence, one of my goals at the outset of this column is to help our amazing group of technical services law librarian readers realize that hearing the word “metrics” is not synonymous with “panic.”

To begin, let’s go over some basic concepts and vocabulary regarding metrics and their uses in libraries. First, not all metrics are created equal – for example, they: (1) use different collection and evaluation methods; (2) speak to different audiences; and (3) serve different purposes. Understanding the breadth of this topic is the first real step in creating and tracking functional metrics, which can then effectively communicate value and aid in decision-making. There are many things you can measure in the library, falling into the general categories of inputs, processes, outputs, outcomes, and impacts.

“Inputs” is a fancy name for resources used to produce or deliver a program or service, like staff, supplies, equipment, and money. Through processes, these inputs become outputs – resources and services that you produce, including your available materials and the programs you organize and host. Input and output tracking gives you those first-glance statistics, easy to count, measure, and report, as these are tangible things. Outputs are usually what are reported to stakeholders or decision makers, e.g., we check out this many books, we have this many research guides, or this many people use the library. However, these metrics don’t accurately demonstrate the value of our services and our products.

And here’s where outcomes and impacts come in. I tend to agree with the school of thought that outcomes and impacts are the same thing, seen from different perspectives. Outcomes are changes from the perspective of our customers, and impacts are the same change from the perspective of a stakeholder, usually more of a high-level change with long-term effects on the larger community. These metrics are known by quite a few names, including impact metrics, performance metrics, and outcome metrics, and are primarily intangible, making them much more difficult to measure. Naturally, they also communicate the most value and provide the most guidance and support.

Let’s be clear. Metrics are different from statistics, and for that matter, so is data. Just because you did poorly in your statistics class or didn’t score highly on the quantitative section of the GRE doesn’t mean that you should run from data or cringe when “metrics” is bandied about in a meeting with stakeholders or decision makers. Formally, data consists of qualitative or quantitative attributes of a variable or set of variables, typically arising as a result of measurements. Statistics don’t even come into play until you study the collection, organization, and interpretation of this data. Even better, in the library world, statistics don’t necessarily require the use of Greek letters or even convoluted equations. Most statistics, measures, and metrics can be organized into operating metrics, customer and user satisfaction metrics, and value and impact metrics.

Operating measures and operational statistics (such as how many people came to the library, how many check-outs took place on a certain day, and how many hits we had on a database) lend themselves well to understanding resource allocation, improving efficiencies, and making budget determinations. Customer and user satisfaction metrics, on the other hand, tell us how well the choices we made based on those operating measures are working and indicate what improvements may be required. Value and impact measures are incredibly meaningful in their own right, as they often incorporate satisfaction and the importance of separate outcomes. These are the most elusive of all measurements; so naturally, they’re the most valuable.

Martha Kyrillidou, senior director of the Association of Research Libraries statistics and service quality programs, once said “what is easy to measure is not necessarily what is desirable to measure.” This is such a true observation regarding metric gathering in libraries – easy measurements rarely result in meaningful statistics, meaning one of your first challenges is figuring out how to make the things you choose to measure meaningful. Simply put, a meaningful measure shows you how much value you’re getting out of your investment. This could mean the investment in the library itself and the value that the stakeholders or decision makers are getting out of that investment, or it could mean what sort of value your customers are getting out of how the library chooses to invest their resources, both in terms of financial outlay and in terms of staff time. To determine meaningful measures, you need to understand your stakeholders or decision makers, and you need to understand your customers.

For instance, quantitative resource usage information doesn’t show how or why users are using materials, or even indicate how satisfied they are with the products. Relying solely on quantitative data, such as a basic measure of number of hits, isn’t necessarily enough to justify value to stakeholders and customers. Our most popular blog post at the law school, according to easily generated WordPress statistics, is one featuring a cartoon sun. Looking at the numbers and reports, you’d assume this was an incredibly popular post and maybe even assume it contributes a lot of value. However, this particular post features a metadata tag for “cartoon sun,” and one of the most searched keywords that leads people to our blog is – you guessed it – “cartoon sun.” Here, it’s obvious that a simple number doesn’t communicate actual value to our customer base or to our stakeholders and decision makers.

Similarly, one database may feature twice as many hits as another database when comparing generated usage reports, but that could be because it has a convoluted interface (possibly even for the sole purpose of generating inflated hits). Again, just because it’s an easy measure doesn’t mean it’s meaningful. Qualitative data, such as patron survey feedback and user experience testing, provides the context within which to view these numbers. This often means using a hybrid approach of both quantitative and qualitative data.

So there you have it. The metrics world is wide and wild, and this column will do its best to shine light on as many parts of it as possible. In addition to detailed discussions of the general metric concepts already mentioned, additional topics will include collection methods, statistical concepts in a nutshell, resource usage statistics, COUNTER and SUSHI, collection and transactional statistics, consortia challenges, web metrics, altmetrics, faculty support, law firm and public law library metrics, performance indicators and benchmarks, as well as discussion of tools for presentation and manipulation of data.

I’m still figuring out how best to approach the column to meet the needs of our audience, and since the next issue is devoted to American Association of Law Libraries (AALL) Annual Meeting program reports, this column won’t reappear until fall. I’d love to hear any suggestions on format and approach, any questions you’d like for me to attack, or any topics you’d like for me to cover. Shoot me an email at amoye@charlottelaw.edu, and let me know what you think!

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the June 2014 issue and is reprinted here with permission.


What’s in a Day as a Charlotte Law Librarian?

We are excited to announce that the Charlotte School of Law Library won First Place in the “Best Video” category of the American Association of Law Libraries 2014 Day in the Life contest!

librarians on patrol: no book left behind

In the spring of 2013, in preparation for our impending move to a high-rise in uptown Charlotte, we began a massive book giveaway initiative to rid the collection of redundant materials, free up space, and share these resources with our law students and local legal community. Through this project, over thirteen thousand books found loving families, but in the midst of the madness, a few books ended up scampering away that needed to come back home. Enter the Librarians on Patrol – in October, six of our staff, both strong and brave, took a trip in a U-Haul across state lines to find our babies and bring them back so they could be stored, wrapped, and transferred to our new library shelves come January 2014.

Featuring: Aaron Greene, Ashley Moye, Brian Trippodo, Cory Lenz, Kim Allman & Minerva Mims

Filmed October 11, 2013 in Charlotte, North Carolina and Rock Hill, South Carolina

“Addy Will Know” courtesy of SNMNMNM – snmnmnm.bandcamp.com/

~Ashley Moye~
