Tag Archives: Ashley Moye

Database Usage Statistics and Challenges of Determining the Value of Electronic Expenditures in the Legal Realm


Since the shift to the digital age, libraries have had to manage not only their traditional print collections, but also their ever-increasing electronic offerings. Electronic resources are appealing options, as they generally require less time to process initially, maintain their currency with little effort, allow you to enhance your collection free of physical space constraints, and often allow multiple patrons to use them at the same time. However, print materials are not without their own unique merits. I’m steering clear of the soapbox on the print versus electronic debate, though. Instead, I’m going to talk about another part of the puzzle, one which is remarkably challenging in the legal realm: finding a way to balance the costs and benefits of our expenditures on electronic offerings by leveraging available usage data. This task becomes even more difficult as budget lines shrink and cancellations become necessary.

Legal database and print material costs are sky-high, and both keep climbing. Additionally, legal materials require a standard of currency, regardless of whether they are in an academic, public, or private setting. Once you’ve decided to invest in a particular resource, keeping it up-to-date is rarely an optional fee, if you want your original investment in the materials to maintain its value to your patrons. To add insult to injury, if the updates jump in price, you have little recourse: you must simply find the money somewhere or cancel the material and watch it begin to lose value. I’ve seen cases where it’s far cheaper to buy a brand new up-to-date set of materials every few years than to pay the update costs continually, assuming, of course, that slightly out-of-date print materials can meet the needs of your patron base.

Obviously, electronic versions of legal resources don’t require updates like loose-leaf releases, advance sheets, pocket parts, and supplements. While you’re still responsible for the cost of maintaining access to the resource and are beholden to price increases there as well, the staff time it takes to check in the materials, process them, and update them in a timely manner can help push any cost-benefit analysis in favor of the electronic resource. Cheaper and faster and more up-to-date is often the mantra of electronic resources.

There are a few things to beware of, though, when weighing electronic resources against print resources, or even when choosing which electronic resources to purchase. Databases can be full of bells and whistles that seem exciting to your library staff, but which your patrons may never use. There could be a cheaper option with an interface that appears limited to you, but which easily meets your patrons’ needs. Databases have practically unlimited cloud storage, so it’s easy to pad title counts in ways that may impress you, but which your patrons never even notice. Oftentimes, materials are available on multiple database platforms, or even freely online through open access. Isolating specific user actions within databases and tracking the total amount of activity in a consistent manner can help us be sure that our investments are worthwhile and our cost-benefit analyses are accurate.

It’s not as easy as it seems, though. In the print world, you can measure the ‘usage’ of a book by tracking check-outs and developing a simple method to approximate in-house usage for materials which cannot be checked out, such as asking for books not to be re-shelved. This allows librarians to formally track data on books pulled off the shelf, and it provides valuable informal usage data through observation of patron behavior.

However, patrons can use electronic materials outside the physical space of the library, where it is impossible to gather informal information through mere observation, and the two basic ideas of check-outs and in-house usage tracking are replaced with a cacophony of terms, such as page views, record clicks, downloads, hits, article views, users, sessions, searches, and more. Each database defines its own terms and its own “usage statistics,” and databases aren’t required to provide you with any statistics at all. Some may offer no way of quantifying your specific patron group’s behavior, and you’re left to fill in the blanks as best you can with surveys, user experience testing, and ILS tools such as Web Access Management systems. These measure access, not necessarily activity, and thus have their own set of limitations.
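To make the mismatch concrete, here is a minimal Python sketch of what it takes to compare two reports that count “usage” under different names. The vendor labels, the counts, and the term mapping itself are all invented for illustration; deciding on a defensible mapping is exactly the judgment call that standardization spares you.

```python
# Two hypothetical vendors report "usage" under different labels, so the
# raw numbers cannot be compared until they are mapped onto a shared vocabulary.
vendor_a_report = {"page_views": 1200, "record_clicks": 340, "sessions": 95}
vendor_b_report = {"hits": 2100, "article_views": 310, "visits": 140}

# One possible mapping of vendor terms onto common metric names --
# choosing this mapping is the hard, judgment-laden part.
TERM_MAP = {
    "page_views": "views", "hits": "views",
    "record_clicks": "item_uses", "article_views": "item_uses",
    "sessions": "sessions", "visits": "sessions",
}

def normalize(report):
    """Translate a vendor's idiosyncratic labels into common metric names."""
    normalized = {}
    for term, count in report.items():
        common = TERM_MAP[term]
        normalized[common] = normalized.get(common, 0) + count
    return normalized

print(normalize(vendor_a_report))  # {'views': 1200, 'item_uses': 340, 'sessions': 95}
print(normalize(vendor_b_report))  # {'views': 2100, 'item_uses': 310, 'sessions': 140}
```

Even this toy version assumes each vendor’s “hit” and “page view” mean comparable things, which, without shared definitions, they usually don’t.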

Enter Project COUNTER (Counting Online Usage of Networked Electronic Resources). COUNTER is an initiative that focuses on setting consistent, credible, and compatible standards for both recording and reporting online usage statistics for online journals, databases, books, and reference works. COUNTER has also worked with the National Information Standards Organization (NISO) on the Standardized Usage Statistics Harvesting Initiative (SUSHI), a protocol that allows tools to automatically retrieve and consolidate usage statistics across vendors. COUNTER-compliant databases use the same definitions for their metrics, count things in the same way, and report them in identical formats with consistent report names. I always like saying it gives librarians a chance to put two databases next to each other and compare apples to apples, instead of apples to yellow. COUNTER reports also give you a chance to create consistent return-on-investment measures, such as cost per search or cost per session, which allow you to compare the value of these databases to your patrons across the board.
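As a rough sketch of the apples-to-apples comparison those reports enable, here is a hypothetical cost-per-use calculation in Python. The database names, subscription prices, and COUNTER-style counts below are all made up; the point is only the arithmetic.

```python
# Hypothetical annual costs and COUNTER-style usage counts for two databases.
databases = {
    "Database A": {"annual_cost": 18000.00, "searches": 5400, "sessions": 2250},
    "Database B": {"annual_cost": 9500.00, "searches": 1900, "sessions": 760},
}

# Cost per search and cost per session: simple return-on-investment measures
# that only make sense when both vendors count "searches" the same way.
for name, d in databases.items():
    cost_per_search = d["annual_cost"] / d["searches"]
    cost_per_session = d["annual_cost"] / d["sessions"]
    print(f"{name}: ${cost_per_search:.2f} per search, "
          f"${cost_per_session:.2f} per session")
```

With these invented figures, the pricier database actually costs less per search ($3.33 versus $5.00), which is precisely the kind of counterintuitive result consistent reporting can surface.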

Without a doubt, Project COUNTER is a stroke of brilliance. However, vendors are not required to be COUNTER compliant any more than they’re required to provide you with usage statistics. And in the legal realm, where a few big players run most of the show and monopolize your budget lines, only a few vendors offer COUNTER reports. This makes navigating the world of gathering and comparing usage data across platforms especially challenging. While usage data may not be the only thing you rely on when deciding whether to cancel or keep an electronic resource, it can be an incredibly valuable tool. The challenge is finding ways to effectively quantify our return on investment without the luxury of COUNTER-compliant reports.

In my next column, I’ll offer some practical advice on delving into usage data within the legal field. As my law library life has been solely in the academic realm, I would welcome any advice from those outside of academia to help flesh out my own tips and tricks. If you have some practicalities to share, whether private, public, or even academic, please email me at amoye@charlottelaw.edu and I’ll include you in my next column!

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the March 2015 issue and is reprinted here with permission.



A Stocking Full of Metric Goodies


It’s been a while since my last column, and my bag of ideas has found itself overflowing with useful links on a myriad of topics. Seeing as how it is the holiday season, and I couldn’t choose a single topic any more than I can choose a single dessert at the holiday table, I decided to celebrate the spirit of giving with a mash-up approach, a veritable stocking stuffed full of metric-centric resources and commentary.

A stocking usually has something eye-catching and especially relevant sticking out of the top. So, let’s get this party started with a goodie bag of altmetrics, an alternative approach to metrics that measures the impact of scholarly research outside the ever-so-traditional confines of citation-based metrics. In today’s online world, measuring scholarly footprints is no longer as simple as tracking formal citations. Not only have social media and the increasingly pervasive information superhighway introduced new channels for broadcasting and sharing works, but they’ve also ushered in the world of open access and the ability to both access and share scholarly publications immediately. So, for example, how do we accurately capture the impact and value of a work that may not only be formally cited, but may also have been shared at length through social media channels? Enter altmetrics, the buzzword that, although making its rounds since 2010, has recently skyrocketed in popularity. If you haven’t noticed, even familiar faces are starting to climb on the bandwagon, such as EBSCO, which recently partnered with Plum Analytics.

Katie Brown, our library director here at Charlotte Law, introduced me to the concept when she was first hired. She also presented at the annual meetings of both the Center for Computer-Assisted Legal Instruction (CALI) and American Association of Law Libraries (AALL) this year, demonstrating some of the new altmetrics tools for assessing and tracking scholarly impact. Even the latest issue of the AALL Spectrum featured an article on altmetrics, which did a stellar job of providing an accessible overview of the field, highlighting various tools you can use for measurements, and clearly explaining a variety of reasons to consider utilizing altmetrics in your own sphere. If your curiosity is piqued and you want to learn more, here are a few more resources I’ve found:

  • The Altmetric Bookmarklet can be installed on Chrome, Firefox or Safari, and allows you to find article-level metrics for any paper you’re reading on your computer.
  • Publish or Perish is a free software program that retrieves and analyzes academic citations and raw data from both Google Scholar and Microsoft Academic Search, providing metrics that allow you to present your research data in its best light.
  • Altmetrics for Librarians and Institutions: Parts 1, 2, & 3 give a comprehensive overview of altmetrics in plain language. Part 1 covers the basics; Part 2 is specifically geared towards librarians, focusing on ways altmetrics can assist with selection management and illustrate collection value, providing you with real-time stats and increasing the value of the guidance you give to your own research community; Part 3 focuses on using altmetrics in decision making within a greater academic context.
  • The April/May 2013 ASIS&T Bulletin features a special section on altmetrics with a variety of more in-depth and specialized papers that demonstrate the broadening scope of altmetrics scholarship.
  • Scholarship: Beyond the Paper by Jason Priem is a somewhat technical piece, written by the founding father of altmetrics, but is definitely an interesting take on the future of altmetrics and the possibility that the traditional peer-reviewed journal and article approach is transitioning into an all-new scholarly communication system.

So let’s dig a little deeper into our stocking now, shall we, and see what the smaller gifts underneath may be?

David Lee King, one of my all-time favorite librarian bloggers, recently did a whole series of posts on “Analytics for Social Media.” King discusses social media analytics that his library tracks, starting with activities analytics, then moving on to audience, engagement, and referral metrics, and finishing up with the grand master of social media analytics – return on investment (ROI). What I enjoy is that he not only explains what metrics he collects and exactly how he counts them, but he also succinctly explains WHY he counts them. As he keeps his posts tastefully brief, I’ll let his words speak for themselves.

Did you miss the webinar from the Philadelphia Chapter of the Special Libraries Association (SLA) on “Leveraging User Data for Strategic Decisions”? Never fear – the recording is available online. This webinar provides great examples of how two libraries are gathering user data to help make business decisions and to improve learning environments for patrons.

This blast-from-the-past blog post is a great reminder that you shouldn’t just be counting – you should be COLLECTING. Mary Ellen Bates has three, and only three, great questions to ask patrons after every job:

  1. Did we meet your information need?
  2. Would you like us to do additional work on this or set up an alert?
  3. How was this information useful to you?

Bates suggests that while not all patrons will respond, the testimonial information you get from these questions is invaluable in telling the true story of your library and its worth. Make reports to management that include the best stories, and link them to any new initiatives you’re exploring. Bates also suggests, if applicable, developing a multiplier to represent how much time your research saves other employees, so instead of reporting the number of research hours you’ve logged, you can report the value of the time you’ve saved. Basic counts then become a clear measure of impact on the bottom line.
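A quick Python sketch shows how such a multiplier might translate raw hours into a bottom-line figure. All of the numbers here – the hours, the multiplier, and the hourly rate – are hypothetical; in practice you would derive the multiplier from your own setting.

```python
# Bates's multiplier idea: report the value of the time saved, not raw hours.
research_hours = 120            # hours of research the library performed
time_saved_multiplier = 3.5     # est. hours a non-librarian would need per librarian hour
attorney_hourly_rate = 250.00   # hypothetical blended hourly rate of the staff served

hours_saved = research_hours * time_saved_multiplier
value_of_time_saved = hours_saved * attorney_hourly_rate

print(f"{research_hours} research hours -> {hours_saved:.0f} staff hours saved")
print(f"Estimated value to the organization: ${value_of_time_saved:,.2f}")
```

The count (120 hours) becomes a dollar figure ($105,000 saved), which is the kind of impact statement stakeholders actually respond to.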

And last but not least, as Santa supposedly covers the globe in a single night, here’s a little international flair to bring us home again. Did you know that the International Federation of Library Associations and Institutions (IFLA) has a Statistics & Evaluation Section, and even an e-Metrics Special Interest Group? Did you know that this section created a Library Statistics Manifesto in August of 2008 to serve as a formal statement of the importance of library statistics? And, more importantly, did you know that they keep an up-to-date bibliography on the “Impact and Outcome” of libraries, including resources on impacts on information literacy, academic success, society, electronic services, and more? The IFLA Section even joined forces a few years ago with some other major players to develop and test a new set of statistics that could be used by libraries worldwide. Collecting these statistics regularly on a national basis could provide reliable and internationally comparable data on both library services and library use.

And that brings us to the end of our stocking. At least it wasn’t an orange stuffed in the toe, right? Happy holidays to you all, and as always, feel free to send any questions or topics you’d like for me to cover in the future to amoye@charlottelaw.edu. Nothing makes my day brighten quite like hearing from a reader.

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the December 2014 issue and is reprinted here with permission.



OMGMetrics


When I first proposed beginning a column on metrics, it seemed like a common-sense notion. In fact, the proposal practically wrote itself. Library metrics are the hottest of topics, as we’re simultaneously a service industry and an industry whose value to patrons and communities is difficult to quantify. This results in our necks traditionally being among the first on the chopping block during cuts, and our staff and supporters constantly fighting for more allocated resources. Qualitative anecdotes don’t defend our worth effectively in this business-savvy, metrics-driven world, nor do they ensure that we’re maximizing value for our patrons in our expenditure choices.

As a true librarian at heart, once the column was approved, I started my research. Often when beginning research, you cast your first net with extreme caution, prepared to be buried under a towering mound of inaccurate or inapplicable results. Surprisingly, despite the importance and value of library metrics, I discovered they aren’t touched on with anywhere near the frequency you’d expect. Why this phenomenon? I have some ideas.

Let’s face it. Librarians are rarely math-centric. I learned this as an MLIS student with an undergraduate degree in actuarial science. While classmates with similar majors could bond over their commonalities, I always felt a little lost – who needs a math librarian? Further into my library school career, I was swept up into Technical Services librarianship when I came in for a part-time reference desk job interview at my legal resources professor’s workplace and the Technical Services Director saw math featured prominently on my resume. She immediately usurped my reference interview and stole me away to the land of backlogs of Westlaw and Lexis bills, much to my delight. In retrospect, I don’t even remember interviewing formally. You say “statistics,” and librarians’ ears perk up. You say “I like numbers,” and their eyes light up. Then, they hand you a stack of papers covered with numbers and run before you can hand it back.

Yes, people who have bad memories from their math classes growing up are often squeamish around things number-related. While I understand that fear completely, library metrics are completely different. Hence, one of my goals at the outset of this column is to help our amazing group of technical services law librarian readers realize that hearing the word “metrics” is not synonymous with “panic.”

To begin, let’s go over some basic concepts and vocabulary regarding metrics and their uses in libraries. First, not all metrics are created equal – for example, they: (1) use different collection and evaluation methods; (2) speak to different audiences; and (3) serve different purposes. Understanding the breadth of this topic is the first real step in creating and tracking functional metrics, which can then effectively communicate value and aid in decision-making. There are many things you can measure in the library, falling into the general categories of inputs, processes, outputs, outcomes, and impacts.

“Inputs” is a fancy name for the resources used to produce or deliver a program or service, like staff, supplies, equipment, and money. Through processes, these inputs become outputs – the resources and services that you produce, including your available materials and the programs you organize and host. Input and output tracking gives you those first-glance statistics, easy to count, measure, and report, as these are tangible things. Outputs are usually what get reported to stakeholders or decision makers, e.g., we check out this many books, we have this many research guides, or this many people use the library. However, these metrics don’t accurately demonstrate the value of our services and our products.

And here’s where outcomes and impacts come in. I tend to agree with the school of thought that outcomes and impacts are the same thing, seen from different perspectives. Outcomes are changes from the perspective of our customers, and impacts are the same changes from the perspective of a stakeholder – usually higher-level changes with long-term effects on the larger community. These metrics are known by quite a few names, including impact metrics, performance metrics, and outcome metrics, and are primarily intangible, making them much more difficult to measure. Naturally, they also communicate the most value and provide the most guidance and support.

Let’s be clear. Metrics are different from statistics, and for that matter, so is data. Just because you did poorly in your statistics class or didn’t score highly on the quantitative section of the GRE doesn’t mean that you should run from data or cringe when “metrics” is bandied about in a meeting with stakeholders or decision makers. Formally, data consists of the qualitative or quantitative attributes of a variable or set of variables, typically arising as a result of measurement. Statistics don’t even come into play until you study the collection, organization, and interpretation of this data. Even better, in the library world, statistics don’t necessarily require the use of Greek letters or convoluted equations. Most statistics, measures, and metrics can be organized into operating metrics, customer and user satisfaction metrics, and value and impact metrics.

Operating measures and operational statistics (such as how many people came to the library, how many check-outs took place on a certain day, and how many hits we had on a database) lend themselves well to understanding resource allocation, improving efficiencies, and making budget determinations. Customer and user satisfaction metrics, on the other hand, tell us how well the choices we made based on those operating measures are working, and indicate what improvements may be required. Value and impact measures are incredibly meaningful in their own right, as they often incorporate satisfaction and the importance of separate outcomes. These are the most elusive of all measurements; so naturally, they’re the most valuable.

Martha Kyrillidou, senior director of the Association of Research Libraries’ statistics and service quality programs, once said, “what is easy to measure is not necessarily what is desirable to measure.” This is such a true observation regarding metric gathering in libraries – easy measurements rarely result in meaningful statistics, meaning one of your first challenges is figuring out how to make the things you choose to measure meaningful. Simply put, a meaningful measure shows you how much value you’re getting out of your investment. This could mean the investment in the library itself and the value that the stakeholders or decision makers are getting out of that investment, or it could mean what sort of value your customers are getting out of how the library chooses to invest its resources, both in terms of financial outlay and in terms of staff time. To determine meaningful measures, you need to understand your stakeholders or decision makers, and you need to understand your customers.

For instance, quantitative resource usage information doesn’t show how or why users are using materials, or even indicate how satisfied they are with the products. Relying solely on quantitative data, such as a basic measure of number of hits, isn’t necessarily enough to justify value to stakeholders and customers. Our most popular blog post at the law school, according to easily generated WordPress statistics, is one featuring a cartoon sun. Looking at the numbers and reports, you’d assume this was an incredibly popular post and maybe even assume it contributes a lot of value. However, this particular post features a metadata tag for “cartoon sun,” and one of the most searched keywords that leads people to our blog is – you guessed it – “cartoon sun.” Here, it’s obvious that a simple number doesn’t communicate actual value to our customer base or to our stakeholders and decision makers.

Similarly, one database may feature twice as many hits as another when comparing generated usage reports, but that could be because it has a convoluted interface (possibly even designed for the sole purpose of generating inflated hits). Again, just because it’s an easy measure doesn’t mean it’s meaningful. Qualitative data, such as patron survey feedback and user experience testing, provides the context within which to view these numbers. This often means using a hybrid approach of both quantitative and qualitative data.

So there you have it. The metrics world is wide and wild, and this column will do its best to shine light on as many parts of it as possible. In addition to detailed discussions of the general metric concepts already mentioned, additional topics will include collection methods, statistical concepts in a nutshell, resource usage statistics, COUNTER and SUSHI, collection and transactional statistics, consortia challenges, web metrics, altmetrics, faculty support, law firm and public law library metrics, performance indicators and benchmarks, as well as discussion of tools for presentation and manipulation of data.

I’m still figuring out how best to approach the column to meet the needs of our audience, and since the next issue is devoted to American Association of Law Libraries (AALL) Annual Meeting program reports, this column won’t reappear until fall. I’d love to hear any suggestions on format and approach, any questions you’d like for me to attack, or any topics you’d like for me to cover. Shoot me an email at amoye@charlottelaw.edu, and let me know what you think!

~Ashley Moye~

Technical Services Law Librarian (TSLL) is an official publication of the Technical Services Special Interest Section and the Online Bibliographic Services Special Interest Section of the American Association of Law Libraries. This article originally appeared in the June 2014 issue and is reprinted here with permission.



What’s in a Day as a Charlotte Law Librarian?

We are excited to announce that the Charlotte School of Law Library won First Place in the “Best Video” category of the American Association of Law Libraries’ 2014 Day in the Life contest!

librarians on patrol: no book left behind

In Spring of 2013, in preparation for our impending move to a high-rise in uptown Charlotte, we began a massive book giveaway initiative to rid the collection of redundant materials, free up space, and share these resources with our law students and local legal community. Through this project over thirteen thousand books found loving families, but in the midst of the madness, a few books ended up scampering away that needed to come back home. Enter the Librarians on Patrol – in October, six of our staff, both strong and brave, took a trip in a U-Haul across state lines to find our babies and bring them back so they could be stored, wrapped and transferred to our new library shelves come January 2014.

Featuring: Aaron Greene, Ashley Moye, Brian Trippodo, Cory Lenz, Kim Allman & Minerva Mims

Filmed October 11, 2013 in Charlotte, North Carolina and Rock Hill, South Carolina

“Addy Will Know” courtesy of SNMNMNM – snmnmnm.bandcamp.com/

~Ashley Moye~



The New OrgSync Interface

OrgSync is releasing a powerful new update to improve your user experience. This redesign features better organization, simplified navigation, and added functionality.

The redesign release is planned for July 22, 2014. We will notify you if this date changes.


And check out the new OrgSync iPhone app, which makes it easy to access what you need, when you need it!

While OrgSync is already mobile-friendly and can be accessed through all mobile web browsers, the new iPhone app can assist iPhone users in the following tasks:

  • Find and join organizations
  • Read the latest campus news
  • Discover and RSVP to events
  • Access campus information, bookmarks and forms
  • Connect with peers and organizational leaders
  • Contribute to discussions

For more information, and to download the app, visit http://www.orgsync.com/mobile.

~Ashley Moye~

