User Experience

[Video: "Medieval helpdesk with subtitles"]

Dickson-Deane and Chen (2018) write that "user experience determines the quality of an interaction being used by an actor in order to achieve a specific outcome." Parush (2017) highlights relevant terms like human-computer interaction (HCI) and usability. Let's say, then, that HCI encompasses the entire domain of interaction between people and computers and how that interaction is designed, whereas user experience (UX) focuses on the quality of that interaction, or of an interaction with anything (like a product). These are not precise definitions, and some might use the terms UX and HCI interchangeably. As ERM librarians, though, it is our job to focus on the quality of the patron's experience with our electronic services, and this entails understanding both the systems and technologies involved and the users who interact with them.

Dickson-Deane and Chen (2018) outline the parameters that are involved with UX. Let me modify their example and frame it within the context of an ERM experience for a patron:

  • Actor: a user of the web resource, like a library website
  • Object: the web resource, or some part of it
  • Context: the setting. What's happening? What's the motivation for use? What's the background knowledge? What's the action?
  • User Interface: the tools made available by the object as well as the look and feel. More specifically, Parush (2017) states that "the user interface mediates between the user and computer" and it includes three basic components:
    • controls: the tools used to control and interact with the system: buttons, menus, voice commands, keyboards, etc.
    • displays: the information presented to the user or the hardware used to present the information: screens, speakers, etc.
    • interactions and dialogues: the exchange between the system and the user including the use of the controls and responding to feedback
  • Interaction: what the actor is doing with the UI
  • (Intended/Expected/Prior) User experience: the intended or expected use of the object. The user's expectations based on prior use.
  • (Actual) User experience: the actions that took place; the actions that had to be modified based on unintended results.

These parameters would be helpful for devising a UX study that involves observing patrons interacting with a system and then interviewing them to fill in the details. Note that the same kind of systematic thinking can be applied to evaluate other user experiences, like those between a librarian and an electronic resource management system. Often the focus is on patron user experience, but it's just as important to evaluate UX for librarians and to consider UX when selecting an ERM system or an ILS.
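
To make these parameters a bit more concrete, here is a minimal sketch of how they might be turned into a structured observation record for a small UX study. This is only an illustration: the field names and the example session are mine, not Dickson-Deane and Chen's, and you would adapt them to whatever you are actually observing.

    from dataclasses import dataclass, field

    @dataclass
    class UXObservation:
        """One observed session, organized by the parameters above."""
        actor: str                    # who the user is
        obj: str                      # the web resource, or part of it
        context: str                  # setting, motivation, background knowledge, task
        interface: str                # controls and displays involved
        interaction: str              # what the actor actually did with the UI
        expected_experience: str      # what the user expected, based on prior use
        actual_experience: str        # what actually happened, including workarounds
        follow_up_questions: list[str] = field(default_factory=list)

    # A hypothetical session, recorded during observation and completed in the interview.
    session = UXObservation(
        actor="Graduate student, comfortable with Google Scholar",
        obj="Library discovery layer and its full-text links",
        context="Looking for a known article cited in a syllabus",
        interface="Search box, facet menu, 'Find it' button",
        interaction="Searched by title, clicked 'Find it', landed on a publisher error page",
        expected_experience="One or two clicks from citation to PDF",
        actual_experience="Gave up after the error page and emailed the instructor for the PDF",
        follow_up_questions=["What did you expect the 'Find it' button to do?"],
    )

    print(session.actual_experience)

Recording sessions in a consistent structure like this makes it easier to compare intended and actual experience across participants when you sit down to analyze the results.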

In any case, these parameters help us step through and highlight the complicated process of interacting with a computer generally, or with a library resource more specifically. As with many other topics we've discussed this semester, we can also incorporate these parameters into a workflow for evaluating UX. It is because of these complexities, and because of our tendency to focus on the systems themselves, that Pennington (2015) argues for a more UX-centered approach to library website design. To see the force of that argument, think about your own state of knowledge of ERM before you started this course. A number of you have explored in depth how link resolvers function, and your experience using them as patrons, combined with your understanding of their technical aspects as librarians, makes you more likely to identify the cause of a malfunction when you encounter one. And once you can suss out an issue, it becomes easier to solve, and the experience itself involves less anxiety. Most users and patrons of these systems, however, will not have any technical knowledge of them, and when the systems break, their frustration with the experience might lead to unfortunate outcomes: they may not retrieve the information they need; they may not reach out to a librarian; they might stop using the library's resources in favor of something of inferior quality; and so forth. We need to remember that this happens and, where possible, design for such failures (because systems will fail). How might we, for instance, design an action to occur when a link resolver or a proxy server fails to identify and retrieve a work? A small sketch of one possibility follows.
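
To make that last question concrete, here is a minimal sketch of one possible answer, written against purely hypothetical URLs: neither the resolver endpoint, the interlibrary loan form, nor the function name comes from any real product, and a production version would live in your link resolver or proxy configuration rather than in a standalone script.

    from urllib import error, parse, request

    # Hypothetical endpoints; substitute your own resolver and ILL form URLs.
    RESOLVER_BASE = "https://resolver.example.edu/openurl"
    ILL_FORM = "https://library.example.edu/ill/request"
    REPORT_LOG = "link_failures.log"

    def resolve_or_fallback(citation: dict, timeout: int = 10) -> str:
        """Try the link resolver; on failure, log the problem and hand the
        user a pre-filled interlibrary loan request instead of a dead end."""
        query = parse.urlencode(citation)
        target = f"{RESOLVER_BASE}?{query}"
        try:
            with request.urlopen(target, timeout=timeout) as resp:
                if resp.status == 200:
                    return resp.geturl()  # the resolved full-text URL
        except (error.URLError, TimeoutError) as exc:
            # Record the breakage so staff can troubleshoot proactively.
            with open(REPORT_LOG, "a", encoding="utf-8") as log:
                log.write(f"{target}\t{exc}\n")
        # Fallback: send the patron somewhere useful rather than to an error page.
        return f"{ILL_FORM}?{query}"

    # Example: a citation the resolver may or may not be able to satisfy.
    print(resolve_or_fallback({"title": "Understanding User Experience", "year": "2018"}))

The particular code matters less than the design decision it encodes: when the resolver chain breaks, the patron gets a next step, and the library gets a record of the failure.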

Here's the crux: as you gain more skill and expertise with these systems, you will eventually lose the ability to see them as a novice user does, and that distance will only grow over time. It is therefore important, as Pennington (2015) argues, to gather data from users. User experience research nurtures a user-centered mindset.

The second article, by Pennington, Chapman, Fry, Deschenes, and McDonald (2016), is a contribution from four librarians with theoretical and practical suggestions for UX research. Both Pennington and Chapman note that the best way to measure UX or detect UX issues is to conduct research with users (recall our conversations on proactive troubleshooting), but this isn't always possible due to financial, time, or other constraints. There is, however, a wealth of published research on UX, and when you cannot conduct your own, locating prior research and applying it to your setting is paramount. Drawing from the literature, Chapman describes several important UX principles, including:

  • chunking
  • highlighting and prominence
  • KISS (keep it simple)
  • choice simplification
  • choice reduction

In addition to user studies and prior research, libraries possess a wealth of data to explore, and this data can provide a lot of insight. The caveat Chapman described still applies: only if you have time to analyze it. But Fry points out that you can access library data about user location, device type, resource use, and more. (There are some privacy issues involved here, and we'll talk about them at the end of the semester.) This is also where the Browning (2015) article is helpful: whereas Fry describes what we can learn from data about usage, Browning describes what we can learn from data about breakage. Both kinds of data can offer a substantial understanding of user experience.
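
As a sketch of what a do-it-yourself look at that data might involve, here is a minimal example that tallies usage and breakage from an exported access log. The CSV layout and column names (resource, device, status) are assumptions made up for the example, not the format of any real proxy or vendor export.

    import csv
    from collections import Counter

    usage_by_resource = Counter()   # how often each resource is reached
    usage_by_device = Counter()     # desktop vs. mobile, etc.
    errors_by_resource = Counter()  # where access breaks down

    # "access_log.csv" stands in for whatever export your proxy server,
    # discovery layer, or vendor platform actually provides.
    with open("access_log.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            resource = row["resource"]
            usage_by_resource[resource] += 1
            usage_by_device[row["device"]] += 1
            # Treat any non-2xx HTTP status as a failed access attempt.
            if not row["status"].startswith("2"):
                errors_by_resource[resource] += 1

    print("Most used resources:", usage_by_resource.most_common(5))
    print("Devices:", usage_by_device.most_common())
    print("Most breakage:", errors_by_resource.most_common(5))

Even a tally this simple can point your UX attention toward the resources where patrons hit the most friction.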

If you can conduct a user study, then Deschenes offers helpful tips on recruiting users. Remember that the users that you recruit should be actual users of the systems. If you are interested in why some segments of the population do not use the library's resources, then that would be a different kind of study.

I agree with McDonald that, despite around 30 years of experience with web-based electronic resources, and more with earlier electronic resources that predate the web (delivered over the early internet, on optical discs, and so on), we are still in the throes of disruption. There's a lot left to learn about design for the web, just as there's a lot left to learn about how to design a home or office, and nothing will be settled for a while. Although I doubt there will be any single dominant user experience or user interface, since there are many cultures, backgrounds, and experiences, I'm fairly sure the low-hanging-fruit problems will get worked out eventually. Remember, though, that 95% of all of this is due to copyright, which necessitates the entire electronic resource ecosystem and the complications introduced by having to work with vendors who work with different, but overlapping, publishers. If something were to change about copyright, it would be a whole new ballgame.

On a final note, you might be wondering how information seeking is related to HCI and UX. Anytime we interact with a computer (broadly speaking) in order to seek information, we have an overlap. But there are areas of non-overlap, too: we don't always use computers to look for information, and we don't always look for information on computers. UX is like this as well; it is not always about computers but can be about user experience generally. I bring this up because if you do become involved with UX work at a library (or elsewhere), I'd encourage you to also refer to the information seeking and related literature when it's appropriate to do so. Remember, it's all interconnected.

References

Browning, S. (2015). Data, Data, Everywhere, nor Any Time to Think: DIY Analysis of E-Resource Access Problems. Journal of Electronic Resources Librarianship, 27(1), 26–34. https://doi.org/10.1080/1941126X.2015.999521

Dickson-Deane, C., & Chen, H.-L. (2018). Understanding User Experience. In M. Khosrow-Pour (Ed.), Encyclopedia of Information Science and Technology (4th ed., pp. 7599–7608). IGI Global. https://doi.org/10.4018/978-1-5225-2255-3.ch661

Parush, A. (2017). Human-computer interaction. In S. G. Rogelberg (Ed.), The SAGE Encyclopedia of Industrial and Organizational Psychology (2nd ed., pp. 669–674). SAGE Publications, Inc. https://doi.org/10.4135/9781483386874.n229

Pennington, B. (2015). ERM UX: Electronic Resources Management and the User Experience. Serials Review, 41(3), 194–198. https://doi.org/10.1080/00987913.2015.1069527

Pennington, B., Chapman, S., Fry, A., Deschenes, A., & McDonald, C. G. (2016). Strategies to Improve the User Experience. Serials Review, 42(1), 47–58. https://doi.org/10.1080/00987913.2016.1140614