Workshops

Sharing experience with LibQUAL+ (Monday 11:00 – 12:30)

Analysing LibQUAL+ results is one thing, but before that can happen choices have to be made about the survey, which then has to be run and closed before analysis begins. There are some things we may do without thinking: for example, why do we ask staff for their opinions if we don't use that data? This shared-experience workshop will give us the opportunity to do just that – share concerns, share questions and even face some fears.

Eileen Theodore-Shusta, MAS, MLS

Director of Planning, Assessment & Organizational Effectiveness; Certified Executive & Leadership Coach; Ohio University Libraries.

Norman Boyd

UX & Quality Coordinator, Anglia Ruskin University

Moving to a balanced scorecard: The Manchester Experience (Monday 13:30 – 15:00)

In 2016, the University of Manchester Library moved from using a management dashboard for supporting leadership meetings to a balanced scorecard approach, with the specific aims of improving data-driven decision making at management level and encouraging senior managers to consider the relevance of what was being tracked and measured against the University's strategic objectives. A series of interactive events was promoted to all Library staff in order to overcome the initial lack of knowledge around the theory and aims of scorecards. From these events, a draft scorecard was produced for further refinement by the Library Leadership Team, and the first scorecard was produced to cover the 2015/16 academic year. Having established the top-level scorecard, the next step of the process has been to cascade the balanced scorecard approach across the Library, with the aim of having this completed by July 2017.

The process followed in creating a bespoke scorecard from scratch for the Library’s Leadership Team and the subsequent process used to cascade the scorecard throughout the Library will be explained, with an opportunity for those present to share their own experiences and thoughts on the use of balanced scorecards (including visualisation software) in an academic library setting.

By the conclusion of the workshop, participants will:

* possess a basic understanding of balanced scorecard theory

* understand the problems and issues that arise when applying and adapting scorecard theory in a public sector setting

* be able to reflect on how to move senior decision-making towards a culture that is increasingly data-driven

* be able to evaluate potential quadrant headings, measures and targets specific to a library setting

* understand the link with the wider institution’s strategic planning aims

* be able to evaluate various methods of engaging staff in the process and of achieving staff buy-in to the scorecard approach

* understand the specific issues arising when cascading a scorecard approach across the whole library

* understand some of the practical issues that arise when adopting a balanced scorecard

Peter Wadsworth

Executive Assistant to the Librarian and Project Officer for the initial Library Management Dashboard project in 2012. The EA role now includes responsibility for the creation of the quarterly balanced scorecard for the Library’s Leadership Team and the collation of other statistics within the Library.

Simon Bains

Head of Research Services and Deputy Librarian, University of Manchester Library.

Introduction to Assessment Methods: Surveys, Interviews, and Focus Groups (Monday 15:30 – 17:00)

This workshop focuses on how to create and conduct surveys, interviews and focus groups. Through a mix of lecture and interactive activities, it will discuss features common to these three tools, including informed consent, bias, invitations to participate, scope, and sample size.

The workshop will then split the remaining time between surveys on the one hand and interviews and focus groups on the other. Survey-specific topics will include the key features of writing good survey questions, including response choices, format, demographic questions, and examples of question types. The section specific to interviews and focus groups includes discussion of the utility triangle, question writing, question asking, group dynamics, interview strategies, recording, and location. Participants will be provided with a workbook of activities to guide them through the workshop and to serve as a resource for future use of these tools.

Participants are encouraged to bring one or two assessment projects where these three tools may be of use. The intent is to provide participants with solid ideas and understanding that they can apply to library assessment the very next day. While this workshop is intended for those wishing to learn how to use these assessment tools, all are invited to improve their skill set and to share their expertise and experience.

Holt Zaugg

Holt Zaugg completed his PhD in Educational Inquiry, Measurement, and Evaluation at BYU. He is currently the Assessment Librarian at the Harold B. Lee Library at BYU, where he designs, conducts, and analyzes a wide variety of evaluations, including collaborations with other faculty and students on campus, and disseminates their findings.

Creating a Data Inventory (Tuesday 11:00 – 12:30)

A data inventory is a document/database that describes the data that an institution gathers: what data it collects, how and when it is collected, why and how it’s used, and where it is stored. In most cases, a data inventory is not a place to store the collected data itself, but rather to track information about the collected data. This workshop will show attendees how to create a data inventory and use it to encourage a culture of assessment in their institutions.

Data inventories are useful tools that can help:
• Provide a regular schedule for data collection
• Streamline data collection and increase efficiency
• Train new staff in data collection processes
• Track what data is used and by whom
• Easily update collection processes

Data inventories are useful tools for many different library positions. Assessment coordinators can use them to find out what data is collected, where to find it, and whom to ask about it. Library staff who collect and report data can use a data inventory to check on what data they should report, how often, and how to collect it. A data inventory is also useful to communicate the purpose and use of data to all library staff. Library administrators can use it to check for gaps in data coverage in a given library area.

This workshop prepares attendees to create a data inventory: clarifying its purpose, forming questions for data providers, and creating a project timeline. Attendees will identify stakeholders, create questions that prompt reflection on how data is gathered and used in their organization, and examine library goals to brainstorm related data points.

Starr Hoffman

Starr is Director of Planning and Assessment at the UNLV Libraries, where she assesses many activities and leads strategic planning. Her research examines the impact of academic libraries and their role in higher education. She is the editor of Dynamic Research Support for Academic Libraries.

Ethnographic Observation (Tuesday 13:30 – 15:00)

Donna Lanclos

Donna is an anthropologist working to inform and change policy in higher education, in particular in and around libraries, learning spaces, and teaching and learning practices. She blogs about this and other work at donnalanclos.com, and you can also find her on Twitter @DonnaLanclos.

Andrew Asher

Andrew Asher is an anthropologist and the Assessment Librarian at Indiana University, where he leads the libraries’ qualitative and quantitative assessment programs and conducts research on the information practices of students and faculty. Asher specializes in utilizing ethnography in libraries, and is the co-editor of College Libraries and Student Culture.

Wearable Eye tracking in Libraries (Tuesday 15:30 – 16:15)

One of the most exciting advances in eye tracking technologies is the development of lightweight wearable eye trackers, usually in the form of glasses, that can capture participants’ eye movements across a physical environment. Wearable eye tracking studies are still quite new; there are no example studies in Lund’s (2016) literature review of eye tracking in libraries, and only a handful in the related field of museum visitor studies. One such study compared label viewing in an art museum and a lab setting, and found that while reading time did not differ between contexts, participants viewed more total artworks in the museum context, and viewing increased with self-reported appreciation of art (Brieber, Nadal, Leder, & Rosenberg, 2014). Another study employed wearable eye tracking to compare experts’ and novices’ trajectories through museum exhibitions and found 10 distinct viewing patterns distributed across their visitors (Eghbal-Azar & Widlok, 2013). They also noted that the eye tracking data was able to show how visitors’ gaze alternated between two closely placed exhibits—a finding that would not have been possible with traditional observational methods.

This workshop will demonstrate wearable eye tracking technology and discuss how it has been used to study library exhibitions at our institution. In addition, I will explore the potential for other uses of this technology, including wayfinding and shelf studies.

Kris Markman

Kris Markman is the Director of Digital Learning & User Experience (DLUX) for the Harvard Library, where she leads collaborative efforts to develop and support proficiency in innovative digital pedagogy and evidence-based decision-making across the Library. She holds a Ph.D. in Communication Studies from The University of Texas at Austin.

Reflective Practice (Tuesday 16:15 – 17:00)

Library assessment and performance measurement have turned to qualitative approaches including narrative and ethnographic methods for more meaningful demonstrations of the value and impact of services and facilities. Reflective practice is a core characteristic of qualitative inquiry, but has received minimal attention in literature related to assessment, evaluation and research in the library and information domain. Research methods texts in our field acknowledge the need for reflection when designing an investigation, and recognize the importance of active reflection in qualitative field work and data analysis, especially in ethnography, action research, and case studies; but they offer little or no practical guidance on how to engage in reflective practices or what it means to be reflexive and to “read” qualitative data reflexively (Mason, 2002). Discussions of assessment librarian competencies are similarly deficient, although Oakleaf (2013, p. 128) mentions reflective practice as a required ability. Gaps in the research and assessment literature of our discipline are reflected by a similar shortage of frameworks, models and tools to facilitate reflective practice in other areas of our professional life.

Other professions have domain-specific handbooks promoting reflective practice in the context of their discipline (Bulman & Schutz, 2013; Knott & Scragg, 2016; Taylor, 2010), and recent research in our field points to an urgent need for comparable provision (Greenall & Sen, 2016). Our project targets this gap, using participatory action research to explore development of a reflective practice toolkit for the library and information field, with the information literacy and library assessment communities as our primary testbeds. The proposed workshop will present examples and adaptations emerging as candidates for our envisioned toolkit, and field-test their suitability by having participants engage with our handouts, participate in reflective and reflexive activities, and evaluate prototype tools from their perspectives as assessment practitioners.

Proposed tools for the workshop include adaptations of Michael Quinn Patton’s (2015, pp. 72, 604-605) triangulated reflexive inquiry framework, which uses self-questioning to help researchers understand, articulate, and own their perspective and voice; Patton’s (2011, pp. 266-299) reflective practice process for developmental evaluation – a data-based, story-based, engagement-based interactive approach to investigating an issue or evaluating an initiative; and Jennifer Mason’s (2002) model of literal, interpretive, and reflexive reading of data, which encourages reflection and reflexivity in analysis and reporting.

As a result of attending the workshop, participants will gain:

  • fuller understanding of reflective practice and its relevance to library assessment;
  • practice in reflective thinking, reflective writing, and reflective dialogue;
  • raised awareness of tools supporting reflective practice in particular contexts;
  • continuing access to bespoke documentation, including early sight of project findings.

Sheila Corrall

Sheila Corrall is a Professor at the University of Pittsburgh iSchool, where she teaches courses on Academic Libraries, Research Methods, and Academic Culture & Practice. Her research interests include scholarly communication, collection development, the open movement, and the evolving competencies of information professionals. She has authored more than 100 publications.

Human-Centred Design: Pupils, Pop-ups and Prototyping (Wednesday 11:00 – 12:30)

Paul-Jervis Heath

Paul-Jervis Heath is a designer and innovation consultant. He's Principal at Modern Human, a design practice and innovation consultancy, where he leads a wide range of design projects, such as information systems for autonomous vehicles, smart home appliances and concepts for future libraries.

UX Interviewing Techniques (Wednesday 13:30 – 15:00)

UX, or User eXperience, uses anthropological research methods to gain insight into how users actually behave, as well as how they feel when using a space or service. Such assessment complements quantitative data gathering (such as visitor numbers or surveys) and qualitative data gathering (such as focus groups). It can provide a deeper understanding of issues such as what users value and what irritates them.

Quantitative data can answer “How often do you use it?” Qualitative data can answer “Why do you use it?” UX can answer “How does using it make you feel?”

This workshop will introduce participants to three UX techniques that can be used as interview prompts: photo tours; cognitive mapping; and love/break-up letters. Participants will learn how to undertake these techniques, how to use them in an interview, and how to analyse the results.

Frankie Wilson

Frankie Wilson is the Head of Assessment & Secretariat at the Bodleian Libraries, University of Oxford. She received her Doctorate for the development of the Quality Maturity Model and her research interests include quality in libraries, performance measurement, organisational culture, and research methodologies.

ARL Assessment Program Visioning Task Force: Consultation (Wednesday 15:30 – 17:00)

The Association of Research Libraries (ARL) embarked on a visioning process for its assessment program in February 2017. One goal of this process is to develop a forward-looking program that advances the organizational outcomes of the 21st-century research library. The Assessment Program Visioning Task Force is considering all current and potential ARL assessment-related services, including the goals, outcomes, deliverables, staff, and other resources related to the existing metrics and tools, and to the surveys in the StatsQUAL suite. This 90-minute workshop will engage attendees in discussions about some of the preliminary findings of the Task Force in order to help test them. Several of ARL's protocols, including LibQUAL+ and ClimateQUAL, will be included in the discussions. Workshop participants, as practitioners in the larger assessment community, will be contributing to the overall visioning for ARL's assessment program.

Sue Baughman

Sue Baughman is the deputy executive director of ARL. She promotes and facilitates the strategic development of ARL policies and programs. Sue engages in program development and in the practical management and coordination of the Association, leading its Assessment and ARL Academy programs and working with the ARL Board of Directors.

Elliott Shore

Elliott Shore serves as the executive director of the Association of Research Libraries, helping to envision and implement a thoroughgoing and groundbreaking Strategic Thinking and Design process leading to new directions for the Association in areas that are intended to bring coherence to the world of research libraries.