

California Academic & Research Libraries

SCIL Works 2026

Assessment is Constructed and Contextual: Measuring What Matters in Information Literacy

Friday, February 20, 2026
9:00 am - 1:15 pm
Online

Please join the Southern California Instruction Librarians (SCIL) on Friday, February 20 from 9am - 1:15pm for SCIL Works 2026, a virtual, half-day conference. This year’s SCIL Works will focus on assessment in library instruction, in all its forms and nuances.

What do we mean when we say “assessment”? The answer is, as our title suggests, always constructed and contextual. A library director may define it as demonstrating the value of library instruction to campus administration. An instruction coordinator may think of using finished student work to measure the impact of a one-shot earlier in the year. A student might wonder why they’re being asked to complete so many evaluations of their learning experiences, and where that information is going. All of these perspectives, and more, are part of the assessment landscape. Definitions of information literacy are rapidly evolving; are our assessment methods keeping up?

At its best, assessment helps to answer the question: “How do we measure, reflect on, and evaluate what students are learning?” The SCIL Works annual mini-conference is an opportunity to learn about interventions, best practices, innovative pedagogy, and creative solutions in instruction and information literacy assessment.

Schedule

  • 9:00 - 9:10: Welcome and Opening Remarks
  • 9:10 - 10:25: Session Block 1 (Presentations and Show & Tell)
  • 10:25 - 10:35: Break
  • 10:35 - 11:35: Session Block 2 (Presentations)
  • 11:35 - 11:45: Break
  • 11:45 - 1:00: Session Block 3 (Presentations and Show & Tell)
  • 1:00 - 1:15: Closing Remarks and Evaluations

Presentations

Research and Practice: 20 minutes, followed by a 5-minute Q&A
Show & Tell: 10 minutes, including Q&A

Session Block 1

9:10 - 9:40 Presentation
Show Me: Performance-focused Assessment in One-Shot Instruction
Dominique Turnbow, Instructional Design Librarian, University of California, San Diego
Amanda Roth, Reference Coordinator, University of California, San Diego

Instruction librarians rarely have access to the student coursework needed to evaluate the effectiveness of a workshop or to assess student learning resulting from a one-shot instruction session. In this session, we will demonstrate how we have used Google Forms to gather actionable student output from library instruction sessions in order to make meaningful changes to instruction content (assessment) and/or delivery (evaluation). We will share examples of how we have done this for tutorial content and curriculum-integrated one-shot workshops, ranging from small capstone courses to large (300+ students) writing program courses.


9:40 - 9:55 Show & Tell
Assessment in the Moment: Using Shared Documents and Reflective Surveys to Measure What Matters in One-Shot Instruction
Katie Perry, Social Sciences Librarian, California State University, Los Angeles

Assessment in one-shot library instruction is often shaped by significant constraints: limited time, varied student preparation, and increasing pressure to demonstrate impact without contributing to assessment fatigue. This session shares a paired, low-stakes assessment approach designed to be both pedagogically useful and ethically grounded.

During one-shot sessions, students participate in a collaborative, shared document activity that functions as an in-class formative assessment. Students enter their research question as a complete sentence, translate it into keywords and Boolean search phrases, and add a citation to a relevant scholarly article found in a library database. Because student work is visible in real time, this activity allows the instructor to identify patterns, correct misunderstandings, and workshop research strategies live—treating assessment as instructional care rather than post-hoc evaluation.

This in-class assessment is complemented by a brief, optional post-session survey that centers student reflection rather than skills mastery. The survey focuses on perceived usefulness, key takeaways, and unmet learning needs, and is intentionally limited in scope to respect student labor and reduce assessment fatigue. Together, these methods provide a contextual picture of learning that balances immediacy, care, and consent.

Framed within the theme “Assessment is Constructed and Contextual,” this session demonstrates how modest, intentionally designed assessment practices can support student learning while resisting extractive or compliance-driven models. Participants will see how everyday instructional decisions—what we assess, how we assess, and what we choose not to measure—reflect deeper values about teaching, equity, and ethics in library instruction.


9:55 - 10:25 Presentation
Begin at the End: Using Retrospective Pre-Then-Post Tests to Assess Learner Motivation
Bridgid Fennell, Social Sciences Librarian, University of Southern California

Stress, anxiety, confidence, self-efficacy, motivation — consider how students’ dispositions relate to their engagement with information literacy. How can instruction librarians measure and support student disposition? This presentation will introduce a technique for assessing students’ dispositions toward library research. Attendees will learn about the retrospective pre-then-post design method for documenting learners’ affective domains in information literacy. When this methodology is integrated with the backward design approach to learning, librarians can holistically apply the ACRL Framework for Information Literacy to assess disposition in addition to knowledge practices.

Assessment, or the measurement of student learning, is critical to student success, but librarians may not have the time, theoretical foundation, or technical skills to incorporate assessment into information literacy sessions. The Framework for Information Literacy presents interrelated core concepts, which are often treated as one-dimensional guidelines for instructional content to be covered. However, the Framework is grounded in the backward design framework (Wiggins & McTighe, 2008), which prompts students to demonstrate their understanding of curricular concepts and skills. Backward design can also be applied to disposition. Prioritizing assessment leads to measurable objectives that document the impact of information literacy instruction on student learning and disposition.

The Framework for Information Literacy incorporates learning dispositions, or the “affective, attitudinal, or valuing dimension of learning” (2015). These motivational characteristics span the information literacy continuum, from initiating library research tasks to ethically creating rigorous scholarship. Student disposition can be measured indirectly through observation, but direct assessment through quantitative methods like the pre-post test design documents the attitudes of entire classes and objectively measures change as a result of instruction. The single retrospective pre-then-post design offers many benefits over the traditional pre-post test, such as saving time and reducing response-shift bias. Librarians can use existing or free tools like LibWizard or polling software to seamlessly incorporate the retrospective pre-then-post test method.

Session Block 2 

10:35 - 11:05 Presentation
Information Literacy and the Core Curriculum: Advising Faculty on Curriculum Revision 
Hugh Burkhart, Professor and Coordinator of Instruction and Undergraduate Learning, University of San Diego
Martha Adkins, Associate Professor and Research and Instruction Librarian, University of San Diego

The presenters’ university launched a revised undergraduate core curriculum in Fall 2024. As part of this revision, information literacy (or CILT) was moved from the Historical Inquiry area of the core to its own flag across the academic disciplines. Faculty at the university library were integral to this process, beginning with a 2019 task force that included two librarians along with other university faculty. This group drafted new learning outcomes informed by the Framework for Information Literacy for Higher Education and the American Association of Colleges and Universities’ Information Literacy VALUE Rubric. After extensive faculty review, the CILT Competency was approved. 

Upon approval, a new Core Area Representative (CAR) was added to the university’s Core Curriculum Committee (CCC). The current CAR is a librarian who collaborates with two CILT faculty fellows to review course proposals and advise faculty on revisions based on the Area Task Force report that led to the current CILT learning outcomes. To support faculty with the submission process, the CILT CAR and faculty fellows developed a guiding document outlining best practices for ensuring that learning outcomes are met and demonstrated in the syllabus, the course schedule, and, most importantly, in exercises, major assignments, and any rubrics used to assess this work. Exemplar course materials were linked to the guiding document as more submissions were reviewed and successfully recommended for approval.

Librarian involvement in the course review process has created opportunities to educate faculty on information literacy instruction and assessment while promoting library engagement. The next step for the CILT team is drafting a general CILT course rubric to clarify expectations for students and assist faculty with designing effective assignment prompts and evaluating student work. These efforts have helped establish a new focus for undergraduate library instruction at our institution. 


11:05 - 11:35 Presentation
Critical Assessment Practices in Academic Libraries: What is “Critical Assessment” and How Can Librarians Use It to Better Understand Our Relationship with Student Learning?
Tricia Lantzy, Health Sciences and Human Services Librarian, California State University, San Marcos

Implementing and maintaining effective assessment practices in academic libraries is essential to gaining a clear understanding of how library programs, instruction, and spaces impact students. In colleges and universities broadly, assessment occupies a contentious space at the intersection of administrators and faculty. Assessment measures are used by college administration to make evidence-based decisions that further student success and communicate impact, while faculty often see assessment as a hindering force that does not live up to the purported role it holds on campuses. In academic libraries, administrators use gate counts (how many people are using the library?), online statistics (how many are accessing the online resources?), and other quantitative measures to communicate a story about the value of the library to the campus community, while many librarians push back on the methods by which administrators are accessing and using this data (Beauchamp & Rawls, 2020). These conflicting perspectives on assessment require library practitioners to find a middle ground where assessment fulfills its intended purpose without harming students, faculty, or library workers in the process.

Critical assessment has been touted as a possible way to bridge this gap and shift assessment toward reflective, meaningful, social justice-based practices (Magnus et al., 2018; Hanson, 2019). The need to ensure that assessment data is gathered ethically, is actionable, and is adequately interpreted has been increasingly recognized in the higher education assessment community. Scholars in the field of academic librarianship have argued for assessment in libraries framed as a culture of care (Douglas, 2020) and through a lens of feminist pedagogy (Hanson, 2019). Magnus et al. (2018) discuss critically interrogating all parts of the assessment process to reach a practice that is rooted in social justice and equity.

This presentation will take a deep dive into the literature discussing “Critical Assessment” in academic libraries and in higher education. The presenter will share how the principles of critical assessment have been used in reimagining department-wide assessment in the Teaching and Learning Department at Cal State San Marcos. Participants will work together to brainstorm ways critical assessment practices can be applied in our day-to-day instruction sessions.

Session Block 3 

11:45 - 12:15 Presentation
Assessing for Accreditation: How to Integrate Emerging and Traditional Literacies
Amy Windham, Drescher Graduate Campus Librarian, Pepperdine University
Sally Bryant, Associate University Librarian for Public Services and Instruction, Pepperdine University
Lauren Haberstock, Director of the Genesis Lab and Librarian for Emerging Technology and Digital Projects, Pepperdine University

Pepperdine Libraries are currently undergoing our formal program assessment review process for accreditation through WSCUC (WASC Senior College and University Commission). Our presentation will provide an overview of our program-wide assessment efforts, which we have updated to include our makerspace and Special Collections in addition to both graduate and undergraduate information literacy. We recognized a shift in the way students consume information, and as a result we have implemented new assessment methods that move beyond the traditional definition of information literacy to include AI literacy, tool literacy, and primary source literacy, among others. Pepperdine Libraries utilize a variety of modes of assessment to gather feedback, including behavioral and performance-based assessment, digital tools, and written reflections. In addition, librarians developed a comprehensive survey that not only assessed student learning outcomes, but also asked students and faculty about services they would like us to offer in the future. Based on our assessment, we have noted interest in new services such as AI literacy instruction. Our presentation will highlight the importance of long-term assessment in order to identify program gaps and ways to integrate emerging technologies in classroom instruction. 


12:15 - 12:30 Show & Tell
Large Scale Assessment in an Online, Embedded Library Assignment 
Margot Hanson, Science Instruction Librarian, University of California, Berkeley
Krista Anandakuttan, Public Health & Optometry Librarian, University of California, Berkeley

Bio 1B at UC Berkeley has included a library research skills assignment for many years, and we recently transitioned from the edX platform to the Canvas LMS. It’s a large class, with approximately 700 students per semester, and the library assignment is graded. The assignment includes active searching activities for students to practice in our library catalog and a subject database, which makes grading 700 submissions a challenge! We will show and tell about automatic grading options on edX and Canvas, and we invite suggestions from attendees on any other solutions they’ve tried.


12:30 - 1:00 Presentation
Skills and Dispositions: Combining TATIL and NSSE for a Fuller Picture of IL Learning
Susan Archambault, Head of Reference and Instruction, Loyola Marymount University 
Shalini Ramachandran, Reference & Instruction Librarian for STEM, Loyola Marymount University 

Aggregate data can hide as much as it reveals. As part of reaccreditation, the library at a private, mid-sized institution administered two complementary assessments to seniors: the Threshold Achievement Test for Information Literacy (TATIL) and the NSSE Experiences with Information Literacy module. Together, these instruments allowed us to measure both skills and dispositions, providing a multidimensional picture of student learning.

This presentation shares our methodology, recruitment strategies, key findings, and how we used results to advocate for continued investment in library-integrated curriculum. Recruiting sufficient participants for statistical significance proved challenging; we deployed email campaigns, incentives, social media outreach, and promotions at campus events before reaching our target. We will share what worked, what didn't, and lessons learned for librarians planning similar assessments.

TATIL results showed seniors are "college ready" in search skills and outperformed peers at comparable institutions. Students who completed first-year courses with embedded library instruction scored higher in "productive persistence," the disposition to adapt and recover from research setbacks. NSSE results revealed strong engagement with library collections (58% frequently used them for assignments) and positive perceptions of institutional contribution to research skills. However, the data also surfaced equity gaps. Transfer students and first-generation students performed lower on several measures, and transfer students were less likely to use library resources for non-academic tasks. These findings prompted targeted recommendations for supporting underserved populations.

This presentation addresses the conference theme by examining how assessment is constructed and contextual: our choice of instruments shaped what we could measure, the reaccreditation context shaped how we communicated results, and practical constraints shaped who we could recruit. We will discuss the affordances and limitations of standardized assessments and how findings informed conversations with faculty partners and administration. Attendees will leave with practical insights into administering large-scale IL assessments, recruiting participants, disaggregating data to surface equity concerns, and translating findings into actionable recommendations.

1:00 - 1:15 Closing Remarks and Evaluation


© Copyright 2025 California Academic & Research Libraries Association. All Rights Reserved.
