Patterns in Information Literacy Instruction: What’s Really Going on in Our Classrooms?

Pearl Herscovitch, Margy MacMillan, and Sara Sharun

Pearl Herscovitch is Associate Professor/Chair, Mount Royal University Library, e-mail: pherscovitch@mtroyal.ca; Margy MacMillan is Professor/Communications Librarian, Mount Royal University Library, e-mail: mmacmillan@mtroyal.ca; Sara Sharun is Assistant Professor/Librarian, Mount Royal University Library, e-mail: ssharun@mtroyal.ca

As academic libraries articulate the value of their contribution to students’ educational experience, we are all seeking ways to capture information about our instructional activities. Librarians at Mount Royal University have embarked on a year-long project to map what is actually being taught in the more than 600 course-integrated information literacy (IL) classes we provide every year. With the cooperation of all librarians, we developed a form to record the complexities of IL instruction.1 Data from the project is critical in planning for IL programming as we develop a more strategic, program-integrated curriculum that makes better use of librarians’ expertise. We are gaining insights into areas of duplication and opportunities to develop higher-level IL sessions that address more advanced use of information. The results are already informing our work in the classroom and larger curriculum review projects across the institution. The information we are gathering will also guide librarians and the chair in balancing workload across the library.

There is very little published research describing the use of quantitative data gathered from librarians on the content of IL sessions. There is evidence that this kind of data is being collected at various academic libraries in an effort to ensure that IL instruction (ILI) programs are effective and efficient, but as yet this information does not appear to have been published in the academic literature.2 With few exceptions, program-level evaluation in the literature reports on data gathered primarily from students, generally through skill-based assessments or questionnaires. The content of IL sessions can be gleaned from what students report as ‘most helpful’ or ‘still unclear’ in these surveys, but students do not tell us what was actually taught. Assessing student learning outcomes alone cannot provide us with a complete or accurate view of a library instruction program, and yet most assessment research focuses on student outcomes as the only measure of IL program effectiveness. We can agree with Julien and Boon, who state that “[a]n emphasis on such outcomes is essential if librarians are to justify devoting institutional resources to instructional activities,”3 but a detailed examination of what we actually do to help students achieve those outcomes is equally important and has the potential to provide more direct and actionable evidence that can affect practice. In addition to data gathered from students, assessment data is also often provided by disciplinary faculty, in the form of focus groups, surveys, interviews or questionnaires. Librarian-generated data, in the relatively rare instances when it is gathered, usually takes the form of peer evaluation or self-reflective writing.4 However, there are a few recent studies that demonstrate the value of data derived from the content of library sessions.
Kessinger described a project to define how a community college’s library instruction program addressed IL skill levels throughout the curriculum by mapping IL objectives to course-specific and institution-wide outcomes. This allowed librarians to communicate more clearly with disciplinary faculty about IL, and to support their arguments for “developmentally appropriate IL instruction for particular courses” and for scaffolding IL concepts throughout the curriculum.5 Gewirtz described an instruction program evaluation that was created to inform and improve individual librarians’ approaches to instruction and to determine whether all first-year students were receiving the same type of information in their IL sessions. The realization that content was covered with a high level of consistency across first-year courses resulted in the development of first-year learning goals. These goals were then used to communicate with faculty teaching upper-year courses about what skills could be expected of their students, and where more integration of IL teaching could take place across the curriculum.6 Like these two studies, we plan to use our findings to communicate with faculty more effectively, for example by providing them with evidence of gaps or duplication, and with ideas for IL instruction and assessment activities that stretch students’ skills to higher levels rather than simply reviewing or reminding them of existing skills.

The existing body of assessment literature, which aims primarily to provide evidence of the library’s value and contribution to achieving campus-wide learning outcomes, has focused on data gathered from students and faculty, and has overlooked the importance of data about librarians’ instructional practices and ILI session content. Data generated by librarians about what actually is and is not addressed in instruction sessions can provide a program-level view of instruction. This view can be applied to program assessment, workload planning, strategic planning, and communication and outreach strategies, in addition to contributing to broader, externally-focussed reporting. Our current study demonstrates how such data can be gathered and the kinds of trends it can describe.

Mount Royal University (MRU) librarians are actively engaged in course- and program-integrated ILI, in an institution that has declared IL to be a campus-wide aim. All 17 librarians teach, and the library provides approximately 600 classes a year across most disciplines, at all levels of courses. MRU is a teaching-focused undergraduate institution; librarians are faculty and are deeply involved with committees, professional development, and program and curricular reviews. Strategic planning work in the MRU library identified a need for more data.
A survey of faculty had identified interesting patterns in what faculty felt students needed to know and in what they invited librarians to teach,7 which raised the question “What exactly were we teaching?” We had done a smaller-scale study of what we taught in General Education courses, and liaison librarians were aware of what they were teaching in their own areas, but we had little sense of the overall picture. Having developed programmatic guidelines for instruction, we were also curious to see whether what we were doing matched the intentions we had laid out in that document.8

After searching unsuccessfully for a tool we could adapt to our needs, we developed our own form to capture a wide range of both qualitative and quantitative data. The form had fields for basic data about the class, an extensive checklist of types of content and level of instruction (Introduction, Review, Advanced), and text boxes for gathering information on collaboration with faculty, preparation activities, assessment strategies, and reflection. We developed the form as a Word document, as some librarians expressed a preference for something they could write on quickly, while others wanted to complete it online.

In the fall of 2013, all librarians agreed to record data for every class they taught between September 30 and October 7. We reviewed the data, and more importantly the form, with librarians in December, and revised the form based on librarian feedback. Revisions included clarifying some definitions, adding more content options, and allowing librarians to indicate the difference between content that was planned and what was actually taught. We piloted the revised form for three weeks in the winter semester. Among other things, this was intended to give librarians some experience with completing the forms. Again, at the end of the semester, we reviewed the data, gathered more feedback on the form and the process, and formally declared that we would run the study for an entire academic year. Revisions at this stage included alterations to the form to make it easier to complete, the addition of a ‘Mentioned’ column in the checklist to cover those occasions where a skill, concept, or resource is noted but not really introduced, and a text box where librarians could record information related to the emerging ACRL Framework for Information Literacy.

In reviewing data from the pilots it was clear that the form would provide information on what we were teaching, at which level, to which students, and would identify areas of overlap, gaps, and patterns in what we taught. Immediately evident, even at that stage, was the sheer number of skills, concepts and resources we were trying to fit into some classes. We received Human Research Ethics Board (HREB) approval for the project, but in order to stay within ethical, non-coercive constraints, we were allowed access to only a subset of the data from the fall semester, anonymized so that we could not tell which librarians consented to be included. In April, we will have access to all of the data for planning purposes. In the fall of 2014, all librarians agreed to complete the forms for all classes, acknowledging the value of examining what we were teaching. Some librarians were already using their own tracking forms even beyond the pilot period to analyse their work, and several remarked on the process as useful for prompting reflection and analysis.
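To make the structure of the collected data more concrete, the following is a minimal sketch of how one completed tracking form might be represented as a record. The field names, labels, and example values are hypothetical; the actual form (see note 1) is a Word document with a checklist and free-text boxes rather than a data structure.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# Levels used on the checklist portion of the tracking form
# (the 'Mentioned' level was added after the first pilot).
LEVELS = ("Mentioned", "Introduced", "Reviewed", "Advanced")

@dataclass
class TrackingFormRecord:
    """One completed tracking form = one IL session.

    Field names here are illustrative, not the form's actual labels.
    """
    course_code: str                                # e.g. "GNED 1401" (hypothetical)
    course_level: int                               # 1000, 2000, 3000, or 4000
    semester: str                                   # e.g. "Fall 2014"
    # Checklist: topic -> level at which it was actually addressed
    topics_taught: Dict[str, str] = field(default_factory=dict)
    # Checklist: topic -> level that was planned (may differ from taught)
    topics_planned: Dict[str, str] = field(default_factory=dict)
    assignment_category: Optional[str] = None       # "A", "B", or "C" (see below)
    collaboration_notes: str = ""                   # free-text boxes on the form
    assessment_notes: str = ""
    reflection: str = ""

# A hypothetical example record
example = TrackingFormRecord(
    course_code="GNED 1401",
    course_level=1000,
    semester="Fall 2014",
    topics_taught={
        "Finding articles: databases": "Introduced",
        "Finding books: catalogue": "Introduced",
        "Evaluating information": "Mentioned",
    },
    assignment_category="A",
)
```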
Members of the research team found completing the forms relatively straightforward, but time-consuming, especially the qualitative sections. Staff members entered the data into a spreadsheet, and created a second spreadsheet that included only the HREB-sanctioned research data.

Results

In January we reviewed data from the fall semester. All 17 librarians consented to have their data included, and we have information on 302 of the 363 classes that were taught that semester. While we had asked that librarians complete tracking forms for every class, this was not enforced.

Instruction on how to find various types of resources was the main topic of our ILI sessions, included 553 times at all skill levels across the 302 sessions in the fall semester (many sessions included instruction on more than one type of resource). Articles were the most frequently mentioned resource, accounting for 54% of instruction on how to find resources. Books accounted for 36% of resource instruction, followed by data and government documents (7%) and images and A/V materials (3%). With regard to trends across course levels, figure 1 shows that while the number of instances of instruction on finding both books and articles decreases at higher course levels, articles represent an increasing proportion of our instruction activities from 1000-level to 4000-level courses, while books represent a decreasing proportion.

[Figure 1: Trends in Instruction on Finding Books and Articles through Various Tools, by Course Level (includes ebooks)]

The majority of our instruction on finding resources focused on articles, and the data on the forms indicate that the majority of this instruction privileged databases over Google Scholar or Summon, our discovery layer, as the primary means of accessing articles. The relative proportion of instruction on various tools for finding articles is presented in figure 2.

[Figure 2: Percentage of Instruction on Finding Articles Using Different Tools at All Levels of Instruction (n=299)]

Instruction on identifying, describing and evaluating different resource types was the most frequently taught topic after finding sources (addressed in 175 sessions, or 58% of our total sessions). In 77% of these sessions, the topic was addressed at the introductory skill level. Figure 3 shows the breakdown of skill level addressed at each course level.

[Figure 3: Identifying, Describing and Evaluating Different Resource Types in IL Sessions at Each Course Level (n=175)]

Evaluating information was also very frequently taught in our sessions, and was addressed at some level in 58% of the total sessions taught (see figure 4). Of these sessions, 74% addressed it at an introductory level (see figure 5).

[Figure 4: Evaluation Skills and Concepts in IL Sessions at Each Course Level (n=171)]

[Figure 5: Percentage of Each Level at Which Evaluation Was Addressed in IL Sessions (n=171): introduce 74%, mention 12%, review 11%, advanced 3%]

The majority of our instruction is at the introductory skill level. This is in part because the majority of our instruction focuses on introducing skills, tools and concepts, and takes place in 1000-level courses. However, this data suggests a significant amount of duplication of lower-level content in higher-level courses. These results may be due to the type of assignments and/or the requirements of course instructors. While the specific content of our sessions may also depend on disciplinary context (data we could not examine in this study), our preliminary analysis suggests there may be opportunities for instruction that stretches these skills in upper-level courses, instead of re-introducing topics to students.
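As a rough illustration of the kind of aggregation behind the percentages reported above, the snippet below counts topic-by-level instances across a set of records shaped like the earlier sketch. This is a sketch of the analysis rather than the spreadsheet we actually used, and the topic labels are hypothetical.

```python
from collections import Counter
from typing import Dict, Iterable

def topic_level_counts(records: Iterable) -> Counter:
    """Count (topic, level) pairs across all sessions.

    `records` are objects shaped like the TrackingFormRecord sketch above,
    each with a `topics_taught` dict of topic -> level. A session covering
    three topics contributes three counts, which is why topic totals
    (e.g. 553 instances of instruction on finding resources) can exceed
    the number of sessions (302).
    """
    counts: Counter = Counter()
    for record in records:
        for topic, level in record.topics_taught.items():
            counts[(topic, level)] += 1
    return counts

def level_breakdown(records: Iterable, topic: str) -> Dict[str, float]:
    """Percentage of instances of `topic` addressed at each level."""
    by_level = Counter(
        level
        for record in records
        for t, level in record.topics_taught.items()
        if t == topic
    )
    total = sum(by_level.values())
    return {level: 100 * n / total for level, n in by_level.items()} if total else {}

# Hypothetical usage: level_breakdown(records, "Evaluating information")
# might return {"Introduced": 74.0, "Mentioned": 12.0, "Reviewed": 11.0,
# "Advanced": 3.0}, the kind of breakdown shown in figure 5.
```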
In reviewing the relationship between research assignments and the topics addressed in corresponding library classes, we found it useful to divide the assignments into three categories:

A. complete research papers or outputs like posters and essays,
B. assignments based on parts of the research process, like annotated bibliographies or research proposals,
C. standalone in-class assignments or activities.

In some cases it was difficult to determine what kind of assignment, if any, was the focus of the instruction. Category A tended to have more topics taught per session than category B, which in turn had more than category C. In category A, the Summon discovery tool was taught more frequently for both articles and books. This category also had the highest frequency of teaching limiters as part of focused searching. In contrast, in category B, where most of the assignments were annotated bibliographies, librarians more frequently taught distinct tools such as the catalogue and article databases, and much more frequently included instruction on Boolean operators. This category also had markedly higher rates of teaching around citations and ethical use of information, likely because annotated bibliography assignments are often used in part to build student familiarity with particular citation styles. The only aspect where category C showed higher frequencies was related to discipline-specific materials and patterns in publishing, possibly because of the concentration on specific resources for in-class assignments.

Limitations

In compiling the data we note a number of limitations, predictable in this kind of exercise. As much as we discussed the definitions of Mention/Introduce/Review/Advanced as a group, it is likely that different librarians have different conceptions of these categories. Along the same line, notes on the kinds of assignments IL sessions were supporting were also often difficult to categorize. ‘Research assignment’, ‘essay’, ‘research paper’, or ‘paper’ may mean the same sort of research exercise or quite different ones, so we cannot be entirely confident about patterns of instruction related to assignments. While we may not have data for every single class librarians taught, we have a sufficient number, submitted by all the librarians, that we can assume our data accurately represents the work of librarians across disciplines. There are also some patterns in the data that we think are attributable to particular assignments, but this is difficult to verify without access to course- or discipline-specific data. For example, the frequency of assignment-based sessions (category C) that focussed on reference sources may be due to a large number of chemistry sessions that use specific tools. A final limitation is that while the fall semester may be our busiest time for instruction, it has a proportionately higher number of introductory sessions, while the winter semester, which is not captured in this data, has a higher proportion of capstone or senior courses.
Using the Data

This paper addresses the first part of a two-step data collection process. The quality assurance project will involve an analysis of more detailed data representing the entire academic year. This detailed analysis will inform a revision of our programmatic instruction guidelines for information literacy instruction at MRU, originally developed in 2010 and based on the 2000 ACRL Standards.9 Our guidelines frame library instruction in four broad categories: location, use/evaluation, reflection and understanding. Predictably, using the current data to map these categories to the topics listed on the tracking form reveals that most of our instruction falls under location and use/evaluation. A previous survey of MRU faculty identified an expectation that first- and second-year students be able to read and understand scholarly articles.10 In our library classes we address the portion of research assignments that requires students to locate and evaluate scholarly sources. Librarians have made a concerted effort to incorporate reflection on and understanding of sources in their teaching, and the current data suggests that 13% of our instruction addresses these skills. Through further analysis, we hope to determine the degree to which we need to reduce duplication in order to focus on scaffolding more complex instruction that builds on the introduction of these basic concepts.

A planned survey of students registered in capstone courses, together with the current data, will enable us to revise our programmatic instruction guidelines to more accurately reflect students’ advanced information literacy needs and program outcomes. Mount Royal University has recently released a strategic plan which includes a commitment to providing every student with the opportunity to participate “in at least one senior level research or capstone project through individual or group work, or directly engaged with faculty.”11 The library’s capacity to support this commitment may require a different balance between instruction at the introductory level and instruction at more advanced levels. Our overall goal is to integrate our previous faculty survey with the current research and the results of a survey of students in capstone courses to align the library’s programmatic instruction with university goals. This will provide new and continuing library faculty with improved tools for communicating with faculty colleagues outside the library and a more evidence-based approach to addressing gaps in our instruction program. A report based on the combined data will also help us develop future directions for teaching and research.

Notes

1. Mount Royal University Library, “Library Instruction Tracking Form.” Available at http://libguides.mtroyal.ca/ld.php?content_id=8571769

2. See for example: University of Texas Libraries, “Library Instruction Services Assessment Plan.” Available at http://www.lib.utexas.edu/sites/default/files/services/instruction/LIS%20Assessment%20Plan_May%202013.pdf; University Libraries, University of Illinois at Urbana-Champaign, “Unit Plan for Assessing and Improving Student Learning.” Available at http://www.library.illinois.edu/infolit/unitplan.html; University of Toronto Libraries, “Guide for Teaching Librarians.” Available at http://guides.library.utoronto.ca/content.php?pid=435532&sid=5050120
3. Heidi Julien and Stuart Boon, “Assessing Instructional Outcomes in Canadian Academic Libraries,” Library & Information Science Research 26, no. 2 (2004): 121-139.

4. See for example: Elizabeth Spackman Hopkins and Suzanne Julian, “An Evaluation of an Upper-Division, General Education Information Literacy Program,” Communications in Information Literacy 2, no. 2 (2008): 67-83; Meggan Houlihan and Amanda Click, “Teaching Literacy: Methods for Studying and Improving Library Instruction,” Evidence Based Library and Information Practice 7, no. 4 (2012): 35-51; Pamela Kessinger, “Integrated Instruction Framework for Information Literacy,” Journal of Information Literacy 7, no. 2 (2013): 33-59.

5. Kessinger, “Integrated Instruction Framework,” p. 33.

6. Sarah R. Gewirtz, “Evaluating an Instruction Program with Various Assessment Measures,” Reference Services Review 42, no. 1 (2014): 16-33.

7. Brian Jackson, Margy MacMillan, and Michelle Sinotte, “Great Expectations: Results from a Faculty Survey of Students’ Information Literacy Proficiency,” paper presented at the IATUL Conference, Helsinki, Finland, 2-5 June 2014. Available at http://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=2036&context=iatul

8. Mount Royal University Library, “Programmatic Instruction—Guidelines for Information Literacy Instruction at Mount Royal University.” Available at http://www.mtroyal.ca/Library/Faculty/InstructionalServices/programmmatic_instruction

9. Association of College & Research Libraries, Information Literacy Competency Standards for Higher Education (Chicago: American Library Association, 2000). Available at http://www.ala.org/acrl/standards/informationliteracycompetency

10. Jackson, MacMillan and Sinotte, “Great Expectations.”

11. Mount Royal University, “Learning Together, Leading Together: Mount Royal University’s Strategic Plan 2025.” Available at http://intranet.mtroyal.ca/StrategicPlanningProcess/ssLINK/strategicplan2025.pdf, p. 12; Mount Royal University Library, “Programmatic Instruction—Guidelines for Information Literacy Instruction at Mount Royal University.” Available at http://www.mtroyal.ca/Library/Faculty/InstructionalServices/programmmatic_instruction