Key takeaway points from this session included:
“They don’t know what they don’t know”
Vasumathi Sriganesh’s assertion (speaking of health science students) that “they don’t know what they don’t know” really hit home. Library patrons understand that they need to learn about immunology or pharmacology or epidemiology, but lack the fundamental understanding that information literacy is also a skill that must be studied and learned.
Classifying & Clarifying Resource Types
This topic came up in two different presentations. Vasumathi Sriganesh discussed the importance of identifying different types of resources for health science students and encouraging them to ask themselves: is this resource a…
- Background/factual resource – textbooks, dictionaries
- Research resource – primary sources, journal literature
- Analysis/Synthesis resource – e.g. UTD, Cochrane
Later in the session, Mark MacEachern showed an example of a table that is great for helping students understand different kinds of resources and their appropriate uses. Such tables, originally created by Dartmouth according to Mark, and since adapted for use by other libraries including Mark’s (Taubman Health Sciences Library at the University of Michigan), are effective for making abstract concepts more concrete to students. I couldn’t find a link to the table on the Dartmouth or UMich library sites, but here’s a great example from the George T. Harrell Health Sciences Library at Penn State University.
For me, the biggest takeaway from the talk by the University of Michigan presenters, Whitney Townsend and Mark MacEachern, was that flexibility is key. They found the Informatics sessions for their first-year medical students were getting poor reviews and hypothesized that this was due, in part, to the huge variation in M1 students’ undergraduate experience with research and core medical resources. Those with more research experience or more familiarity with core health science resources such as PubMed rated the sessions poorly since they were perceived as too basic or not a good use of time. To solve this issue, they made the first session on the basics optional, so that those with more research experience or knowledge of the lay of the land in the realm of medical resources could elect to skip it. These students would still be responsible for the required assignment and assessment quiz, so that if they fared poorly on these measures, it would be a strong indicator to them that they would benefit from attending. I thought the required assessment might be a good way to combat the “they don’t know what they don’t know” problem, since the assessments can give students a stark awareness of just how much they have to learn.
Improving Instruction Sessions in an Evidence-Based Way
Suzanne Shurtz’s presentation on Evaluation of Best Practices in the Design of Online, Evidence-Based Practice Instructional Modules jibed nicely with Joanne Gard Marshall’s Doe Lecture from this morning. When looking to improve the evaluation of their instruction modules (on evidence-based practice, incidentally!), the librarians at the Texas A&M Health Science library didn’t simply sit around and ask each other, “What do you think a good question would be?” or “Which of these phrasings do you think is clearer?” They started by consulting the literature to see what information and evidence was available on evaluations of online learning modules, and located evaluations they could adapt or use as a guide (including the QuADEM – Quality Assessment of Digital Educational Material). Once they designed their evaluation, they subjected it to testing to make sure it was a solid and reliable way of assessing their EBP online modules. It’s a great example of bringing evidence-based practice not just to research, but to instruction as well.