More hands-on please! That was one of the comments gathered from Pam Sieving's systematic review best practices class evaluation that she talked about this afternoon. I'm sure we have all come across that comment before in the SR and EBM classes we teach, specifically when it comes to searching. When we do offer training, we meet with glazed eyes or no eyes at all (poor attendance). Searching isn't something learnt in 30 minutes; it takes time and practice, and that is something some researchers (who sometimes are also clinicians) can't afford. That is why we need to be involved in SRs: we already have experience searching many different types of resources. Mala Mann added that many would-be SR authors have no idea how long it takes to write one. Six weeks at most? Nup! Revise that to … maybe two years.

Susan Fowler reported on a McMaster knowledge translation (KT) project that provides emergency clinicians with the tools to find clinically relevant, evidence-based information. McMaster have a suite of KT projects on their HIRU webpage. How do you know if your tool is a good indicator of clinical relevance? The BEEM rating tool was validated using bibliometrics, which demonstrated that articles with a high BEEM score were likely to be cited. However, the tool is not an independent predictor of citation rates.

Michelle Henely's presentation covered new areas for librarians to push evidence, including rounding (not that new now, though still emerging). She mentioned that many librarians say they round, but when it comes to it, the number is actually very small. So what is happening here? Do you round or not? I've been involved in audit meetings and clinical case conferences, but no rounding as yet. What do you do? Please leave a comment!