Do surveys ever tell you anything? Perhaps we could do a survey and not find out. This is a blast against the “Library Survey”. What’s the problem? Well, there are many. Here are the first twenty. Sorry, I have edited that down to five. But seriously, there are “challenges”.
Samples are biased and statistically insignificant. Who do you want to fill in your survey? Well, a statistically significant number, drawn from across my user community and representing all groups of users. OK, you can wake up now. Who do you really want to fill in your survey? Just about anyone you can coerce into doing it. So just think about that for a moment. That is probably users we know: regular library users, people we have just helped and who feel obliged to tick the boxes. In other words, a sample of users who are likely to say positive things, things they think you want to hear. We don’t hear from non-users, and we don’t hear from infrequent users or self-starters, who use the library or its resources but don’t make contact with library staff. Job not done.
The survey is used as a source of evidence, not investigation. We use surveys as evidence that activities have taken place. Deliver a course, make a change to your service, introduce a new resource, even get what is euphemistically called feedback. How do you document the benefits or effects? Do a survey. Eight out of ten users (not cats) thought it was good or very good, depending on the version of the Likert scale you use. You now have the data to make a bar chart. You have evidence. The chain of evidence goes like this: you do something; you ask people who probably don’t understand the significance of what you have done if it’s nice; they say it is. You then pass that information on to someone else who may be equally ignorant about your project. Certainly not enlightened by your survey, but satisfied that you did something. Job not done. Again.
Surveys are not good customer service. This is not an argument for not doing them at all; it is an argument for doing good ones sparingly and ditching the bad ones altogether. Bad customer service? Well, how many times have you been on a website and declined to take the survey? Put the phone down on a survey request about your boiler? Brushed past a survey being administered on the street? Started a publisher’s survey in the hope of finally securing that iPad, then realised that it is computer-generated nonsense? Not even getting the iPad will overcome your sense of irritation at being duped. Why, I wonder, do we think our users, having fought their way through this survey-contested space to the sanctuary of the library, will think any differently? Would you mind filling out our library survey? ——–. Fill in this space.
Surveys are not the only tool we have. Surveys seem to be part of the heritage of librarianship, instilled into budding professionals in library schools. Need to do a project or write a thesis? Do a survey. The check on the reality of this statement is the number of requests disseminated through email lists to complete this or that survey for my … whatever the qualification is. All well and good, but it seems to cast a long shadow over professional practice. There are other tools. Since much information is online, we can track usage across our websites with Google Analytics or a bespoke metrics tool. We can measure usage through accesses to resources and through OpenAthens. We can get usage statistics from publishers. In other words, instead of asking users what they think, we can watch the effect of our service on their actual behaviour. This isn’t new, but it is underused.
There are other ways of getting data. If you track anything in your library, say using an enquiry tracking system, you can record data as you go. Make sure you record at the point of contact what you may need later. Build the data collection exercise into your everyday activities. The words “Library Survey” may never come up.
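The idea of recording at the point of contact can be sketched very simply. This is a minimal, hypothetical illustration (the field names, file name and example values are my assumptions, not any particular enquiry tracking product): each enquiry is appended to a log as it happens, so the usage data accumulates as a by-product of everyday work.

```python
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("enquiries.csv")  # hypothetical log file
# Hypothetical fields you might want later, captured at the point of contact
FIELDS = ["date", "user_group", "enquiry_type", "resource_used", "outcome"]

def record_enquiry(user_group, enquiry_type, resource_used, outcome):
    """Append one enquiry to the log, writing a header row if the file is new."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "user_group": user_group,
            "enquiry_type": enquiry_type,
            "resource_used": resource_used,
            "outcome": outcome,
        })

# Logged in the moment, not reconstructed later by survey
record_enquiry("clinical staff", "literature search", "CINAHL", "results emailed")
```

A spreadsheet or the reporting module of an enquiry system does the same job; the point is only that the data is captured once, at the moment of contact, and can be counted later without ever asking anyone to fill anything in.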
What surveys should you do? Well, you have to do your Impact Survey, as that’s part of LQAF. Handily, the LKSL Impact Toolkit is well thought through, tried and tested, and easily implementable in SurveyMonkey. It’s designed to solicit evidence of impact, evidence that can be used to support the case for a library. Moderate completion rates should achieve the desired result.
If you aim to embark on a pedagogically sound evaluation of teaching or information skills, why not use the LIHNN Training Evaluation Survey? Job done. This time.
Where is the voice of the user? It’s usually in the invitation to leave a comment in a survey, or in the “Other” option on the answer sheet. However, you can get your vox pop comments by other means. Use the Case Study Proforma from the LKSL Impact Toolkit, or gather the evidence as it comes to you in emails, conversations and minutes of meetings. Just ask for targeted feedback when you need it.
Librarian, NWAS NHS Trust,
BA, DMS, Dip Lib, MA, MCLIP
NWAS LKS is supported by NW Health Care Libraries Unit (HCLU)