Friday 30 September 2016

Don't blame technology for bad management attitudes

Once upon a time, back when the millennium bug was a thing — or we thought it was — the then Director of Recreation & Community Services told me to put together a roadmap of the IT developments the library service should be undertaking if money wasn't a big issue. We didn't have the money, and there wasn't any immediate prospect of having it, but it would give him a sense of where we wanted to be and the opportunities we'd like to grab if they came along. The results included the usual suspects: we desperately needed to get all the libraries networked and onto the library management system; staff on enquiry desks needed access to the internet as well as the library catalogue; and internet access for the public would be good. I also said that we needed to invest in self-service circulation.

I'm now going to say something heretical, so please bear with me. There is no intrinsic value in stamping a due date in a book. There, I've said it. The value in a staff-mediated issue/checkout transaction is:
  • The borrower gets to borrow an item
  • The loan is recorded
  • There's the human interaction, which may be the only one some people get that day
  • The library staff get the opportunity to provide information about other library resources and services ("We've got that author's new book on order," "Did you know we've started having toddlers' tales sessions on Wednesday afternoons?" etc., etc.)
The value of the return/checkin transaction is similar.

To my mind the high-value parts of any transaction are the ones you need to put your resources into. Anything that takes resources away from the high-value areas needs to be designed out.

In those days our busiest library was issuing between four and five thousand items every Saturday. More to the point, six Library Assistants were issuing between four and five thousand items every Saturday. And returning a similar number, which creates a lot more work as something needs to be done with all that incoming stock. A lot of the time the queues at the counter were awful, with staff and customers both having a stressful experience. Consequently the value of the issue transaction was compromised — the loan was effected and recorded but there was no time for the human stuff: both the staff and the customers felt under pressure to get the transaction over and done with as quickly as possible. Some customers even gave up and didn't bother: they wanted to become borrowers and our process stopped it happening. If someone desperately needed that human interaction they were badly short-changed. At the returns desk it was even worse: some borrowers lost patience and just left items on the corner of the counter and in the confusion these sometimes got back onto the shelves without the return having been recorded. The rest of the week there were other, smaller, stress points and other events and activities in the library added to the mix. By this stage we'd successfully made the case for some more Library Assistant hours but there's a limit to the number of bodies and workstations that you can physically fit behind even the huge counter that was in this library.
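To give a rough sense of the scale of that production line, here's a back-of-envelope calculation. The item count and staffing come from the paragraph above; the eight-hour Saturday and the even split of work across staff are my assumptions, not recorded figures, so treat the output as indicative rather than historical fact.

```python
# Back-of-envelope arithmetic only. The 4,500 items and six Library
# Assistants are from the post; the eight-hour Saturday is an assumption.
items_issued = 4_500
assistants = 6
hours_open = 8  # assumed opening hours, not a recorded figure

issues_per_assistant_per_hour = items_issued / (assistants * hours_open)
seconds_per_issue = 3_600 / issues_per_assistant_per_hour

print(f"~{issues_per_assistant_per_hour:.0f} issues per assistant per hour")  # ~94
print(f"~one issue every {seconds_per_issue:.0f} seconds")                    # ~38
# And a similar volume of returns on top of that, before anyone has time
# for "we've got that author's new book on order".
```

On anything like those numbers there isn't a spare half-minute in any transaction for the human stuff.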

So I argued that we needed to include some self-service issue/return functionality to try and ease the burden a bit. Some people just want to be in and out, they want to borrow an item or return it and they're not much fussed about anything else. Some people would have privacy or safeguarding issues that could be addressed by allowing them to self-issue a book. Giving these people the self-service option would address their needs and also reduce the queue, allowing more time for the people who did need the human stuff at the counter. The director, whose background was adult and community education, was enthusiastic about the idea: "You mean that we could get the library staff off that production line at the counter so that instead of stamping books they could be doing something more interesting like helping people to find things and getting them interested in something new?" Yes, we could have.

Some years, and a couple of directors, later we were in a position to credibly rattle the begging bowl to fund self-service circulation. The world had changed somewhat. Library managers nationwide had picked up the idea that this functionality was a way of saving money on staffing and particularly cutting Library Assistant hours. Although I got cross when my library managers saw self-service as an opportunity to cut staffing costs I couldn't really blame them as individuals: they were coming late to a game that already had this established narrative. It was a massive pity but there we were. We weren't alone. And as Austerity took its toll self-service circulation became one of the quick fixes — I know of one library authority that cut staffing hours on the basis that kiosks were going to be installed at a couple of libraries and the next year cut the hours further because the selfsame kiosks had been installed. So self-service kiosks replaced staff instead of freeing them up to do other, more important, things.

The point to this story is that the technology wasn't to blame. Technology is never neutral — design must have an end in view — but the way that it is used and the consequent outcomes are largely down to human decision. In this case the opportunity to enrich the very many parts of the public library service that aren't the issue and return of books was passed over because neither the staff nor the service were being valued by "Professionals" in managerial positions. It wasn't a decision forced on them by outside forces, it was one they came to themselves collectively at the turn of the millennium and which they then transmitted to the people who hold the purse strings. Which makes it deuced hard for people now arguing the case that public libraries have never been only about issue counts: if front-line staff can be replaced by one-trick-pony kiosks then all that other stuff can't have been all that important, can it? Well, yes it was and yes it is, and it's scandalous that enabling technology's been abused in this way.

Library authorities are repeating this mistake with technologies such as Open+. Open+ is a good way of extending the use of a building and some of its resources. Many of the running costs of a building are incurred whether or not that building's in use: the fabric of the building deteriorates regardless, and the lights may be out but you'll need the heating on sometimes unless you fancy having a lot of burst pipes in winter. So it makes sense to maximise the return on this investment in running costs by maximising the building's availability for use. Especially if that use is currently particularly limited: if you have a building that only gets ten or twenty hours of useful use a week then that's a huge waste. In these cases using a technology like Open+ makes sense: it allows access to the building as a community venue or a quiet study space and you could make stock available for self-issue/return, thus extending the reach of part of the library service. What it doesn't do is replace the shedload of other stuff that gets delivered — or should be delivered — by the library service in that building. It isn't a replacement for a library service, it just extends some people's access to some of those parts of that service that can be passively delivered.

Anything else is yet another abuse of library technology.

Monday 12 September 2016

What do visitor counts tell us?

Why did I get into a bate about visitor figures the other day? It's largely because of the spate of recent reports and commentaries about the decline of the public library service based on these numbers. I think there is an over-reliance on what is, after all, pretty rubbish data.

Visitor counts are not measures of use. They are an approximation of — sometimes a wild stab at — the number of people who entered a building. If you were to tell me that visitor counts have declined by 30% over a given period I'd take your word for it. Personal experience and observation suggest that fewer people are in some (not all) of the libraries I visit than there used to be, so you may be right. And the closing of libraries and nibbling away at opening hours over the past quarter of a century won't have helped any. But I'd be extremely sceptical that you had any forensic evidence to back up your percentage.

Does it actually matter that there are fewer visits? If I can sit in my living room and reserve a book then go and visit the library to pick it up I have immediately cut down the number of visits by at least 50%: the trip to place the reservation and the trip to collect the book have become one. But the library has delivered the same service, and much more conveniently for me. While I'm in the library I can still avail myself of all its other services and indulge in a bit of serendipitous discovery amongst the shelves, but I am not compelled to make an earlier visit with the sole purpose of queuing up at the counter to ask for a reservation to be placed.

Ah but issue figures are going down as well… And? Public libraries never only issued books. Literally an infinitely greater number of people use the public PCs in the library than they did in 1964. Do we have fifty years' worth of attendance figures for story times and author visits? Do we have decades' worth of comparative data on use for quiet study? Or any and all of the other stuff? How do we know that libraries aren't having fewer but richer visits?

But we're delivering less of a service… Are you capable of delivering a better quality of service now? Did you overstretch yourselves in the past and sometimes end up shortchanging your customer service? Were you giving one minute of your time to people who needed five? Were people put off asking for help because they saw that you were busy? High throughput isn't always a measure of high quality.

But visits are down… Do we have annual totals for the number of people who walked through the door, saw the length of the queue at the counter and thought: "I'll come back later?" No, we don't.

Which is why I got in a bit of a bate about it.

Saturday 10 September 2016

Please can we stop pretending library visitor counts are performance data?

From an operations management perspective there is some use for library visitor count data:
  • It's useful to have an idea of when your peak throughputs occur so that you can deploy resources accordingly.
  • When you're designing new library spaces it's good to have a rough ballpark figure of throughputs for designing in capacity and customer flows.
  • It's good for morale to be able to acknowledge and celebrate when you've safely handled a significant number of visitors.
From a service management perspective there is one use for library visitor count data:
  • It's a salutary reminder of how little you know of your customers if the only available data are about loans and PC sessions.
Otherwise, they're not a lot of use. You see, the thing about visitor counts is that most of them are poor data and from a service delivery perspective all of them are meaningless.

Why are so many visitor counts poor data?

We'll assume that everyone's acting in good faith and nobody's playing the numbers.

If you've got an automated visitor count system and you're running your library as a stand-alone service you have the best chance of having good visitor count data. A lot depends on the way the count is done; the positioning of the counter; and the frequency of data sampling and verification.
  • A badly-placed counter can miss visitors — a counter set at adult head height could miss all the children, for instance, or the customer flows could by-pass the counter completely.
  • Beam-interrupt counters that try to count the number of legs and divide by two have their issues. Back when we installed them at Rochdale we'd heard the urban myth about what happens when you walk past with a ladder. Having enquiring minds we tried it with a seven-rung stepladder and discovered that there was some truth in the story so long as the ladder was being carried horizontally by two toddlers, so we didn't worry about that too much. We did worry that one toddler with little legs = one interrupt = half a visitor. And that a child running into a library didn't get counted (children run into good libraries because they're exciting). People with walking sticks or wheelchairs seemed to be inconsistently recorded. So the figures were only really useful to us as rough ballpark figures. (There's a small illustrative sketch of the divide-by-two arithmetic just after this list.)
  • Things happen, so there need to be ongoing checks on the reliability of the equipment, the data and the procedures for collection and collation.
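To make the divide-by-two point concrete, here's a minimal, purely illustrative sketch of that counting logic. The arrivals and the interrupt counts are invented for the example, not measurements from Rochdale or anywhere else.

```python
# Purely illustrative: a naive beam-interrupt counter that assumes every
# visitor breaks the beam exactly twice (once per leg).
def naive_visitor_count(interrupts: int) -> int:
    return interrupts // 2  # two interrupts = one visitor, remainder discarded

# Invented door traffic for one morning: (who, beam interrupts registered).
arrivals = [
    ("adult walking in", 2),            # two legs, two interrupts
    ("toddler with little legs", 1),    # only breaks the beam once
    ("child running in", 0),            # too quick to register at all
    ("wheelchair user", 3),             # wheels and footplate counted erratically
    ("adult with a walking stick", 3),  # stick counted as an extra leg
]

total_interrupts = sum(n for _, n in arrivals)
print("Actual visitors: ", len(arrivals))                          # 5
print("Counter reports: ", naive_visitor_count(total_interrupts))  # 4
```

Which is why "rough ballpark figures" is about as far as the confidence in such counts should go.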
If you've got an automated visitor count system in a shared-service building it's a bit more complicated.
  • The visitor count is potentially useful for the operational management of the building but not really for the library service. 
  • Do you include or exclude the people who came in to renew their taxi licence or claim housing benefit or have an appointment with the nurse in the clinic? 
    • If you include them how is this data useful for the management of the library service? 
    • If you exclude them how do you capture the visitors who come in for a taxi licence/HB enquiry/clinical appointment then take advantage of the fact there's a library in the same building? 
  • How do you exclude the people coming in for meetings with other services? 
  • How do you exclude the passage of staff from the other services? 
It gets messy quickly.

And then there are manual counts…

What's wrong with manual counts? Well:
  • There's the timing of them. It's unlikely that you'll have the resources to do the count all day every day. If you have, you'll find a lot of other things for the people doing it to be doing at the same time. (If you *do* have an FTE devoted exclusively to counting visitors you need your bumps feeling.) The data will be a sample. It would be easier to do the sampling during the hours when the library's relatively quiet, but that would invalidate the sample. So some of the counting will inevitably be done amid the stresses of distraction and confusion.

  • Then there are the seasonal variations. And do you want to base your annual count on that week when you've arranged for the workmen to come in to fix the heating? And so on.

    You could take the sensible view that you're not going to extrapolate the figures, you're just going to do a year-on-year comparison of counts conducted the same way at the same point in the calendar every year. Which is useful if the figures are purely for internal use or published as trends rather than absolute data. (There's a small illustrative sketch of this after the list below.)

    If that absolute data's used to compare and contrast with another library authority it becomes a lot less useful as you're comparing apples with pears:
    • The methodology may be different. 
    • There may be good local reasons for differences in the seasonal variation — half-term holidays at schools being obvious examples. 
    • There may be differences in the library calendar — the reading festival may be that week, or it could be the breathing space after last week's reading festival. Back in the nineties our managers were concerned that visitor counts could be distorted by events in the library so, ironically, the two weeks of the visitor count were the only ones with no events in the library, no class visits, no author visits, etc.
  • Then there's the counting. A manual visitor count is easy when the library's quiet. When it's busy you're too busy dealing with your customers to count the visitors. You can try and do a catch-up later but that's always going to be an approximation based on memory and chance observation. And your finger may slip on the clicker you're using for the count. Or you could be one of those people who sometimes have four bars in their five-bar gates.
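Here's the sketch promised above: a minimal illustration, with invented counts, of why a year-on-year comparison of like-for-like sample weeks is defensible while grossing the same figures up into an annual total is not.

```python
# Invented sample-week counts, collected the same way at the same point
# in the calendar each year.
count_2015 = 1_240
count_2016 = 1_116

# Year-on-year trend: like is compared with like, so the quirks of the
# methodology largely cancel out.
change = (count_2016 - count_2015) / count_2015
print(f"Year-on-year change: {change:.1%}")                 # -10.0%

# Extrapolating to an "annual figure" multiplies every quirk of that one
# week (the weather, half-term, the heating engineers) by 52.
print(f"Implied annual visits, 2016: {count_2016 * 52:,}")  # 58,032
```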
So there are issues with the visitor count data.

Why is this data meaningless?

A person has come into the building. And…? The visitor to the building isn't necessarily a user of the library or a customer of the service, any more than the chap who gets off the train at Waverley Station is necessarily a Scotsman.

There were forty visitors to the library:
  • Three people borrowed some books
  • Three people used the computers
  • One came to read the electricity meter
  • One came to sort out the radiator that hasn’t been working
  • One came to deliver a parcel
  • One came to use the loo
  • A drunk came in to make a row and throw his shoe through the window
  • A policeman came in to take a statement about it
  • A building manager came in to inspect the damage
  • A joiner came in to board up the window
  • A glazier came in to replace the window
  • A cat ran in, pursued by:
  • A dog, pursued by:
  • The dog’s owner, pursued by:
  • Three children who followed to see the fun
  • We don’t know what the rest did.
A visitor comes into the library.
  • What did they do? 
  • What did you do? 
  • Was it any good? 
  • Were the right resources available at the right time for the right people?
  • Did the visitor engage with the service at all? 
The visitor count tells you none of this. So how can it be any sort of reflection of the performance of your service? You can't use data about people you don't know have engaged with your service to measure its performance, because the only information you have is about the people; you have none about the service delivery.

Attendance figures are important to sporting venues because attendance generates income. They don't determine the trophies a team collects in the course of a season. Nor do you see schools with banners proclaiming: "According to OFSTED more people walked through our front door than any other primary school in Loamshire!"

Visitor counts are not performance data. Performance is about the delivery of outcomes, not throughputs.