Monday, 12 September 2016

What do visitor counts tell us?

Why did I get into a bate about visitor figures the other day? It's largely because of the spate of recent reports and commentaries about the decline of the public library service based on these numbers. I think there is an over-reliance on what is, after all, pretty rubbish data.

Visitor counts are not measures of use. They are an approximation of — sometimes a wild stab at — the number of people who entered a building. If you were to tell me that visitor counts have declined by 30% over a given period I'd take your word for it. Personal experience and observation suggest that fewer people are in some (not all) of the libraries I visit than there used to be, so you may be right. And the closing of libraries and nibbling away at opening hours over the past quarter of a century won't have helped any. But I'd be extremely sceptical that you had any forensic evidence to back up your percentage.

Does it actually matter that there are fewer visits? If I can sit in my living room and reserve a book, then go and visit the library to pick it up, I have immediately cut down the number of visits by at least 50%. But the library has delivered the same service, and much more conveniently for me. While I'm in the library I can still avail myself of all its other services and indulge in a bit of serendipitous discovery amongst the shelves, but I am not compelled to make an earlier visit with the sole purpose of queuing up at the counter to ask for a reservation to be placed.

Ah but issue figures are going down as well… And? Public libraries never only issued books. Literally an infinitely greater number of people use the public PCs in the library than they did in 1964. Do we have fifty years' worth of attendance figures for story times and author visits? Do we have decades' worth of comparative data of use for quiet study? Or any and all of the other stuff? How do we know that libraries aren't having fewer but richer visits?

But we're delivering less of a service… Are you capable of delivering a better quality of service now? Did you overstretch yourselves in the past and sometimes end up shortchanging your customer service? Were you giving one minute of your time to people who needed five? Were people put off asking for help because they saw that you were busy? High throughput isn't always a measure of high quality.

But visits are down… Do we have annual totals for the number of people who walked through the door, saw the length of the queue at the counter and thought: "I'll come back later?" No, we don't.

Which is why I got in a bit of a bate about it.

Saturday, 10 September 2016

Please can we stop pretending library visitor counts are performance data?

From an operations management perspective there is some use for library visitor count data:
  • It's useful to have an idea of when your peak throughputs occur so that you can deploy resources accordingly.
  • When you're designing new library spaces it's good to have a rough ballpark figure of throughputs for designing in capacity and customer flows.
  • It's good for morale to be able to acknowledge and celebrate when you've safely handled a significant number of visitors.
From a service management perspective there is one use for library visitor count data:
  • It's a salutary reminder of how little you know of your customers if the only available data are about loans and PC sessions.
Otherwise, they're not a lot of use. You see, the thing about visitor counts is that most of them are poor data and from a service delivery perspective all of them are meaningless.

Why are so many visitor counts poor data?

We'll assume that everyone's acting in good faith and nobody's playing the numbers.

If you've got an automated visitor count system and you're running your library as a stand-alone service you have the best chance of having good visitor count data. A lot depends on the way the count is done; the positioning of the counter; and the frequency of data sampling and verification.
  • A badly-placed counter can miss visitors — a counter mounted at adult head height could miss all the children, for instance, or the customer flows could by-pass the counter completely. 
  • Beam-interrupt counters that try to count the number of legs and divide by two have their issues. Back when we installed them at Rochdale we'd heard the urban myth about what happens when you walk past with a ladder. Having enquiring minds we tried it with a seven-rung stepladder and discovered that there was some truth in the story so long as the ladder was being carried horizontally by two toddlers, so we didn't worry about that too much. We did worry that one toddler with little legs = one interrupt = half a visitor. And that a child running into a library didn't get counted (children run into good libraries because they're exciting). People with walking sticks or wheelchairs seemed to be inconsistently recorded. So the figures were only really useful to us as rough ballpark figures.
  • Things happen, so there need to be ongoing checks on the reliability of the equipment, the data and the procedures for collection and collation.
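The counting errors described above can be sketched with a toy model. Everything here is invented for illustration; the gait categories and per-visitor interrupt counts are assumptions, not calibration data from any real counter:

```python
def estimate_visitors(interrupts: int) -> int:
    """The classic beam-interrupt estimate: two legs per visitor."""
    return interrupts // 2

# Invented interrupt counts per visitor type, for illustration only.
INTERRUPTS_BY_GAIT = {
    "adult_walking": 2,    # two legs, counted as intended
    "toddler": 1,          # little legs = one interrupt = half a visitor
    "running_child": 0,    # too quick for the sensor to register
    "wheelchair_user": 3,  # wheels and footrests trip the beam unevenly
}

def simulate(visitors: list[str]) -> tuple[int, int]:
    """Return (true visitor count, the counter's estimate)."""
    interrupts = sum(INTERRUPTS_BY_GAIT[v] for v in visitors)
    return len(visitors), estimate_visitors(interrupts)

# A morning's worth of invented footfall.
morning = (["adult_walking"] * 40 + ["toddler"] * 10
           + ["running_child"] * 5 + ["wheelchair_user"] * 2)
true_count, estimate = simulate(morning)
print(f"{true_count} people came in; the counter says {estimate}")
```

Even with made-up numbers the point holds: the estimate drifts from the true count in ways you can't correct for afterwards, which is why such figures are only useful as rough ballparks.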
If you've got an automated visitor count system in a shared-service building it's a bit more complicated.
  • The visitor count is potentially useful for the operational management of the building but not really for the library service. 
  • Do you include or exclude the people who came in to renew their taxi licence or claim housing benefit or have an appointment with the nurse in the clinic? 
    • If you include them how is this data useful for the management of the library service? 
    • If you exclude them how do you capture the visitors who come in for a taxi licence/HB enquiry/clinical appointment then take advantage of the fact there's a library in the same building? 
  • How do you exclude the people coming in for meetings with other services? 
  • How do you exclude the passage of staff from the other services? 
It gets messy quickly.

And then there are manual counts…

What's wrong with manual counts? Well:
  • There's the timing of them. It's unlikely that you'll have the resources to do the count all day every day. If you have, you'll find a lot of other things for them to be doing at the same time. (If you *do* have an FTE devoted exclusively to counting visitors you need your bumps feeling.) The data will be a sample. It would be easier to do the sampling during the hours when the library's relatively quiet, but that would invalidate the sample. So there will inevitably be stresses of distraction and confusion.

  • Then there are the seasonal variations. And do you want to base your annual count on that week when you've arranged for the workmen to come in to fix the heating? And so on.

    You could take the sensible view that you're not going to extrapolate the figures, you're just going to do a year-on-year comparison of counts conducted the same way at the same point in the calendar every year. Which is useful if the figures are purely for internal use or published as trends rather than absolute data.

    If that absolute data's used to compare and contrast with another library authority it becomes a lot less useful as you're comparing apples with pears:
    • The methodology may be different. 
    • There may be good local reasons for differences in the seasonal variation — half term holidays at schools being obvious examples. 
    • There may be differences in the library calendar — the reading festival may be that week, or it could be the breathing space after last week's reading festival. Back in the nineties our managers were concerned that visitor counts could be distorted by events in the library so, ironically, the two weeks of the visitor count were the only ones with no events in the library, no class visits, no author visits, etc.
  • Then there's the counting. A manual visitor count is easy when the library's quiet. When it's busy you're too busy dealing with your customers to count the visitors. You can try and do a catch-up later but that's always going to be an approximation based on memory and chance observation. And your finger may slip on the clicker you're using for the count. Or you could be one of those people who sometimes have four bars in their five-bar gates.
So there are issues with the visitor count data.
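The year-on-year comparison suggested above can be sketched as follows; the sample figures are invented for illustration:

```python
# Invented samples: a count conducted the same way in the same
# calendar week (here, week 37) each year.
week_37_counts = {2013: 1240, 2014: 1180, 2015: 1150, 2016: 1090}

def year_on_year_change(counts: dict[int, int]) -> dict[int, float]:
    """Percentage change against the previous year's same-week sample."""
    years = sorted(counts)
    return {
        year: round(100 * (counts[year] - counts[prev]) / counts[prev], 1)
        for prev, year in zip(years, years[1:])
    }

trend = year_on_year_change(week_37_counts)
print(trend)
```

Published as a trend like this, the figures say something defensible about direction of travel without pretending the samples are absolute annual totals.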

Why is this data meaningless?

A person has come into the building. And…? The visitor to the building isn't necessarily a user of the library or a customer of the service, any more than the chap who gets off the train at Waverley Station is necessarily a Scotsman.

There were forty visitors to the library
  • Three people borrowed some books
  • Three people used the computers
  • One came to read the electricity meter
  • One came to sort out the radiator that hasn’t been working
  • One came to deliver a parcel
  • One came to use the loo
  • A drunk came in to make a row and throw his shoe through the window
  • A policeman came in to take a statement about it
  • A building manager came in to inspect the damage
  • A joiner came in to board up the window
  • A glazier came in to replace the window
  • A cat ran in, pursued by:
  • A dog, pursued by:
  • The dog’s owner, pursued by:
  • Three children who followed to see the fun
  • We don’t know what the rest did.
A visitor comes into the library.
  • What did they do? 
  • What did you do? 
  • Was it any good? 
  • Were the right resources available at the right time for the right people? 
  • Did the visitor engage with the service at all? 
The visitor count tells you none of this. So how can it be any sort of reflection of the performance of your service? You can't use data about people you don't know have engaged with your service to measure its performance: the only information you have is about the people; you don't have any about the service delivery.

Attendance figures are important to sporting venues because attendance generates income. They don't determine the trophies that teams collect in the course of a season. Nor do you see schools with banners proclaiming: "According to OFSTED more people walked through our front door than any other primary school in Loamshire!"

Visitor counts are not performance data. Performance is about the delivery of outcomes, not throughputs.

Wednesday, 17 August 2016

How many?

I've started doing some work with the Libraries Taskforce. I'd been to one of their workshops and it was pretty apparent that there was a lot of work needing doing by the fewer than a handful of people involved, and it wasn't easy to see how they'd be able to do it on their own. I've got some time now that I've retired from Rochdale Council so I asked them if they needed a hand with anything and they said yes please. So I'm lending a hand with the work strand that's hoping to develop a core data set for English public libraries that can be openly available for both public use and operational analysis. It's a voluntary effort on my part; it's something I'm interested in and have been impatient about and it's a piece of work that should have some very useful outcomes.

Whenever you start talking about English public libraries data the elephant in the room very quickly makes its presence known. Before we can talk credibly about anything very much there is one inescapable question desperately needing an answer:
Just how many English public libraries are there anyway?
There is no definitive answer. There is no definitive list. There are at least half a dozen well-founded, properly researched lists. They each give a different answer and when you start comparing them you find differences in the detail. There are perfectly valid reasons for this:
  • Each had been devised and researched for its own purposes without reference to what had gone before. Each started from scratch and each had a differently-patchy response from library authorities when questionnaires were posted.
  • This data's not easy to keep up to date at a national level — especially these days! So some libraries will have closed, a few will have opened, some will have moved and some will have been renamed. 
  • It wasn't always clear just how old the lists were. Some had been compiled as part of some wider project and there wouldn't necessarily have been the resource available to do any updating anyway.
So the decision was made to tackle this head on, settle it once and for all and let the world move on. They were a few weeks into this work when I signed on. Very broadly, here's the process:
  • Julia from the Taskforce, who has infinitely more patience than me, trawled every English local authority's web site for the details of their public libraries.
  • Between us we scoured the other lists and added any libraries we found in there that we couldn't find in Julia's list.
  • We then went through this amended list to see if we could identify any points of confusion, for instance where "Trumpton Central Library" has moved from one place to another or where "Greendale Library" has become "The Mrs Goggins Memorial Information and Learning Hub."
  • The Taskforce has sent each library authority a list of what we think are their libraries asking them to check to see whether or not these details are correct.
  • The results will be collated and the data published by the Taskforce.
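The reconciliation step above can be sketched roughly like this. It's a minimal sketch, assuming a postcode field as the matching key; the field names and sample records are invented for illustration, not the Taskforce's actual schema:

```python
def normalise(name: str) -> str:
    """Crude name normalisation so minor variations compare equal."""
    return " ".join(name.lower().split())

def merge_lists(base: dict, other: dict) -> tuple[dict, list]:
    """Add libraries from `other` that are missing from `base`; flag
    pairs that share a postcode but not a name as possible renames."""
    merged = dict(base)
    possible_renames = []
    by_postcode = {rec["postcode"]: name for name, rec in base.items()}
    for name, rec in other.items():
        if name in merged:
            continue  # already on the list under the same name
        if rec["postcode"] in by_postcode:
            # Same site, different name: one to query with the authority.
            possible_renames.append((by_postcode[rec["postcode"]], name))
        else:
            merged[name] = rec  # genuinely new to the list
    return merged, possible_renames

base = {normalise("Greendale Library"): {"postcode": "CA1 1AA"}}
other = {normalise("The Mrs Goggins Memorial Information and Learning Hub"):
         {"postcode": "CA1 1AA"}}
merged, possible_renames = merge_lists(base, other)
print(possible_renames)
```

The useful output isn't the merged list so much as the discrepancy list: those are the records that go back to the library authority as "is this the same library or not?"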
Ten years ago this would have been pretty straightforward. These days the picture is complicated by the various forms of "community library" that have sprung up over the past few years. These run the gamut from "this library is part of the statutory provision though it is staffed by volunteers some of the time" all the way to "we wish them well on their venture but they're nothing to do with us." So where a public library has become a "community library" of one sort or another that needs to be indicated in the data.

Will this list be 100% correct? Probably not at first; this is a human venture after all. But even if it's only 98% correct in the first instance it should be treated as the definite article. It will then need to be corrected and updated as a matter of course; if that's devolved to the individual library authorities the work becomes manageable and the data becomes authoritative.

Why should anyone bother?

What's in it for anyone to keep their bit of this list up to date and its details correct? In my opinion:
  • It's basic information that should as a matter of principle be available to the public.
  • In the past year alone, this question has tied up time and effort that could have been more usefully occupied. All those enquiries, and FoI requests, and debates about data that could just be openly available and signposted whenever the question arose.
  • It is essential to the credibility of any English public library statistics. If the number of libraries is suspect then how trustworthy are any of the statistics being bandied around? If the simplest quantitative evidence — the number of libraries — is iffy then how much faith can be placed in quantitative or qualitative evidence that's more exacting to collect?

    For instance, counting the number of libraries within a local authority boundary if you're responsible for supporting or managing them is a piece of piss. Reliably counting the number of visitors to any one of those libraries most definitely isn't — I have 80% confidence in the numbers coming out of any automated system (not necessarily due to technical issues) and to my mind if you're relying on manual counts you may as well be burning chicken feathers. So when I hear that visits to English public libraries have dropped by a significant percentage over a given number of years I may be prepared to accept this in the light of a wider narrative, personal observation and anecdotal evidence but I have no empirical reason to know that this is the case. 
That's why.

Sunday, 24 July 2016

Visible security

A long, long time ago, back when we first put the People's Network into Rochdale's libraries, I got worried about the physical security of all the kit we were putting out there. A few months before we'd had a break-in at Langley Library (back when it was in a stand-alone building next to the bus stop): somebody had drilled through a wall panel into the stock room and pinched a couple of the "homework" PCs we had in the children's library. So I worried about it a bit.

To my mind there are two types of security:
  • Rendering something for all intents and purposes non-existent to anybody who isn't authorised to know about it.
  • Something in your face that says: "We both know there's something here but it would be a pain in the arse for you to try and have away with it."
In a library context, where we wanted everyone to know there were PCs for public use, invisibility wasn't an option, so we needed something conspicuous to put off the scallies and sneak thieves.

Somebody, I can't remember who, pointed me in the direction of a chap called Peter Radcliffe, trading locally as Nexus Computers who sold computer safes. At the time Peter was selling metal computer safes. There was nothing subtle about these: they were made of plate metal and the fittings used to fix them to the furniture were uncompromisingly industrial. They did the job brilliantly. Only one thing bothered me.

Back in those days computers came in three colours: "white" (beige), grey (beige) and beige (grey). Dead boring. About this time Rochdale was in the early stages of a complete refurbishment of nearly all its libraries. The aim was to make them lighter, brighter and more colourful and I wasn't much keen on installing a lot of battleship grey boxes into this bright new landscape. So I asked Peter if they were available in any other colours. He came back with a paint sheet from the metal-bashing shop he was working with.

"I'll have that," I said.
"Are you sure?"
"Yes. In gloss finish."
"You're mental!"
"Can you do it?"
"The customer is always right. We can do it if you really want it. Are you really sure? Really?"
"Yes, please."

The colour? It was

  Signal Violet

[Photo: a bank of PCs at Alkrington Library]

And so it came to be.

I got a bit of stick about it. Not least because I hadn't spent any time whatsoever consulting anyone about it. (Ordinarily I'd accept a good shin-kicking for that, but I only had four months to do a complete implementation of the People's Network from scratch across the whole borough, and we had just had the one and only planning meeting, where we had spent three hours watching a debate about the position of a chair in a particular library.) But I stuck to my guns. Still do, in fact:
  • This was aggressively-visible security.
  • It fitted in with the bright, colourful feel of the libraries.
  • They were an easy visual cue. We hadn't consciously gone in for any branding at that stage but it provided a consistent, obvious "Here be computers" message to the libraries' customers.
If I had my time again would I make that same decision? Dead right I would. And Peter still thinks I was barmy.

Wednesday, 20 July 2016

Archaeology: Library staff training ideas

I was digging round on an old laptop looking for a couple of photos when I bumped into this. Back in 2007 I was the library service's union rep, amongst my very many other sins. At that time Lifelong Learning UK (LLUK) was looking at the options for public library staff training. One of the stakeholders being consulted was our union, Unison, and the consultation documents were passed along to local reps. After a few conversations with members in Rochdale's libraries this is the response they agreed that I should send back. We live in a different world now but even in these austere times I think there would be scope for adopting some of this. I shan't be holding my breath, mind.

LLUK consultation on public library training

We feel that this is potentially a useful opportunity for addressing the training and development needs of public library staff while they try to provide relevant and appropriate services to their customers in a fast-changing world. There are, however, three issues which we feel could compromise, or even completely de-rail, the outcomes of this exercise if they are not addressed:

(1) Currently the only reward system for excellent public librarians is for them to move into generic management. This is perverse as:

  • An excellent librarian is not necessarily a good generic manager;
  • If the United Kingdom is serious about wanting to be a player in the Knowledge Economy then it cannot afford to waste the community-level knowledge management, information literacy and reading development skill sets of the public librarian. Why go to the trouble of sending somebody to library school if they're going to spend all day doing sickness returns and reporting building repairs?

CILIP's current position on this is unfortunate: it is very keen to promote librarians as professionals but this is the only profession where personal advancement is predicated on the abandoning of professional practice.

(2) The limited opportunities for career development of public librarians effectively impose a glass ceiling on the career development of non-librarians in the public library sector. This makes the recruitment and retention of staff increasingly difficult. There is another undesirable effect: there are excellent managers in all sectors of the economy who are not librarians; if the staff resources of the public library sector include non-librarians with the potential to be excellent managers we should want to develop and keep their skills and abilities rather than stifle or lose them. There needs to be a career development path for non-librarians that rewards the excellent work that many of them do in the public library sector and provides the sector with the opportunity to use their talents.

(3) Public libraries are very front-line-focused, often to the detriment of support and training activities. There is no point in there being an excellent training and development package in place if staff do not have the chance to take up the opportunity. As we saw with the NOF-funded training, the need to cover the needs of the service can turn a major opportunity into a worry for managers and a source of resentment for front-line staff. This is especially true in authorities like Rochdale where scant staff are thinly spread over many service points. If staff can't take up the development opportunities then the exercise becomes at best an irrelevance and at worst a bad joke and an impediment to good staff relations.

It is also worth flagging up a potential abuse of the outcomes of this exercise: there is the danger that any qualification will become another recruitment hurdle rather than a career development opportunity. We already see nationwide that an essential requirement for ECDL is being used as the lazy recruiter's shorthand for "must be able to demonstrate computer literacy and we don't want the bother of working out how to define our needs in any measurable way, nor do we want to provide ECDL training for our staff." ECDL has become the modern equivalent of "must have six O-levels." Anyone with a degree in computing but no ECDL need not apply. One of our members attended a national event on the future of public libraries and was shocked that a workshop on staff training needs became a bunch of chief librarians moaning that not enough people with ECDL apply for jobs in libraries. It is important that qualifications should be recognised as proof of some degree of attainment but it is also important to recognise that there is more than one way to demonstrate most skill sets.

Given LLUK's brief it is unreasonable to expect that it should be able to tackle the root and branch structural issues involved in (1) and (2), especially given CILIP's current support of the status quo. However, we do feel that LLUK can usefully call for the need to retain and reward the public librarian's skill set at a community level. LLUK can also usefully acknowledge the importance of, and seek to exploit, the skill sets, experience and dedication of all public library staff in the informal learning economy. We also feel that any training and development outcomes should reflect the needs of the public library service, not the particular management structures.

Starting from where we are the challenge is to provide career development opportunities for all library staff without making librarians feel that their qualifications are worthless. Luckily, in one key area the obvious training path does precisely that. 

We propose that LLUK put together a qualification for Public Library Management, covering three core elements:

  • The public library service: the philosophy and aims of the service; good practice models for the delivery of Information Literacy, Reader Development and Cultural Identity at strategic and community levels; the statutory basis for the services we provide (not just the Public Libraries Act!); the government-level matrix management of the public library service, including the challenges and opportunities in the statutory bases of every governmental department; national and international public library NGOs; public library performance measurement; and good practice models for cross-sectoral working with academic organisations and with the voluntary and private sectors.
  • Public administration: local government organisation, philosophy and finance; working with elected members; statutory controls; good practice models for interdepartmental working; local government performance management and development; and effective (and appropriate) lobbying.
  • Management skills: project management; personnel management and development; financial planning and control; performance measurement; resource procurement; writing business plans, funding bids and reports; and good practice models for strategic and tactical planning, operational management and delivery.

This qualification would provide a good grounding for people wishing to move into the management of public library services. The advantage to the librarian would be that they will already have covered parts of this curriculum in their library training and so will start the course with their feet a couple of rungs up the ladder. The advantage to the non-librarian would be that there would be a ladder in the first place. The advantage to the service and, importantly, its customers would be that the people running the service will have had the opportunity for a more robust foundation to their skill set than is often currently the case.

We also suggest that LLUK look at the provision of courses and qualifications addressing front-line skills such as:

  • Customer care, including dealing with enquiries effectively; applying appropriate limits to delivery (not getting out of your depth and how to tell a customer that there are some things the library can't, or won't, do and they can't have everything they want when and how they want it in this life); assertiveness skills; and presentational skills.
  • Activity and event management, including small-scale project planning and management; marketing and promotion; health and safety; addressing the audience; keeping it practical; delivering the event; and reporting back to interested parties.
  • Community development, including how to work with community groups; promoting and providing the library as a venue; and promoting and providing library services outside the library.
  • Computer skills, including training in the use of computers in a public library context (contra the standard ECDL training, where people who are working day in, day out with sophisticated databases in their library management systems are told that they don't know about databases if they don't know how to build a flat table in MS Access); how to support public use of computers; e-learning and e-government taxonomies and metadata standards; Web 2.0 tools; etc.

It should be noted that a lot of existing courses already cover these areas. Unfortunately they are provided by very many different organisations and at irregular intervals and it can be a full-time job just tracking down what's available. It is also unfortunate that so many of the courses that are available are provided only at London venues. The cost of overnight stays or exorbitant rail fares is a barrier to the take-up of those courses. It would be useful if some sort of co-ordinated pattern of training held at regional venues could be established.

Making staff available for training is a major issue. One solution would be for DCMS or MLA to fund additional staff cover for this purpose. Experience would suggest that this would be both inadequate and unsustained. An alternative would be for one of the CPA-related public library performance indicators to be a measure of the training time per capita of staff, with due safeguards for what constituted "training time." This would give local authorities a financial incentive to provide training cover for library staff. By happy accident it would also give library managers an argument against freezing or cutting front-line vacancies for budgetary purposes. Or else we could carry on as we are, in which case LLUK is wasting its time.

Wednesday, 8 June 2016

Learning from experience

When you're doing any sort of serious developmental work, whether it's solving a problem or exploiting a new opportunity, you have to start by defining your preferred outcome then look at all the possible options for getting you there. One of the things that constantly dismayed me when working in public libraries was just how limited "all the possible" usually was, and that these limitations were often imposed as a matter of principle rather than practice. Often summed up with a look of complacent contentment and the mantra: "Ah well, you see, they don't work the same as libraries." I still bump into this thinking quite regularly on the lists and social media, often as a dogmatic statement along the lines of "Libraries have got nothing to learn from…" This is, of course, the veriest nonsense: libraries can learn a great deal from other business operations, even if it's only the reasons why adopting a particular policy or process would be a bad idea in a public library context (and those reasons aren't "Ah well, you see, they don't work the same as libraries.")

The other reason why it is nonsense is that although a business operation may be very different to yours, some of the operational functionality requirements may be very similar, or the same. There's no particular philosophical or ethical difference between opening the public doors to a library first thing in the morning and opening the public doors of a shop, for instance.

The exciting thing about not imposing any constraints on where ideas can come from is that you can often find alternatives to the organisation's "default setting" that are cheaper and more effective to apply.

The responsibilities in my last job included a transport management system called Tranman. Very generally speaking it was used to manage the council's fleet of vehicles and the jobs done in the vehicle workshop. Some of the jobs done in the workshop were part of the routine maintenance work included in the service level agreements with the council departments using the vehicles. A lot weren't: repairs necessitated by accidents or driver abuse were chargeable extras and the workshop also did a lot of private repair work, services and MOTs as part of the operation's income generation. Extremely different to the business operation and purpose of the public libraries I was also supporting.

They had a problem. A lot of the jobs were taking ages to complete and invoice on the system because they were missing information about the parts that were issued to them. Even worse, some jobs had been completed and invoiced without including the rechargeable costs of the parts. This was playing hob with their cash flow and making the accountants cross. So we had a look at it and basically the situation was:
  • There was a manual process involving somebody trying to catch up with a pile of paperwork at the end of the month
  • There was a technical solution which could automate the stock issue but wasn't being used
What was needed was for the parts to be issued to the jobs in as close to real time as possible and that wasn't going to be done manually from the paperwork.

We had a long, hard look at the technical solution. None of us were big fans. It was hard work to get set up and running and, frankly, was over-engineered for its purpose. On paper it looked great:
  • A piece of software on the stores manager's PC generated barcoded labels for each part, the barcode including the appropriate part number and bin, together with a human-readable description.
  • The parts bins were labelled up accordingly.
  • When a part was to be issued to a job the stores man entered the job number into a PDA, read the appropriate barcode and entered the number of items being issued.
  • The PDA was then plugged into a console attached to the stores manager's PC and the data uploaded into Tranman.
In reality it wasn't so hot:
  • The barcoded labels soon got scruffy and unusable.
  • We could make the barcodes on the labels as big as we wanted but the descriptive text was always tiny and difficult to read, especially in some of the darker corners of the store room.
  • The keys on the PDA were very small (6mm across), which was difficult enough for me with my never-done-a-stroke-of-hard-labour-in-his-life delicate fingers and not a lot of fun for workers who'd spent years bashing their finger ends on bits of metal on cold winter days. The incidence of input error was high.
  • Uploading the data was a bit hit and miss at first: it took quite a while for us to iron out the problems and make sure the solutions were properly packaged into the virtual application. (I for one was praying nothing happened to that PC as I wasn't confident we wouldn't have at least a few of these problems again with a new deployment of the software.)
  • It wasn't easy to see where and when something went wrong. The lack of transparency meant that you couldn't do any quality control to make sure that the right parts actually were being issued to the right job.
Something had to be done. 

We looked at various somethings: we spent far too long trying to sort out better barcode labels; then we had a fruitless quest for a more user-friendly PDA that could do the job. Over the next few months, in between a pile of other pieces of work we were doing with this system, we tweaked this process as best we could but in the end it felt more like we were working through levels in an arcade game rather than getting anywhere with improving the process and addressing the cash flow problem. By this stage I was thoroughly fed up and went and had a sulk in my tent. On reflection it was apparent that all we had been doing was following the same ploughed rut over and over again. 

We'd let our problem-solving be dictated by the "solution" rather than either the problem or the desired outcome.

So I looked again. And wondered why the issuing of parts to a job in a vehicle workshop should be so very different to the issuing of library books to a borrower…

The solution I came up with was cheap and not especially elegant, but it did the job and could be seen to do the job. There were two elements:
  • A standard barcode reader, same as you'd see in a shop or library
  • A folder containing sheets of labels including all the information needing to be entered into the job record:
    • The description of the part in a large, readable font
    • The part number as a barcode (using the "3 of 9" font)
    • The bin number as a barcode
    • The store location as a barcode
The process was:
  • The request comes in for the part for the job
  • The stores man gets the part then opens the job record in Tranman
  • The part number, bin number and location are input by reading the barcodes from the appropriate sheet
  • The only manual input is the number of parts
  • The parts are issued to the workshop
For out-of-hours working the process was:
  • The request for the part is recorded on a paper job sheet
  • The fitter working on the job gets the part
  • Next working day the stores man issues the part in Tranman as above
Getting a working test sheet turned out to be a bit of a problem because initially I was doing it in MS Word and the formatting it imposed was stripping out the begin and end codes of the barcode (an asterisk: *). Jacking it in and doing the job in WordPad gave us some test sheets we could use to demonstrate proof of concept. I used Crystal Reports for the final print out of all the parts.
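The asterisk problem is easy to sketch: in a "3 of 9" (Code 39) barcode font the asterisk is the start/stop character, so every value on a label has to be wrapped in asterisks before it will scan, and that is exactly what Word's autoformatting was removing. A minimal illustration in Python (the part and bin numbers are hypothetical):

```python
# Preparing text for rendering in a Code 39 ("3 of 9") barcode font.
# The font draws each character as bars; the asterisk is the start/stop
# character, so every value must be wrapped in asterisks to scan.

# The Code 39 character set: digits, upper-case letters and a few symbols.
CODE39_CHARS = set("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ-. $/+%")

def code39_wrap(value: str) -> str:
    """Return the string to render in a 3-of-9 font: *VALUE*."""
    value = value.upper()
    for ch in value:
        if ch not in CODE39_CHARS:
            raise ValueError(f"{ch!r} cannot be encoded in Code 39")
    return f"*{value}*"

# Hypothetical part and bin numbers for illustration:
print(code39_wrap("PT-10442"))  # *PT-10442*
print(code39_wrap("bin-07"))    # *BIN-07*
```

Render the wrapped string in the barcode font and any standard scanner will read back the bare value; lose the asterisks and the label is just decorative stripes.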

Sadly, this is as far as I got with it by the time I was leaving. We'd checked that each step should work as expected but we'd not given the process much hammer and inevitably there'd be some snagging to do. One thing I'd have liked to have done, given the time, would be to modify the Crystal Report so that the first few pages were limited to the high-volume items, to save the stores men having to routinely wade through so many pages.

So that's how I applied a bit of standard public library functionality to a problem in a completely other environment.

So what are the takeaways from this?
  • Don't mistake differences in business operation for differences in operational functionality. In this case, issuing an item in real time is issuing an item in real time.
  • If your problem-solving energies are being devoted to the solution perhaps you've got the wrong solution.
  • To solve a problem you have to know what problem you're solving and the outcome you desire of the solution.

Saturday, 28 May 2016

A rose is a rose is a rose…

a sunny day in Rochdale

On Tuesday afternoon me and Rochdale Borough Council go our separate ways. Strangely enough, of all the things I should be thinking about the one I keep coming back to is: "How should I describe myself on LinkedIn?"

For a while (at least the four weeks I've promised myself) I'm not going to be in a job or have a job title. "Resting," while dead accurate, doesn't really cut it, so I'm having a think. Besides which, I've long since been irritated by the way we all define ourselves by our jobs: when we're introduced to each other it's with the routine "And what do you do?" We never boast of the many very splendid things we're capable of: we give a job title or rôle. That's a pity. Even in an employment context this tells so very little of us. So I decided to pick up the "And what do you do?" question and find a different answer.

Casting round amongst family and friends wasn't helpful, though I'll admit to being quite taken with "Provender of finely-wrought artisan drivel" and "Keen amateur idle beggar."

Thinking about my own experiences and the rôles I've assumed or had thrust upon me there are a few consistent strands:
  • Identifying and documenting operational processes and procedures
  • Helping lines of business find new ways of doing things that add value to the operation
  • Helping operations and people accept and adapt to change
  • Identifying and delivering ways of measuring continuous service improvement
Bearing this in mind I bounced a lot of descriptive phrases round in my head. Many sounded like the worst kinds of Management Speak. Many were either too vague or too narrow. A lot of them felt too much like I was parking my bike on somebody else's lawn. It boiled down to two in the end. And I'm still havering between them: one feels vague and one feels like I'm trespassing. One is the working assumption of many of the people I've said cheerio to over the past month. The other is the rôle I've taken on repeatedly over the past twenty-seven years and the one I've tended to feel has been the most productive and worthwhile.

I've a few more days for havering. Who knows, it might end up being Keen Amateur Idle Beggar after all!

Tuesday, 17 May 2016

Taskforce workshop: assessments

In Friday's workshop a few of us got talking about the idea of "voluntary assessments" raised in the Ambitions document. These weren't really defined so it was difficult to know quite what the best approach to the topic would be.

There was some concern that any assessment would just be another stick to beat libraries with, like issue figures and visitor counts have become. Equally, would they become another set of targets to be gamed? Both are extremely valid concerns.

I think it's essential that public libraries have a solid suite of KPIs — not for comparison with so-called "peers" in a way that takes no heed of communities or contexts but for comparison with past performance and identifying strengths and weaknesses in the operation. But the "voluntary assessments" *shouldn't* be about KPIs: there are a couple of other useful functions they could perform.

One way would be to direct the stick upwards, towards the DCMS. It's vanishingly unlikely to happen but it could be that one of the Department's own performance narratives, published in its annual report, could include an assessment of the health of the national public library service derived from local returns. 
  • I don't like expenditure as a measure of performance (any bloody fool can spend money) but given that the audience for such a thing would be a political one that largely measures achievement by expenditure, one of the assessment measures could be expenditure per capita, which could be broken down into: investment in buildings; investment in skilled staff; investment in stock; and investment in community activities. (Did you spot the gear-shift there?)
  • DCMS would — finally! — have to be able to report a definitive number of public libraries in the country, and any changes and trends. 
  • It would be interesting to see a national picture of:
    • The number of staffed library open hours
    • The number of library open hours manned by volunteers
    • The number of unsupervised library open hours
    • The number of "daytime" (9am-5pm) hours libraries are closed and left fallow
  • And so on. I won't go into more detail because it's so unlikely to happen it's hardly worth the wishing for.
Another, possibly more plausible, function would use the assessment as a feedback mechanism for continuous service improvement (absolutely not a set of targets!). They would be used to evaluate rather than monitor the performance of the service and help direct local decision-making. 
  • The assessment criteria could be deliberately aspirational and impossible to achieve: the assessment would evaluate the direction of travel of the service and the impact on resources, staff and the needs of the communities that the service serves. 
  • Comparisons would be with past performance rather than against "peers," which would remove any unwelcome competitive friction between organisations that should be working collaboratively.
  • This would also mean that it would be harder to game the figures and it would be harder to coast on past glories as assessments would be reporting the direction of travel towards the impossible goal rather than the successful negotiation of an arbitrary obstacle.
  • For example, one of the "impossible goals" could be 100% of the local population's being active members of the library (however that would be defined). Last year Library Service A could have attained 56% active membership and this year 58% while Library Service B managed 69% and 60%. Crude peer-to-peer analysis would suggest that Library Service B is performing better but A is actually making a better fist of continuous service improvement — it's the rate and direction of performance, not the absolute figures, that are the measure of how the service is being managed and resourced; they'll all have different baseline starting points.
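The arithmetic behind that example can be sketched in a few lines of Python, using the hypothetical figures for Library Services A and B above. Absolute membership this year favours B; the direction of travel favours A:

```python
# Direction-of-travel evaluation: compare each service with its own past
# performance rather than with its "peers."

def direction_of_travel(last_year: float, this_year: float) -> float:
    """Percentage-point change in active membership, year on year."""
    return this_year - last_year

# (last year %, this year %) active membership, hypothetical figures
services = {"A": (56, 58), "B": (69, 60)}

for name, (last, this) in services.items():
    change = direction_of_travel(last, this)
    print(f"Service {name}: {this}% active, change {change:+}pp")
# Service A: 58% active, change +2pp
# Service B: 60% active, change -9pp
```

Ranking on the second column would put B top; ranking on the third, which is what the assessment would actually be evaluating, puts A top.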
There is a problem with this idea: while it works well in many organisations and is pretty standard performance management fare, I don't think our political environment is adult enough not to try and turn this into a badly-fitting set of targets and league tables. It would be nice to be shown to be wrong, though.

Sunday, 15 May 2016

Having a say

I went to Friday's Library Taskforce workshop in Manchester Central Library. The aim of the workshop was to get feedback on the Ambitions document and suggest practical detail that should go into the action plan for it. I was interested so I signed up and was accepted in my rôle of "interested member of the public." It was an interesting few hours; here are a few thoughts on it.

Something that concerned a few of us was the rôle of the Taskforce itself. You don't have to go very far into discussions about English public libraries before you'll hear somebody say: "The Taskforce should do…" And that's a problem. The Taskforce is no magic wand: it has a finite life and actually it's just that scant handful of people who were going round sticking bits of paper on the wall in the library on Friday. It has the same rôle as a business analyst: it can facilitate and stimulate discussion and it can identify work to be done, but it's down to other agencies to get on and get things done.

A recurring theme in the workshop was the need for a skilled workforce to deliver library services. A decade ago this would have been couched in terms of CPD for "professionals"; here the conversation concerned all library staff. And not just the "we need people with digital skills" thing: there was a recognition that there's a swathe of skills that need developing and supporting, including having properly-trained library managers who can manage operations, services, programmes and projects. This is a welcome contrast to the ongoing narrative of staff losses, closures and farming out the work to volunteers. It's good that the Taskforce is talking up the need for a skilled staff. Translating that into action will be a stern challenge!

Another recurring theme was the tension between local authority control and opportunities for cost-sharing and delivering services to communities that aren't defined by geopolitical boundaries. Even when we look at delivering "bigger" we still seem to be bound by lines on ancient maps: local metropolitan authorities don't often think of working across the Pennine border.

Evidently there'd already been some feedback about the thinness of the digital strand of the document: there were a few more ideas pinned to the wall for us to respond to than I'd been expecting. This suggests that somebody's listening. Actually, I've been consistently getting the impression that the Taskforce members are listening to people. And they're trying to have conversations with people aside from chiefs and politicians. I think the workshops could have been better publicised away from the usual channels to try and get more library customers into the mix because I think it would have proved very useful.

A few of us thought that "income generation" is a tricky topic. It's important that libraries try to maximise their income (good services don't pay their bills in daydreams) but the phrase "income generation" is unfortunately a bit loaded. It's important that libraries stay that safe, trusted place where nobody's asking you what's in your wallet and that shouldn't be compromised. And whenever libraries are told to be more commercially-minded it's conveniently forgotten that local authorities don't work in a commercial environment. The twelve-month budget lifetime isn't anything any commercial operation would be able to live with: the mad March "spend it or lose it" would be ludicrous in that context. There was a lot said about that!

The other problem, of course, is that as soon as you say that you're going to generate some income some clever body will impose an income target. Income targets work this way: you make a business case for spending £100 to set up a business selling apples from a barrow; you're told: great, go ahead and by the way we're going to assume that you're going to make a £50 profit so we'll take that off you now; and you're left with enough money to buy either the barrow or the apples but not both and your business plan fails. There's no real money to be had selling off odd scraps of thing or hiring out rooms or buildings. There is potentially funding out there for programmes, projects and activities that are entirely compatible with the traditional aims and purposes of public libraries and I think this is what is meant by "income generation" in this context. I think it would be more useful for the Taskforce to be talking about "maximising the take-up of funding opportunities" instead.

We've until 3rd June to contribute to the discussion. If you haven't already, please do so.

Wednesday, 27 April 2016

Ambitions 1998

Another bit of archaeology. In 1998 — back when we had only five libraries networked and on the library management system and before the reality of The People's Network — I was asked why the library service didn't have an IT strategy so I drafted one. This is some of the working-out.

Rochdale Library Service
Information/Communications Strategy — suggested bullet points

1         Factors informing the strategy
a)       Operational needs
i)        Lending
·         Stock management & control
  Making stock work harder
  Audit requirements
·         Borrower management & control
  Accessibility to library services across the Borough
  Fines/charges control
  Market analysis
·         Reservations management & control
  Reaching performance targets
  Managing costs of interlibrary loans
ii)      Reference
·         Increasing access to electronic information resources
  Access to new forms of material
  Access to wider ranges of material
  Making existing materials more widely available
  Increasing the accessibility of Council information
iii)    Access to Library information
·         Access to the Library Catalogue
·         Access to local history materials
  Digital formats
  Local history collections
  Local newspapers
·         Accessibility
  Within libraries
  Mobile library provision
  Meeting special needs
  Telematic access
iv)    Management information
·         Stock management information
  Stock use analysis
  Stock age analysis
·         Statutory statistical requirements
  Audit Commission
·         Performance indicator measurement
·         Financial information & control
  Stock ordering management
  Book fund analysis
  Budget monitoring
v)      Administrative requirements
·         Word–processing
·         Contact management — shared resources
  Council telephone numbers
  Emergency numbers
  Professional contacts
·         Bibliographical information (all formats)
  Records for requests
  Bibliographical records
b)      Staffing/support issues
i)        Training
·         Designing systems to minimise training needs
·         Commitment to empowering staff — enabling local problem–solving
·         Resources for delivery
  Staff time
  Cover for staff being trained
  Opportunity and resources for preparation
ii)      Support
·         Ownership issues
  Enabling local problem–solving
  Spreading skills widely
  Agreeing what can be expected of staff locally
  Support structures within the Library Service
  Professional Librarians taking responsibility for library systems
  Other staff
  Third–party support
  Within the Council
  Cost implications
·         Providing support at the front line
  For staff
  For the public
c)       External pressures
i)        National issues
·         New Library: People’s Network
·         National Grid for Learning
·         Audit Commission: Due for Renewal
·         Information for All
ii)      International perspectives
·         European information issues
  Information 2000
  IRISA–LAPSA & successor organisations
·         Internet access
d)      Technological possibilities
i)        Dynix library system
·         OPACs
  Graphical interfaces
  Web PACs
  OPACs for special needs (e.g. Libris Envisage)
  Off–line PACs (“OffPACs”)
·         Cash management
·         Acquisitions
  Order management
  Supplier performance monitoring
·         Community Resources
  Community organisations
  Newspaper indices
·         Internet publishing
ii)      PC–based systems
·         Word–processing
·         Spreadsheets
·         Custom databases
·         SQL–compliant systems
·         Electronic reference materials
·         Tutorial materials
·         Web browsers
·         Windows NT networks
·         Intranet systems
iii)    Unix–based systems
·         Electoral roll
·         Corporate financial data
iv)    Self–service opportunities
·         Reservations
  Web–based telematic systems
·         Circulation
e)       Restraints
i)        Funding for projects
·         Capital funding
·         Revenue funding for maintenance, etc.
·         Staffing costs
ii)      Expertise
·         Within the Library Service
·         Within the Council
  Knowledge of library systems
iii)    Staff time
iv)    Support issues (see above)
v)      Corporate policies

2         Suggested outcomes
a)       Dynix Library System
i)        Aim to stay with Dynix for the period of this strategy provided the development and support of the system meets the needs of the Library Service.
·         Join forces with other Dynix users to lobby for enhancements to the system to meet outstanding needs
·         Keep a watching brief as to the development of the product as a graphical system based on PCs 
·         Pending proper reassurance on the life expectancy of the Dynix system, add remaining libraries to the system
·         Implement procedures to automate parts of the stock editing system
  Transfers of some fiction collections
  Flagging up “tired” or “under–used” stock
  Collection inventory systems
·         Implement the Acquisitions module — automating the order process; book fund monitoring; supplier performance monitoring
·         Developing the use of Community Resources
  Making the Community Organisations database available
  On Council intranet via WebPAC technology
  On World–Wide Web via WebPAC technology
  Making local newspaper indices (currently in card format) available on OPAC
  Could be made available on the World–Wide Web and Council intranet via WebPAC technology
·         Investigating the effectiveness & viability of the Dynix Cash Management system
·         Develop new OPAC functions
  Kids’ Catalogue
  Graphical interfaces
  Publishing the Library Catalogue on the World Wide Web, including allowing “WebPAC” access to search for particular items
  Requires web access to the processor housing the library system
  Allows the possibility of self–service reservation via the WWW
·         Telephone access
  Investigate the effectiveness and viability of potential options
  Automated telephone renewals
  Automated telephoned messages — overdues/charges/reservations notices
ii)      Where possible, create interfaces between the Dynix Library System and PC–based software
·         Management information
  Collating data between Dynix and proprietary software (e.g. for spreadsheet analysis)
  This would require a third piece of software to act as an interface between the two
  Investigating the effectiveness & viability of the Dynix “Executive Information System” — data warehousing/reporting system
·         Catalogue information
  Envisage — OPACs for visually–impaired
  Off–line copies of the Catalogue on CD–ROM (using the data conversions already taking place for Envisage)
  Replacing microfiche at off–line libraries
  Replacing microfiches as “back up” at on–line libraries
  Could be available on laptop on Mobile Library
  Could be available on laptop on Housebound Service
·         Using WWW technology (especially Java) to enable telematic delivery of library services (available from Dynix release 162E)
  Access to the Library Catalogue
  Potentially including placing reservations
  Access to library information
  Community Organisations
  Newspaper indices
  Requires a change in Council policy on web access to networked data
  Requires web access to the processor housing the library system
b)      Workstation access to other information systems
i)        PC–based information
·         Access to PC–based electronic references
  Networked electronic references based on a Windows NT server
  Allows resources to be shared between libraries
  Expensive references kept secure centrally
  CD–ROMs cached to hard disk to improve speed of access
  Using existing networks where feasible
  Networking dependent on licensing regimes — some material may not be networked; some networked licences may be prohibitively expensive
  May be able to derive use statistics for individual references
  Potential for differential access — different reference materials for different client groups
  Intranet connection with the rest of the Council
  Council information
  Statutory papers/minutes
·         Library intranet
  Staff manuals in hypertext format
  Staff notices
  Staff training materials
  Contact information
  Shared bookmarks for WWW
ii)      The Internet (more probably, just the WWW)
·         Staff workstations
  Including email
  Access to appropriate lists
·         Public workstations
  See below
iii)    Other systems
·         Reference Enquiry Desk workstations access to Electoral Roll
·         Administrative/management systems
  Financial systems
  Suppliers database
c)       PC–based services to the public
i)        Extension of Open for Learning facilities
ii)      Maintenance of text–reading and Brailling facilities
iii)    Word–processing facilities
·         Support?
·         Cost of consumables?
iv)    Computers Don’t Bite and successors
d)      Public Internet access
i)        Controlled access
·         ?Timed out by software on PC
·         ?Time booked and issued to “borrower”
·         Use of Net Nanny et al. to prevent inappropriate access
·         ?Users registered, including disclaimers
  Promise not to look for illegal/immoral materials (how defined?)
  Promise not to hack machines
  Accept responsibility for own actions
  Agree that Library Service not responsible for anything the customer does during their time on the Internet
  Accept that service may be withdrawn from the customer if they break the rules
·         Costs
  To Library Service
  Service provider subscriptions
  Line costs
  To the customer
  ?How to be as inclusive as possible
  ?How nominal is a nominal charge
  ?Charge period — session or season
e)       Local history collections
i)        Access to the Local History Catalogue
·         Staff at Heywood and Middleton need access to the Catalogue at Rochdale
  Training load
  Is there OPAC access?
  Why not Dynix at Local Studies?
  Training load
  Why not both in both places?
  The Library Catalogue on Dynix will be searchable from a Web browser with 162E. If both catalogues Z39.50 compliant, both could be searched from a Web browser (as per draft Telematics strategy)
·         Practical issues
  Large cataloguing task
  May require some modification of the Dynix Catalogue record to include additional MARC fields (e.g. additional media fields; URL tags; image data)
  Would need to look at how other libraries/museums approach the job
ii)      Digitising local history materials
·         Security of primary materials
·         Making materials more widely accessible
  PAC–style picture catalogues
  Intranet/WWW access
·         Pictures
  Scanned into JPEG format
  ?Dynix (how catalogued?)
  ?Proprietary database system
  PAC access
  ?Proprietary system
  Electronic watermarking for copyright purposes
·         Documents
  Copyright issues
  Investigate routes taken corporately through Document Imaging Working Group
  Investigate effectiveness/viability of scanning and OCR (optical character recognition) systems
f)        Office applications at main libraries
i)        Word–processing facilities — MS Word
ii)      Use of MS Word and MS PowerPoint for DTP/notices
iii)    MS Excel for spreadsheet analyses
iv)    MS Access for local databases (created by Systems Manager for local use — e.g. for enquiry desk statistics)