Thursday, 31 July 2014

Data sharing between libraries

We're at the stage in the evolution of the AGMA library consortium where we're starting to work through the practical — and legal — implications of shared services.

  • Sharing our catalogue data is relatively easy: the data standards are well-established and most of the data itself is published in the public domain on library OPACs, etc. Which isn't to say it was all plain sailing, and we've still got some more work to do. 
  • Sharing borrower data is obviously fraught with all sorts of information governance and data protection issues, on top of the problem that there isn't any data standard save that imposed by the structure of our shared LMS and the commonalities we've discussed and agreed on a case-by-case basis.
  • Virtually every circulation dataset is a back door into the borrower data.
I've been thinking through some of the questions we need to be asking ourselves on this journey. It's still early days, so this isn't exhaustive; at this stage I'm trying to work out what we need to worry about at a general level prior to starting work on a risk analysis.
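As an aside on that "back door" problem: one common mitigation is to pseudonymise the borrower identifier before circulation data leaves the customer database, so transactions can still be linked and analysed without carrying the customer details with them. A rough sketch of the idea (the field names and the keyed-hash approach are mine for illustration, not anything built into Spydus):

```python
import hashlib
import hmac

# The key would be held securely and managed under the data-sharing
# agreement; anyone holding it can re-identify borrowers.
SECRET_KEY = b"replace-with-a-securely-managed-key"

def pseudonymise_borrower_id(borrower_id: str) -> str:
    """Replace a borrower ID with a keyed hash: the same borrower always
    gets the same token, but the token alone identifies nobody."""
    return hmac.new(SECRET_KEY, borrower_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# A loan record with illustrative, made-up field names.
loan = {"borrower_id": "B0012345", "item": "bib-778899",
        "location": "Central Library", "status": "On loan"}

shared = dict(loan)
shared["borrower_id"] = pseudonymise_borrower_id(loan["borrower_id"])
```

Because anyone with the key can reverse the mapping, the key itself would need the same protection under the agreement as the borrower data it stands in for.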

The breakdown below works through each purpose in turn, giving the type of information shared, the recipients, the data controller and my notes/queries.
Purpose: Membership information including contact details (voluntary service: customers will be asked if they want to opt in)

Type of information: Customer name, address and contact information, DOB; disability, ethnicity and other demographic details; family relationship details; lending history

Recipients: Library staff (including all other authorised Spydus users) of approved authorities within the scheme

Data controller: Local Authority (the Data Subject's Local Authority will be the data controller)

Notes/queries:

  • Which data is to be shared? Is it all or nothing? If partial, which parts and how would they be managed?
  • The same question applies to who the data is being shared with. What would be the position of volunteer-managed community libraries?
  • How do we switch sharing on and off? What happens if a customer changes their mind? How are they “quarantined?”
  • What happens to the data held in loans, charges and reservations?
  • What happens to any outstanding loans, fines and charges?
  • Who owns (and is responsible for) the data?
Purpose: Loans information

Type of information: Details of the loan including borrower, item, location and status of loan; loans history

Recipients: Library staff. Specific customers can see all details of their loan(s); all customers can see some details of the loan(s)

Data controller: Local Authority (which?)

Notes/queries: This is the crucial element to be managed:

  • It is the purpose of the data-sharing agreement
  • It is the bridging element between the personal customer data and nearly all the other data sets

There is a hierarchy of viewing permissions.

If a customer has said “no” to data-sharing, how is the borrower data in the loan, charges and reservation records expressed? If the customer changes their mind about sharing their data, is it automatically redacted from these records?

Who owns (and is responsible for) this data?

Whose loan policies apply?

  • Applied from the lending library?
  • Including fines and charges?
  • How do exceptions apply?
  • “Non-default” borrower types and collections
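On the “quarantine” question: one plausible mechanism, sketched here with made-up field names rather than anything from the Spydus data model, is to strip the borrower-identifying fields from a loan record while leaving the operational loan data intact:

```python
from datetime import date

# Borrower-identifying fields in a loan record (made-up names).
PERSONAL_FIELDS = {"borrower_id", "name", "address", "telephone", "email"}

def redact_loan_record(record: dict) -> dict:
    """Return a copy of a loan record with the borrower-identifying
    fields blanked out and the operational loan data left intact."""
    return {field: ("REDACTED" if field in PERSONAL_FIELDS else value)
            for field, value in record.items()}

loan = {"borrower_id": "B0012345", "name": "A. Borrower",
        "item": "bib-778899", "due_date": date(2014, 8, 14),
        "status": "On loan"}

quarantined = redact_loan_record(loan)
```

Whether the redaction happens at source or only at the point of sharing, and whether it is reversible, are exactly the policy questions raised above.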
Purpose: Overdue/pre-overdue notices

Type of information: Contact details including borrower name, address, telephone and email; loan due dates and items involved

Recipients: Library staff; specific customer

Data controller: Local Authority (which?)

Notes/queries: Derived from loans data and subject to the same questions. It would make sense to aggregate these to improve efficiency and save costs (see notes on charges, etc.)
Purpose: Reservations

Type of information: Contact details including borrower name, address, telephone and email, and items requested

Recipients: Library staff; specific customer

Data controller: Local Authority (which?)

Notes/queries: All the questions for loans apply to reservations (which are effectively loans-in-waiting).

Whose charge régime applies?

Would the Data Controller be the “owner” of the customer record, the library that placed the reservation or the library it will be picked up from (if a different library authority)?
Purpose: Requests

Type of information: Contact details including borrower name, address, telephone and email, and items/articles requested

Recipients: Library staff; specific customer; ILL system (bibliographic and/or article data only)

Data controller: Local Authority (which?)

Notes/queries: In nearly all respects as reservations, just with more complicated charges.

[The operating procedures would probably need modifying in the light of the shared lending environment.]

This will need to be revised in the event of a fuller integration with UnityWeb or equivalent third-party systems.
Purpose: Notifications for any reserved items

Type of information: Contact details including borrower name, address, telephone and email, and items requested

Recipients: Library staff; specific customer

Data controller: Local Authority (which?)

Notes/queries: Derived from reservations/requests data and subject to the same questions. It would make sense to aggregate these to improve efficiency and save costs (see notes on charges, etc.)
Purpose: Charges/fines/fees

Type of information: Contact details including borrower name, address, telephone and email; details of the transaction that generated the charge

Recipients: Library staff; specific customer

Data controller: Local Authority (which?)

Notes/queries: Derived from loans and reservations/requests data and subject to the same questions.

How will these be managed?

  • Payable only where incurred?
  • Payable globally?
  • Impact on traps/alerts (whose parameters apply?)

In the event of recovery, who legally owns the charge?

In the light of the above, what would be the effect (if any) of aggregated notices?
Purpose: Catalogue/discovery records — bibliographic data

Type of information: Title-level catalogue data

Recipients: Library staff; library customers and the general public

Data controller: Local Authority (which?)

Notes/queries: Bibliographic data is already shared data.

Don't forget that there is a link to the borrower record from the review/rating in the bib data in Staff Enquiry:

  • Potentially this links to more than one Data Subject, so which would be the Data Controller for this catalogue data?
  • Shared responsibility? How?
  • Similar questions are required of other customer-created content such as tags (these are lost in the current versions of Spydus 9)

(Not all data are published for the public.)
Purpose: Catalogue/discovery records — holdings/item-level data

Type of information: Catalogue data, including electronic holdings

Recipients: Library staff; library customers and the general public

Data controller: Local Authority (which?)

Notes/queries: Holdings data links to personal data via loans/loan history and status/status history:

  • Potentially these link to more than one Data Subject, so which would be the Data Controller for the catalogue data?
  • Logically it should be the owner of the holding item

(Not all data are published for the public.)
Purpose: Management Information/Business Intelligence

Type of information: Reports detailing usage of service, per location

Recipients: Library Managers

Data controller: Local Authority (the Data Subject's Local Authority will be the data controller)

Notes/queries: Essentially this should be summary data, though we'd need to have safeguards against breaches caused by very small sample sizes.

Proper safeguards and risk analyses are required before making this data available to third parties.
Purpose: Demographic breakdowns

Recipients: Library Managers; designated authorised analysts

Data controller: Local Authority (the Data Subject's Local Authority will be the data controller)

Notes/queries: Most would be summary data, though we'd need to have safeguards against breaches caused by very small sample sizes.

Some data (e.g. lists of postcodes) are granular enough to easily identify Data Subjects, so safeguards on the use and presentation of this data are required before making it available to third parties.
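For what it's worth, the small-sample safeguard can start very simply: suppress any cell in a summary table that falls below a minimum count. The threshold of five below is a common rule of thumb, not a statutory figure, and the postcode counts are invented:

```python
SUPPRESSION_THRESHOLD = 5  # rule-of-thumb minimum cell size

def suppress_small_cells(counts: dict, threshold: int = SUPPRESSION_THRESHOLD) -> dict:
    """Replace any count below the threshold with a marker so that
    individuals can't be inferred from very small groups."""
    return {category: (count if count >= threshold else "<%d" % threshold)
            for category, count in counts.items()}

# Illustrative usage counts per postcode district.
usage_by_postcode = {"OL16 1": 120, "OL12 9": 4, "OL11 3": 57}
safe_report = suppress_small_cells(usage_by_postcode)
# The "OL12 9" cell is suppressed: only four customers fall in it.
```

On its own this doesn't defeat differencing (comparing two reports to infer a suppressed cell), so it's a floor, not a full safeguard.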
Purpose: Marketing databases

Recipients: Library Managers; designated authorised marketing staff

Data controller: Local Authority (the Data Subject's Local Authority will be the data controller)

Notes/queries: Is the “I agree to receive marketing” (or equivalent) field global or local?

The selection of data must be explicitly limited to those customers who have agreed to contact, so as to comply with the Privacy and Electronic Communications Regulations.

Proper safeguards and risk analyses are required before making this data available to third parties.
Purpose: Stock management data

Recipients: Library staff; designated authorised third-party service providers

Data controller: Local Authority (which?)

Notes/queries: Nothing pertaining to Data Subjects should be included in this data.

Stock ownership should be straightforward. Stock usage is more problematic:

  • Global usage figures recorded against bibliographic/holdings data?
  • Local usage only?
  • How would (if at all) third-party stock analysis systems like CollectionHQ differentiate between local and extralimital use?

In the early days at least there will be pressure to provide evidence that stock is being used “fairly”, with local library customers having first dibs on local stock.
Purpose: Ad hoc data requests

Recipients: Library Managers; designated authorised third parties

Data controller: Local Authority (the Data Subject's Local Authority will be the data controller)

Notes/queries: Most would be summary data, though we'd need to have safeguards against breaches caused by very small sample sizes.

Some data (e.g. lists of postcodes) are granular enough to easily identify Data Subjects, so safeguards need to be in place on the use and presentation of this data.

Proper safeguards and risk analyses are required before making this data available to third parties.

FoI requests would be subject to the proper exclusions.
Purpose: SIP2 data

Type of information: Data used for interfacing between Spydus and third-party systems

Recipients: Library staff; specific customer

Data controller: Local Authority (the Data Subject's Local Authority will be the data controller)

Notes/queries: The particular case at the moment would be where data held in the customer record determines whether or not the customer has access to third-party systems and services.

  • Would the data be determined globally or locally?
  • Standard use of data fields?
  • Standard coding sets?
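For anyone not familiar with it: after a fixed-length header, a SIP2 message carries variable-length fields, each a two-character identifier followed by its value and terminated by a pipe (AO for the institution ID, AA for the patron identifier, and so on). A minimal sketch of pulling those fields out, simplified in that it ignores the fixed header, sequence numbers and checksums, and the field values are invented:

```python
def parse_sip2_fields(field_block: str) -> dict:
    """Parse the variable-length portion of a SIP2 message: each field is
    a two-character identifier followed by its value, ending with '|'."""
    fields = {}
    for part in field_block.split("|"):
        if len(part) >= 2:
            fields.setdefault(part[:2], part[2:])
    return fields

# An invented variable-field tail from a patron request:
# AO = institution ID, AA = patron identifier, AD = patron password.
tail = "AOAGMA-CONSORTIUM|AA210012345678|AD9999|"
fields = parse_sip2_fields(tail)
```

The point for the data-sharing questions is that fields like AA travel over this interface in clear text unless the link itself is secured.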

I'd be interested to know if/how this analysis sits with the experience of established consortium libraries, especially if I've missed something that could cause us problems.

Sunday, 15 June 2014

Library audiences: talking to the other 54%

A couple of months back I had an interesting Twitter conversation with some local government comms folk which got me wondering why so few English public library services have much in the way of child-centred content on their web sites. I asked the very splendid Ian Anstice if he could canvass for examples of good practice on his Public Libraries News site. The good news is that he got some positive responses, including Devon's "The Zone" and Stories from the Web; the bad news is that there are so very few examples. Ian's musings on this point are here.

For me there are a few contributory factors to this famine:

  • The web is still seen as largely "something other" to the public library's service offer. At best a way of promoting activities in the library and somewhere to keep the catalogue and the e-books; at worst an abstraction of resources from beleaguered libraries. (There are plenty of exceptions to that rule, thank Heaven!)
  • If you're doing it right it's going to take time and people to do it. These are increasingly scarce resources.
  • It's difficult to reconcile the needs of a children's page with those of a council's corporate branding, particularly if the brand requires a single monolithic corporate voice.
  • It's a complex and sometimes unforgiving audience: what's great for a five-year-old may be acceptable to a seven-year-old but puerile to a nine-year-old and beyond the pale to an eleven-year-old.
  • There's often a confusion as to whether the audience is the child or the parent. Ironically, the younger the intended audience the older the people you're going to be talking to.
Despite these problems there are still some things that can be done without too much expense and hassle.

Customer-created content
There are easy ways of adding children's voices to your content:
  • If your OPAC includes the facility to publish readers' ratings and reviews, actively encourage children's reviews. You might need to post them on their behalf; if so, an ethical solution would be to set up a dummy customer account specifically for posting them. If you have children's reading groups, encourage the groups to post their reviews, too.
  • You'll probably already include links to writers' web sites with the rest of their works in your catalogue; why not also include links to appropriate fan sites?
  • If your children's reading groups have their own web pages link to them from your site.
  • Many OPACs have the facility to build saved lists and incorporate them into URLs to create canned searches. Canvass ideas for reading lists, "top tens" and the like and build them into links on your site. If you can present these as carousel galleries of book covers - yay!!!
  • If you have good working relationships with schools and youth workers, get them involved, too.
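To illustrate the canned-search idea: a canned search is just a bookmarkable URL with the query baked in, so a reading list can be generated from a handful of titles or search terms. The endpoint and parameter names below are placeholders; substitute whatever your own OPAC actually uses:

```python
from urllib.parse import urlencode

# Placeholder endpoint and parameter names: substitute your OPAC's own
# search URL syntax here.
OPAC_SEARCH_URL = "https://opac.example.org/search"

def canned_search_link(terms: str, collection: str = "children") -> str:
    """Build a bookmarkable 'canned search' URL for a reading list entry."""
    return OPAC_SEARCH_URL + "?" + urlencode({"q": terms,
                                              "collection": collection})

# A "top ten" list becomes a set of ready-made catalogue links.
top_titles = ["Gangsta Granny", "Diary of a Wimpy Kid"]
links = {title: canned_search_link(title) for title in top_titles}
```

Dropping links like these into a children's page costs nothing beyond working out your OPAC's URL syntax.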

    Changing rooms

    If you can't have child-centred pages on your council's web site, can you provide separate versions of your OPAC for children?
    • The basic model would just be to have a version of the OPAC that's limited to the children's collections (this is where we're at in Rochdale at the moment).
    • A modification of this would be to change the wording for this version's home page and search forms (which could probably do with simplifying anyway). You'll need to be pretty clear about which particular audience(s) you're addressing here. You might want to do a CBeebies/CBBC split.
    • If you have a useful working relationship with your comms people and if your corporate brand is either flexible enough to deliver or allows permissible exclusions in particular circumstances, then you could do some interesting work on the stylesheets, etc. to make the look and feel more friendly. This isn't necessarily about using primary colours and Comic Sans (catalogue records look really horrible in Comic Sans, I've tried it). It's usually about: 
      • Simplifying elements, or eliminating them altogether. Is the link to the corporate web site useful to a nine year-old?
      • Adding pages specifically aimed at your audience. The obvious ones would be your help pages.
      • Illustrating ideas and instructions with graphics.
      • Perhaps even having its own character-based branding (like Bookstart Bear).
    These are just a few potential quick-win options. Given time and resources there's a lot more that could be done but I think there's a danger of ignoring the basics in pursuit of the cutting edge and sexy.

    Sunday, 11 May 2014

    Library competencies

    WebJunction has just compiled an update to its Competency Index For The Library Field. It's an interesting read, not least because the table of contents alone provides a challenging checklist: does your local library authority include all these high-level skill sets?

    Sunday, 6 April 2014

    The latest DCMS review

    I admit it, I missed the boat. By a long chalk. And so I didn't submit my views to the latest public consultation on public libraries. A combination of too many ideas, too little time and self-discipline, and worrying overly much about how I'd make any of it fit the three questions asked by the commissioned group.

    I've no illusions that I would have made a ha'penny's difference but here are my workings out, in case anyone can use any of it:
    At the outset I would like to wish you the best of luck with this latest review of the public library service in England, and to hope that, whatever your conclusions, they are operationally practicable and support at least a decent-quality library service for our communities. You start with a serious handicap: DCMS announcements of reviews of the English public library service are a seasonal thing, like the first cuckoo of spring or the first M&S advert of Christmas. Each comes and goes, and together their total operational impact in the real world has been the square root of jack all. This appalling legacy is going to colour too many of the views you are likely to hear. Including mine, unfortunately. I would love it if you could confound my cynicism.
    • There is a crying need for national leadership.
      • Now the Olympic Games are over and done with, what is DCMS for? Over the past decade — aside from the occasional launch of an enquiry into the public library service — the department’s engagement with the service has been not so much arm’s length as running a mile from.
      • The public library service in England is undefined, at best weakly supported and subject to no performance management.
      • Regulatory guidance on the delivery of the service is virtually non-existent.
      • The potential for improving the efficiency and effectiveness of the public library service by pooling resources and delivery channels across local geopolitical boundaries is being driven patchily at the local level at the same time as long-standing sharing mechanisms are being abandoned or left to wither on the vine.
      • The English public library service is not an integral part of a national literacy programme, a national digital literacy programme or a national information literacy programme, despite the huge amount of good work being done locally in these areas by many, if not most, library authorities.
        • DCMS is not demonstrating that it knows or much cares about:
          • How many public library buildings are currently still in use;
          • What other delivery channels are being made available for library services;
          • What services are being delivered by these delivery channels;
          • Whether or not these services adequately reflect the needs of the communities they serve;
          • What resources should be employed to provide these services.
    • Nobody knows what the public library service is. Everybody has an opinion, nobody has an empirical measure and there is no bottom-line base level of service that can be expected nationwide.
      • The sad fig-leaf that is the 1964 Act provides a fine-sounding but practically-useless sound bite. The sole practical impact of the Act is that public libraries used to get listed under “Statutory” rather than “Discretionary” when the auditors came round to see how well the local council was doing.
      • There is a view that if a building has had the word “library” stuck on it some time in its lifetime and the doors are still open then all is well in the world.
      • There is another view that so long as a building is open to the public and has some books for loan that it is a public library.
      • There is yet another view that wonders why, so long after Erasmus talked about “libraries without walls” and after nearly two decades of public libraries’ beginning to deliver their services online, English public library services are so often defined by the buildings with the word “library” stuck on them not the services being provided and delivered, often outwith those library walls.
      • Ironically, while there is a long-standing UK standard specifying the base common denominator functions for a library management system there isn’t a similar baseline specification for the service such a system would be supporting.
      • There are no baseline metrics for the public library service. The old public library standards were limited in scope and flawed in definition but they at least required that some attempt at performance management and the accumulation of business intelligence was being made. One would not want the public library service to be defined only by what could be measured (worse still only what could be measured forty years ago!) but any credible argument that the service being delivered is anything more than “the doors are open, end of story” must be supported by robust data. CIPFA returns provide some useful data but this is limited, not always freely available and not at all concerned with outcomes. Benchmark data should include:
        • Traditional transactional and visitor throughputs.
        • Outcomes of programmes of library activity.
        • Demographic engagement and outcomes — a demonstration that the service is serving its communities and not just providing services “for people like us by people like us.”
        • Stock analyses, including data on special collections, reserve stock and specifically-local elements (not just “local studies” collections). This would also include contextual age-related data — a collection of Victorian books in a special collection is a matter of interest, a collection of fifteen-year-old children’s picture books is a matter of concern.
        • Performance at each service point, including buildings, outreach and digital channels. Transactional data at library buildings normalised to numbers per staff hour so that variations from the norm can be readily identified; while there should be some variation in response to the needs of the local community other variations may be cause for concern.
        • Analyses of delivery channels both within and without the library buildings managed by the service.
      • Once benchmark data had been established, openly-reported trend analyses should include:
        • Patterns of change of use;
        • Patterns of replacement of use — this might be as simple as 78s being replaced in stock by MP3s or as complex as a community of use migrating from one library to another;
        • Contextual commentary — for instance a note of the impact of the school next door closing; a new motorway cutting off a community from its library; or the involvement in a new programme of activities.
  • There is a need for the availability and application of librarianship skills at a community level. (This is not a call for a quota of “professionals” in each library authority: this has been tried before and too many of us have experience of working alongside librarians who were doing nothing that the “unqualified” library assistants were doing at least as well.) The librarian is a means to an end, not an end unto itself.
    • The creation of local, parochial bibliographic metadata is culturally- and economically-beneficial to our communities. This is not limited to the traditional form of local studies collections — though these may be seen as an important component of the Arts Council’s commitment to the accessibility of the nation’s heritage.
      • Small-scale publication — especially self-publication — is easier than ever, particularly in e-book formats. There is a very real danger that much of this material will be permanently excluded from the national bibliography. Librarians, working with local authors and publishers should be tasked with the creation and publication of the appropriate metadata.
      • Many titles have a geolocational context that is not recorded or reflected in the commercially-available metadata. Making this local context available provides a hook for the recreational reader; resources for researchers and for teachers creating reading and learning materials; and support for literature-based community activities and tourism programmes.
    • A national audit is urgently needed of those special collections not already dispersed, dissolved or disposed of as a result of austerity measures. In particular it is important to find out how much — or little — of these have been catalogued and published electronically so that a programme of work can be set up to address the oversights.
    • Community knowledge bases.
    • Grey literature.
    • Information literacy.
    • Local Freedom of Information libraries.
  • Engagement with the digital world
    • Digital inclusion/digital literacy
    • Digital libraries
    • Integrating the virtual and physical worlds
    • Crowdsourcing literary engagement
    • Curating user-created content
    • [All that stuff you’ve been arguing for fruitlessly for the past decade]
  • Staff development and continuous service improvement
    • Essential — needs to be resourced and needs a proper framework for all staff
    • Need to avoid replicating the errors and missed opportunities of the NOF-funded training programme for supporting the People’s Network — no “magic bullets” like ECDL
    • Training needs dovetailing with service development needs
    • Anticipating the support needs of communities and customers
  • Use and management of volunteers
    • Complementary to paid staff
    • Needs to be fair to the volunteer — what’s in it for them?
    • Needs to be fair to the service — what’s in it for them?
    • Needs to be fair to the community — what’s in it for them?
    • Not an easy management win
      • Greater churn than paid staff — constant need for recruitment and training support
      • Too little good supervision of remote front-line staff at the moment — how would the same managers add supervision of volunteers to their portfolio?
      • Discipline and behaviour (this is true of all staff — not just volunteers — but fewer available sticks and carrots)
      • How to manage reputational damage when things go wrong?
    Tuesday, 1 January 2013

    Change management: I'm asking you questions because I'm trying to help you get it right

    If you were to say to me: "You have to make the following changes to your library management system," my response would be: "Perhaps. But not yet." This isn't me being precious or obstructive; this is me doing my job. There are times when the brown stuff is hitting the fan and you have to do something in a hurry but most of the time it isn't; and even when it is you need to go back and check your workings-out when the fuss has died down.

    If you're working in an ITIL environment — and I am — the assumption has to be that you do what the customer asks, so long as it doesn't screw up the integrity of the system that you're managing. So I'd need to ask you a series of questions to make sure that it doesn't. It's important to point out that under ITIL it doesn't matter whether or not the requested change plays Hob with the business; so long as the system remains intact my job would be done. So, for instance, if you were to ask me to set the library loan period to two hours, with a £100 per hour overdue rate I'd be perfectly entitled to raise my eyebrows and ask: "Are you sure?" but the default position is that the change would be made. The customer is always right, within the confines of their rôle and competencies. (I'm not being "neoliberal" here: "customer" in this context has a particular definition.)

    That's the principle of the thing. In reality it's a bit more complicated because we want to avoid the dialogue: "It doesn't work." "It does work, it just doesn't do what you wanted it to do." This is a dismal and unproductive conversation which could do serious damage to the working relationship so we make the effort to avoid it. So I'll ask a few more questions:

    "What do you want to do?"
    It's astonishing how often this question causes a problem. If you don't know what you want to do, how will you know when you've done it? How will you know if the proposed change will address the issue to hand? And if it is the solution to a problem, is it the best solution? You'd be surprised how often the first applied solution to come along becomes accepted as an essential component of the process, regardless of the impact on the efficiency or effectiveness of the business. Just because you know how to pick a lock in two minutes with the aid of a hairpin doesn't mean you'd necessarily want to throw your front door key away.

    "Who does this affect and are they OK with it?"
    Systems and services don't live in hermetically-sealed bubbles. At least have a think about who's involved and/or do an outline impact analysis on the back of a fag packet. And do make sure that anyone affected by the change knows about it and what it means to them. If the impacts are big and scary enough you may need to sketch out a communication strategy for them.

    "What happens if it goes wrong?"
    Give the risks a degree of respect. Don't assume nothing could go wrong or they'll come and bite you on the bum. Make sure you know what could happen if it goes wrong; what the impact would be; and that you have a Plan B, a safety net and/or the capacity to go back to where you started from.

    "What do you mean by...?"
    Make sure you're talking about the same thing with the same meaning. "Better," "Improved" and "Modernise" are words that should be deprecated in this conversation: what do they mean in the working context? For instance, a set of catalogue records may be more complete, with every tag full of data; or may exhibit a purer adherence to current cataloguing standards; or may be Dewey classified to fifteen decimals, but is it actually better? For whom? You may need to sketch out a quality description document for changes to key data or even a quality plan if you're talking about large-scale fundamental changes.

    "How will you know if it's worked?"
    Because we want to avoid that dark and dismal dialogue, right?


    Monday, 31 December 2012

    Lessons Learned

    It being the end of the year, and a time for reflection and review, I thought I'd put down a few thoughts on a process that's sadly neglected by many library projects: Lessons Learned.

    In my experience, too often the lesson learned is: "We seem to have managed that in the end, so it's OK to fly by the seat of our pants next time, too." This is an opportunity missed: experience is not what happens to you, it's what you do with it. If what you do with it is nothing then the experience is lost. So it's important to build the Lessons Learned process into any project.

    The purpose of a Lessons Learned Document is to capture the experience accrued by the project in a formal document for use on similar future projects, including:
    • Problems that occurred, how they were handled and how they may be avoided in the future.
    • What went well with the project and why, so that other project managers may capitalise on this experience.
    It is not the purpose of a Lessons Learned Document to apportion praise or blame.

    This document should be used to support the continuous service improvement processes within the organisation.


    Just to put my money where my mouth is, these are the recommendations from the lessons learned process from our project migrating from Dynix to Spydus:
    1. A specification of operational functions is essential for the Statement of User Requirements. The more explicitly practical and measurable the more robust the selection process in procurement.
      • Actively encourage staff input in the specification process to get ideas for the specification and buy-in for the project.
      • Actively investigate other solutions and technologies so that you aren’t just doing a like-for-like replacement and limiting yourself to established business delivery models.

    2. Before a procurement process that you own begins you need the following:
      • What is the process? What are the critical paths?
      • Who are the stakeholders — customer/ project/ procurement/ legal/ partners/ whoever
      • Who is/are responsible for doing each step of the process?
      • What information/ documentation is required for each step?
      • When do you know each step has been completed?
      • Has this all been agreed by all the stakeholders?

    3. Agree a Project Initiation Document and work from it.
      • Make sure that you know who is doing what and in what order.
      • Make sure you know what isn’t to be done.

    4. Work to the project:
      • Make sure that you’ve agreed who is doing what and in what order.
      • Make sure everyone knows what isn’t to be done.
      • Have clear lines of communication.
      • Get together regularly to review progress and, where necessary, revise action plans.
      • Allow at least three weeks between Subject Expert Training and Train the Trainers to allow options to be explored, modelled and tested for use (particularly with new functionality) adequately.
      • Train the Trainers is an opportunity to test the commissioning to date. Allow at least a week between Train the Trainers and the first batch of Cascade Training to test the safety of any changes.
      • Have cut-off points for commissioning changes:
        • No changes to codes and data structures after data migration.
        • Severely limit the number of system parameter changes after Train the Trainers.
        • Admit no changes to any part of the system (except in emergency) on the day you go live.


      • Make sure that the technical infrastructure requirements are included in the Statement of User Requirements and agreed with the supplier before commencing the installation.
      • The OPAC is an integral part of the system, not an add-on, so it needs to be treated as part of the whole.

        • The training for the management of the OPAC needs to be included in the Subject Expert programme.
        • Make sure that all the people having input to the commissioning of the OPAC understand its purpose and function.

      • Spending time cleaning up the data using familiar tools in the system you know saves a considerable amount of time, effort and problems with both the data migration and the operation of the new LMS.
      • Prepare for the MARC21 environment by making sure the existing catalogue data maps at least adequately and by making sure that there is sufficient MARC21 expertise within the organisation to verify that it does.
      • It’s useful and important to see how a reference site uses a process.
        • It’s important to make sure that the ‘right’ experience of a site visit is realised: be clear beforehand about what the experience needs to be and proactively manage distractions.


      • Any library service that is not already used to MARC cataloguing should make sure well before the Subject Expert Training that:
        • There is sufficient expertise for the catalogue data mapping process.
        • Staff who will be using catalogue processes (including acquisition via EDI) understand at least the basics of the format.
        Thursday, 4 October 2012

        Remnants from Dynix

        When we migrated from Dynix to Spydus I was keen that we didn't lose more management information than we needed. I ran copies of the canned reports and all the usual suspects but there were two things I particularly wanted to preserve:

        • I wanted to be able to give the stock manager an overview of the state of the collections at each library.
        • We had 21 years' worth of data in Dynix's Statistical Reports Manager and I wasn't keen to lose all that management information.

        So I wrote some Recall reports to strip out the data in text format. These are the Recall Vocs and the local dictionary items that I used.

             COLLECTIONS.REPORT
        0001 PA Saved at 15:15:22 23 SEP 2005 by steveh
        0002 CS
        0003 DISPLAY
        0004 DISPLAY
        0005 DISPLAY          The system is now sorting all the Holdings records
        0006 DISPLAY          A report will be sent to the screen
        0007 DISPLAY
        0008 DISPLAY          This will take some time
        0009 DISPLAY
        0010 DISPLAY          If you want a copy of this report in Word or Excel
        0011 DISPLAY          please ask the Systems Manager
        0012 DISPLAY
        0013 DISPLAY         working.....
        0014 DISPLAY
        0015 SORT HOLDINGS BY AGENCY2 BY COLLECTION BY L-ITEM.AGE BREAK-ON L-LIBNAME "TOTAL FOR 'V'" BREAK-ON T-COLL "TOTAL FOR 'V'" BREAK-ON L-ITEM.AGE "'V'" TOTAL COUNTER HEADING "LIBRARY CATALOGUE STOCK AS OF 'TL'" (CDIP
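
        For anyone without a Recall prompt to hand, the BREAK-ON/TOTAL logic in that SORT boils down to counting holdings grouped by library, collection and age band. A rough Python sketch of the same tabulation, assuming the holdings have already been extracted as (library, collection, age_band) tuples (the per-break subtotal formatting is left out):

        ```python
        from collections import Counter

        def collections_report(holdings):
            """Count holdings grouped by (library, collection, age band).

            Roughly what SORT HOLDINGS ... BREAK-ON ... TOTAL COUNTER
            produces, minus the per-library and per-collection subtotal
            lines the Recall report prints at each break.
            """
            counts = Counter(holdings)
            # SORT ... BY AGENCY2 BY COLLECTION BY L-ITEM.AGE: emit in key order.
            return sorted(counts.items())
        ```

        Each returned item pairs a (library, collection, age band) key with the number of holdings, which is the figure the COUNTER totals give at each break.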
        
         
        Enter DICT NAME : L-ITEM.AGE
        1  FIELD NAME         1/ YEARS
                              2/ OLD
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        25
        5  CONVERSION         1/ MCT
        6  CORRELATIVE        1/ A; IF N(L-YEARS.OWNED2) < "5" THEN "LESS THAN FIVE YEARS OLD" ELSE IF N(L-YEARS.OWNED2) > "10" THEN "MORE THAN TEN YEARS OLD" ELSE IF N(L-YEARS.OWNED2) = "" THEN "MORE THAN TEN YEARS OLD" ELSE "5 TO 10 YEARS OLD"
        
        
        Enter DICT NAME : L-YEARS.OWNED2
        1  FIELD NAME         1/ YEARS OWNED
        2  FIELD NUMBER          0
        3  JUSTIFICATION         R
        4  DISPLAY LENGTH        10
        5  CONVERSION         1/
        6  CORRELATIVE        1/ A;(D-N(DATE.ADDED))/"365"
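
        Translating those two correlatives out of Recall-speak: L-YEARS.OWNED2 divides the days since DATE.ADDED by 365, and L-ITEM.AGE turns the result into one of three age bands, with items lacking a date treated as over ten years old. A minimal Python equivalent, assuming the dates arrive as datetime.date values (or None where DATE.ADDED is empty):

        ```python
        from datetime import date

        def years_owned(date_added, today):
            """Whole years since the item was added, as in L-YEARS.OWNED2:
            (today - date added) / 365."""
            return (today - date_added).days // 365

        def item_age_band(date_added, today):
            """Reproduce the L-ITEM.AGE correlative: band items into under
            five years, five to ten years, or over ten years old. Items
            with no date added fall into the over-ten band, as in the
            original IF/THEN/ELSE chain."""
            if date_added is None:
                return "MORE THAN TEN YEARS OLD"
            years = years_owned(date_added, today)
            if years < 5:
                return "LESS THAN FIVE YEARS OLD"
            if years > 10:
                return "MORE THAN TEN YEARS OLD"
            return "5 TO 10 YEARS OLD"
        ```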
        
         
        
             STAT.MGR.DUMP.REPORT
        0001 PA Saved at 16:29:48 30 APR 2012 by steveh
        0002 SELECT STAT.MGR WITH L-PERIOD="DAY"
        0003 SELECT STAT.MGR WITH L-KEY NOT "[.OV"
        0004 SELECT STAT.MGR WITH L-KEY NOT "[.PO.]"
        0005 SORT STAT.MGR BY L-DATE BY L-CODE.TRANSLATE BY L-LIBRARY L-DATE L-CODE.TRANSLATE L-LIBRARY L-TOTAL (CHIP
        
        
             STAT.MGR.MONTH.DUMP
        0001 PA Saved at 10:55:50 10 MAY 2012 by steveh
        0002 SETPTR ,500,5000,,,3
        0003 SELECT STAT.MGR WITH L-PERIOD="MONTH"
        0004 SELECT STAT.MGR WITH L-ELEMENT2 GT "3000"
        0005 SORT STAT.MGR BY L-PERIOD BY L-DATE BY L-CODE.TRANSLATE BY L-LIBRARY L-PERIOD L-DATE L-CODE.TRANSLATE L-LIBRARY L-TOTAL HEADING "Monthly issue statistics from Dynix as of 'TL'" (NIP
         
        
        Enter DICT NAME : L-PERIOD
        1  FIELD NAME         1/ PERIOD
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        10
        5  CONVERSION         1/
        6  CORRELATIVE        1/ A;IF N(L-ELEMENT1)="M" THEN "MONTH" ELSE IF N(L-ELEMENT1)="D" THEN "DAY" ELSE ""
        
        Enter DICT NAME : L-ELEMENT1
        1  FIELD NAME         1/ FIRST BIT
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        10
        5  CONVERSION         1/
        6  CORRELATIVE        1/ G.1
         
        Enter DICT NAME : L-KEY
        1  FIELD NAME         1/ KEY
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        80
        5  CONVERSION         1/
        6  CORRELATIVE        1/
         
        Enter DICT NAME : L-DATE
        1  FIELD NAME         1/ DATE
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        10
        5  CONVERSION         1/ D2
        6  CORRELATIVE        1/ G1.1
        
        Enter DICT NAME : L-CODE.TRANSLATE
        1  FIELD NAME         1/ TRANSLATED CODE
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        50
        5  CONVERSION         1/ MCT
        6  CORRELATIVE        1/ G3.2
                              2/ TCODES;X;;1
        
        Enter DICT NAME : L-LIBRARY
        1  FIELD NAME         1/ LIBRARY
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        35
        5  CONVERSION         1/
        6  CORRELATIVE        1/ A;"EX.";N(L-AGENCY);:
                              2/ TCODES;X;;1
        
        Enter DICT NAME : L-AGENCY
        1  FIELD NAME         1/ AGENCY
        2  FIELD NUMBER          0
        3  JUSTIFICATION         L
        4  DISPLAY LENGTH        25
        5  CONVERSION         1/
        6  CORRELATIVE        1/ G2.1
        
        Enter DICT NAME : L-TOTAL
        1  FIELD NAME         1/ TOTAL
        2  FIELD NUMBER          0
        3  JUSTIFICATION         R
        4  DISPLAY LENGTH        6
        5  CONVERSION         1/
        6  CORRELATIVE        1/ F;2;S
         
        Enter DICT NAME : L-ELEMENT2
        1  FIELD NAME         1/ SECOND BIT
        2  FIELD NUMBER          0
        3  JUSTIFICATION         R
        4  DISPLAY LENGTH        10
        5  CONVERSION         1/
        6  CORRELATIVE        1/ G1.1
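
        Once these paragraphs have dumped the statistics as text, the columns still need reshaping before they're useful in a spreadsheet. This hypothetical Python sketch splits each line on fixed-width columns and writes CSV; the widths are taken from the DISPLAY LENGTH of each dictionary item above, with an assumed single-space separator, so check them against a real dump before trusting it:

        ```python
        import csv
        import io

        # Column widths follow the DISPLAY LENGTH of each dictionary item,
        # plus an assumed single space between columns -- verify against
        # the actual dump layout before relying on this.
        FIELDS = [("date", 10), ("code", 50), ("library", 35), ("total", 6)]

        def parse_dump_line(line):
            """Split one fixed-width line of the STAT.MGR dump into a dict."""
            row, pos = {}, 0
            for name, width in FIELDS:
                row[name] = line[pos:pos + width].strip()
                pos += width + 1  # skip the separating space
            return row

        def dump_to_csv(dump_text):
            """Convert the whole text dump to CSV, skipping blank lines."""
            out = io.StringIO()
            writer = csv.DictWriter(out, fieldnames=[name for name, _ in FIELDS])
            writer.writeheader()
            for line in dump_text.splitlines():
                if line.strip():
                    writer.writerow(parse_dump_line(line))
            return out.getvalue()
        ```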
         
        

        Friday, 27 July 2012

        Relief

        Well, jigger me: we did it. We went live with Spydus today with scarcely any incidents save senior Library Service managers emailing the Spydus consultant telling him that he should change some of the wording on the OPAC menus.

        Far too tired and relieved to find the energy to lob a brick at the offending party for breaking the lines of project communication and having an odd sense of priority.

        Friday, 8 June 2012

        New OPAC

        Good news: we've got the TEST versions of the OPAC and the resource discovery module working, and they look quite nice.

        Bad news: we've had to postpone the training for the OPAC as we need to incorporate the change in the corporate branding that should be coming on stream this summer. Essentially, we'll be going live at the end of July with an OPAC pretty much out of the box. Perhaps the only "radical" novelty we'll be delivering at this stage is the range of subcatalogues we'll be presenting:
        • The "vanilla" catalogue
        • Children's Library
        • Local Studies
        • The Co-operative Collection - very much an unsung resource, especially in this International Year of Co-operation
        • The Maskew Collection - a special collection of English literature and philosophy funded by the bequest of a local lady
        At the moment we're only doing this by imposing filters on the library catalogue, not doing anything in the way of additional information and canned searches. This is frustrating but, I guess, unavoidable; we'll just have to do the best we can. I'm looking forward to getting the training as there's quite a lot of possibility lurking in these two customer interfaces, particularly Sorcer, which has a lot of scope for creating personalised learning/reading environments.

        Friday, 18 May 2012

        Training days

        It's daft really: the hard slog of the past few months has felt like a phoney war, despite the fact that there's been a lot of work and a good number of real successes along the way.
        • The test data load from Dynix to Spydus has gone surprisingly well, with scarcely any glitches (I'm still waiting for the first shoe to drop, let alone the second).
        • The MARC mapping of the Dynix data seems to be good enough to do the job of converting to MARC21 in Spydus. Hats off to Anne Whiteley, who did the original mapping back in 1990 armed only with common sense and a manual she'd borrowed as a Regional Loan. My contribution to the cause has been a rank lack of common sense and whatever I can crib off the good folks on th'interweb.
        • We've got a good project plan and we're making the Library Service work to it. The good news is that nearly everyone involved is relieved to have some sort of structure to hang onto when things start getting giddy.
        Next week we start two weeks' worth of "subject expert" training, the hardest part of which is convincing some people that they really are the local experts in their field.

        Once that's done I've got a couple of weeks in which to do as much configuration as possible before we start training the trainers. If we can get as much of the configuration as possible done during the subject expert training we'll have a fighting chance of hitting our end of July deadline.

        We're hoping that two decisions will help us a lot. The first is that I won't be delivering the training myself, which is a blow personally (it's one of the things I seriously enjoy doing), but it's the only way I'll have the time to work on the outstanding technical details, including the interfaces we need with other systems such as smartsm, the Local Land & Property Gazetteer and the corporate finance systems. It'll also give me space and time to respond to any issues or ideas arising from the training sessions being delivered throughout June.

        Which brings me to the second decision: we've selected a group of trainers who'll act as champions and first line of support within the library service. These are all Library Service staff, some managers but mostly front-line, and all volunteers. In fact, twice as many people volunteered to be trainers as we needed, which is a bit gratifying. They'll have three pretty intensive days in which to get to grips with the new Circulation system and put together a package that can be delivered as a one-day training session for front-line staff, which starts the following week.

        Tight deadlines; some hard decisions to be made about what to leave out of the training packages; and it would all be impossible if we weren't documenting what we're doing and communicating in minutes and hours, not days and weeks. I'm lucky: the Data Hub has a couple of Project Assistants and one of them will be working with me on this project. That should mean we can get some real-time communication between the training sessions and myself, so that we can address issues on the fly and record changes as we go along. He's only been with us a week so he's walking in completely fresh with no preconceptions or baggage, poor devil!