
Mother and Baby Homes – records’ accessibility through the lens of Australia’s ‘Find & Connect’


In 2018, Ireland’s Minister for Children and Youth Affairs, Katherine Zappone, announced that a ‘transitional justice’ approach should be developed to examine Ireland’s history of institutional care and to consider the State’s response to that legacy through ‘a truth recovery or truth telling process with victims and survivors at its core’. In reference to the Commission of Investigation into Mother and Baby Homes, the Minister noted that neglecting to do so would mean a failure ‘…in our shared moral and ethical responsibility for the past. It would also leave us with an incomplete historical record’. While the Commission’s terms of reference explicitly recognise an archival need to preserve this documentary legacy ‘…for the purpose of further historical research or examination’, the question of records’ accessibility for former residents and their families is not fully addressed.

On 5 December 2018, the Minister published the Commission’s Third Interim Report and announced that delivery of the Commission’s final report had been extended to February 2019 to accommodate analysis of the ‘vast range of documentary material relating to the institutions under investigation’. This includes the collation of records held by public and diocesan archives and those accessed through orders for discovery made on religious organisations and state authorities. On 9 January 2019, The Irish Times reported that the Commission is now seeking an extension of one year before publishing its final report. Concerns have recently been raised that the Commission’s possession of records relevant to the investigation is impacting on the public’s rights of access. That the ‘effective operation of the work of the commission and the co-operation of witnesses’ should be used as a basis for refusing an access request stands in stark contrast to the approaches taken in other jurisdictions dealing with similar issues.

In 2007, the Scottish Parliament published the Historical Abuse Systemic Review of Residential Schools and Children’s Homes in Scotland 1950 to 1995 (the Shaw Report), which identified problems for survivors attempting to trace records and expressed a wider concern over recordkeeping difficulties throughout the Scottish public sector. The Report’s recommendation that ‘the government should commission a review of public records legislation which should lead to new legislation being drafted to meet records and information needs in Scotland’ resulted in the Public Records (Scotland) Act 2011. The Act places a duty on the Keeper of the Records of Scotland to develop and publish a model records management plan (RMP), with stakeholder engagement seen as critical to the development of the plan and to helping the Act function correctly.

In Australia, the response has been similarly robust. Following the national apology to Forgotten Australians and Former Child Migrants in 2009, the Australian government initiated the development of the Find & Connect resource, a specialised records search and support service for former care leavers and their families. The resource developed its remit from the ‘trilogy’ of reports into the Stolen Generations (indigenous children removed from their family and community), Former Child Migrants (children shipped from the UK and Malta to countries including Australia) and Forgotten Australians (non-indigenous children who experienced institutional care) published between 1997 and 2004. The approach taken is summarised in the project’s submission to the Australian Royal Commission into Institutional Responses to Child Sexual Abuse:

‘Decisions about records management (most importantly, decisions about releasing information in records) need to move away from questions of ownership and consideration of organisational risk, and instead adopt a rights-based framework’

In collaboration with the eScholarship Resource Centre at the University of Melbourne and the Australian Catholic University, Find & Connect worked with archivists, historians, social workers and technology staff to engage with people who had experienced institutional care as well as advocacy groups, support services, care providers, record-holders and government departments.

In response to the National Apology to Forgotten Australians and Former Child Migrants, the Australian Government’s Department of Social Services developed the following best practice guidelines for providing access to records:

  • the right of every care leaver, upon proof of identity, to view all information relating to himself or herself and to receive a full copy of the same;
  • the right of every care leaver to undertake records searches and to be provided with records, and copies of records, free of charge; and
  • a commitment to the flexible and compassionate interpretation of privacy legislation to allow a care leaver to identify their family and background.

The guidelines encourage record-holders to implement records access practices in a nationally consistent manner. In this context, Find & Connect acts as a support resource for record-holders and users to assist with the search for records and with other critical information held by past provider organisations and government agencies. The initiative is framed around aggregating existing records’ descriptions put into the public domain and augmenting them through:

  • comprehensive indexing and search capability to enhance accessibility;
  • supporting documentation such as organisational histories and media reports to help enhance the context of the records;
  • resources and guides for record-holders and users to assist with the search for records; and
  • awareness raising and training for record-holding staff.

To meet the needs of target audiences and to address many of the challenges associated with publishing a public knowledge resource, Find & Connect was developed to accommodate the following service design principles (a rough illustrative sketch follows the list):

  • a standards-based content management system founded on international standards for recordkeeping and archival description;
  • an evidential framework allowing presented content to be verified or disputed by users;
  • support of persistence and meaning through time by presenting contextual information around the record with the flexibility to accommodate the development of themed stories and separate publications;
  • multiple access and reference points through the use of unique and persistent identification of resources;
  • the presentation of ‘contextually meaningful, historically valid and evidentially defensible’ resources;
  • enhanced discoverability through computer-to-computer data sharing;
  • a commitment to the sharing of knowledge already in the public domain; and
  • an intelligible interface that is useful, welcoming and respectful.
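
To make those principles a little more concrete, here is a minimal sketch (not the actual Find & Connect data model, which I have not seen) of how an aggregated entry might tie a persistent identifier to contextual relationships and to the catalogue pages of the record-holders; every field name, identifier and URL below is invented for illustration.

```python
# Illustrative sketch only: a simplified model of a knowledge-resource entry with a
# persistent identifier, contextual links and pointers back to record-holders' own
# catalogues. Field names, identifiers and URLs are invented for illustration.
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class Entry:
    pid: str                     # persistent, citable identifier for the entry
    entity_type: str             # e.g. "Organisation", "Archival Series", "Publication"
    title: str
    description: str
    related: list[str] = field(default_factory=list)  # PIDs of related entries (context)
    sources: list[str] = field(default_factory=list)  # catalogue pages held by record-holders


def resolve(pid: str, index: dict[str, Entry]) -> Entry:
    """Every citation of a PID resolves to a single entry: one stable reference point."""
    return index[pid]


# A home, the admission registers it created, and the holding archive's catalogue page.
home = Entry("E000123", "Organisation", "St Example's Home",
             "Residential institution, 1922-1971.", related=["E000456"])
registers = Entry("E000456", "Archival Series", "Admission registers",
                  "Registers created by St Example's Home.", related=["E000123"],
                  sources=["https://archive.example.org/catalogue/registers-123"])

index = {entry.pid: entry for entry in (home, registers)}
for pid in resolve("E000123", index).related:
    linked = resolve(pid, index)
    print(f"{home.title} -> {linked.title}: {linked.sources or 'contact record-holder'}")
```

The point is simply that context and provenance travel with the published description, while the records themselves stay with the record-holder.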

Find & Connect operates as a dissemination service for information provided by record-holding organisations, not as a records repository or archive. Records that are part of an archive or a library collection are linked to a catalogue page that includes the resource and relevant metadata. The Human Rights Act 2003, the recommendations of the Ryan Report, the Commission’s terms of reference and the Minister’s announcement of a ‘transitional justice’ approach to dealing with the legacy of institutional care in Ireland present an opportunity to assess the development of a rights-based framework that recognises records’ accessibility as a key component of efforts to redress the wrongs of the past. While solutions to the issues of accessibility to the records of the Mother and Baby Homes are still in development, the Find & Connect project represents an approach worth considering on a number of fronts:

  • By focusing on records that are already in the public domain and offering guidance and support, the project was able to deliver positive impacts for both record-holding institutions and the public. Preliminary feedback has identified significant improvements in discoverability and accessibility of records, the location of previously unidentified records and improved recordkeeping efficiency within organisations.
  • The project worked with a wide variety of stakeholders including national and state archives and the record-holders of former care homes. A similar approach could be developed in conjunction with the National Archives of Ireland and bodies such as the Association of Church Archives of Ireland to develop accessibility guides and share finding aids and record-holding metadata.

Addressing the legacy of Ireland’s Mother and Baby Homes in this way would not only complement the preservation of historical evidence; it would also recognise the paramount requirement of records’ accessibility for former residents and their families that should underpin any efforts by the State toward redress.

Open Access and Irish research funding in the ‘age of austerity’

Posted on 18 July 2009

Everyone knew this was coming but it was still a shock to see the swingeing recommendations in print. It would be futile to speculate on what the organisational topography of Irish third-level education will look like this time next year but I think it’s safe to assume that streamlining and rationalisation are going to profoundly affect how research is funded by the State. Relying on bibliometric analysis as a primary yardstick of research quality meets with cold dismissal:

‘The largest verifiable output to date appears to be the publication of articles as opposed to more concrete measures of economic returns.’

I’m not sure what these concrete measures are but the emphasis in the report seems to be toward applied research. This is a dangerous supposition, infringing on the concept of academic freedom and dismissing the nursery of real innovation. One big gun is already absorbing the extent of this challenge so I won’t dwell on this here. It does seem safe to assume that the agencies delivering funding to research under the Strategy for Science, Technology and Innovation 2007–2013 will be reconstituted in some way. Against this background of fundamental change, I hope that the enlightened policies favouring Open Access don’t wither on the vine. Science Foundation Ireland, IRCSET and the Health Research Board have been recommended for amalgamation into whatever single funding agency emerges. The proposed dissolution of the Higher Education Authority and the absorption of that body’s functions into the Department of Education and Science could just be an exercise in ball-hopping; nonetheless this too may come to pass. Clearly, whatever agency emerges will need to re-engage with Open Access. Hopefully familiar faces and old allies will re-emerge in the new body, making advocacy a less daunting task than starting from scratch.

All this flux and fear bookends a fairly successful repository summer. I was glad to have my instincts confirmed by Stuart Shieber that funder policies, rather than institutional mandates, can be more effective in leveraging Open Access. Using the stated policies of the agencies described above I was able to convince a number of high profile researchers to avail of the benefits of the repository and allow deposit of their current published research. The Open Access service that will aggregate Irish university research outputs (publications, theses etc.) is also under development and looking at an end-of-year delivery date. This should be a core component of an all-Ireland research support infrastructure. This project is currently funded by the Irish Universities Association and the Strategic Innovation Fund, which has established institutional repositories in all Irish universities and gone some way to keeping a roof over my head since 2005. The McCarthy Report has recommended that the fund be abolished, citing the slow drawdown of available funds as the primary reason. I found this particularly irritating because, if true, it represents a failure to capitalise on available funding when times were good. Using this fund to build the type of infrastructure needed to support teaching and research in Irish third-level education would have been an exercise in the type of rationalisation strongly recommended in the report.

Assessing research excellence in the Humanities – an Australian approach

Posted on 11 April 2010

When asked to explain what a librarian does I’ve sometimes come back with the reply that we make lists. Lists that will facilitate ease of access to the collections we curate and preserve. Other colleagues may balk at what might seem a blunt summation of our chief function and point to user education, or collection development and preservation, as more illustrative of our mission. Yet behind all these functions lies the goal of access to the stuff we gather and preserve, and to do that we make inventories of what we hold. The methods and standards we deploy to create lists are defined by the basic job skill of cataloguing, regardless of whether it’s metadata creation or scholarly bibliography.


The list also has another function beyond being just a primary facilitator of access. Any exercise in collection development would be lost without it. You need to know what you have before you decide what’s needed. It is therefore no surprise that an exercise like the Australian Research Council’s ERA initiative, with a list of 20,000+ ranked journals, would draw some notice. The initiative seeks to examine where Australian researchers are publishing and give a ranking to the journals they choose to publish in. In effect, it is a variation on the contentious Journal Impact Factor specific to Australian published research, with metrics based on citation analysis (via Scopus) and post-publication peer review (via a three-stage process between Research Evaluation Committee (REC) members and peer reviewers). The sustainability of a secondary ERA-type peer-review process may be tricky to negotiate; nonetheless it establishes precedent and more efficient models may emerge.


The objectives of the exercise are to:
1. Establish an evaluation framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australia’s institutions;

2. Provide a national stocktake of discipline-level areas of research strength and areas where there is opportunity for development in Australia’s higher education institutions;

3. Identify excellence across the full spectrum of research performance;

4. Identify emerging research areas and opportunities for further development; and

5. Allow for comparisons of Australia’s research nationally and internationally for all discipline areas.
The project identifies eight research clusters, with specific disciplines identified by Field of Research (FoR) code.


The initiative deserves a commendation for bravery in choosing Humanities and Creative Arts (HCA) along with Physical, Chemical and Earth Sciences (PCE) as the trial research clusters. The latter could draw on existing citation analyses not suitable or indeed available to the former. Australian higher-education institutions provided data to ERA via SEER, an interface developed on top of the existing Australian institutional repository infrastructure.

Looking at the humanities in isolation, if we take the monograph as the ‘gold standard’ of humanities research, a search in Australian Research Online (an aggregation of open scholarly research outputs across Australia) returns 421 items of type ‘book’ in the date range 2008–2010. Within these results, seven are listed under the subject heading ‘arts’ (sadly none of the records have a link to an open copy). I’m assuming that these items were submitted to ERA and that a hard copy of the monograph was examined as part of the peer-review analysis. I say brave because I imagine there is as much scepticism among the Humanities and Creative Arts research cluster in Australia as to the true object of the ERA exercise as there would be among its Irish counterpart.

By far the most important outcome of this process is that it has established an independent, national metric to try to uncover quality in research. Based on citation and independent peer-review analysis the trial established a journal assessment metric –

A* Typically an A* journal would be one of the best in its field or subfield in which to publish and would typically cover the entire field/subfield. Virtually all papers they publish will be of a very high quality. These are journals where most of the work is important (it will really shape the field) and where researchers boast about getting accepted. Acceptance rates would typically be low and the editorial board would be dominated by field leaders, including many from top institutions.

A The majority of papers in a Tier A journal will be of very high quality. Publishing in an A journal would enhance the author’s standing, showing they have real engagement with the global research community and that they have something to say about problems of some significance. Typical signs of an A journal are low acceptance rates and an editorial board which includes a reasonable fraction of well-known researchers from top institutions.

B Tier B covers journals with a solid, though not outstanding, reputation. Generally, in a Tier B journal, one would expect only a few papers of very high quality. They are often important outlets for the work of PhD students and early career researchers. Typical examples would be regional journals with high acceptance rates, and editorial boards that have few leading researchers from top international institutions.

C Tier C includes quality, peer reviewed, journals that do not meet the criteria of the higher tiers.

to support a five-point ERA Rating Scale.

5. The Unit of Evaluation profile is characterised by evidence of outstanding performance well above world standard presented by the suite of indicators used for evaluation.

4. The Unit of Evaluation profile is characterised by evidence of performance above world standard presented by the suite of indicators used for evaluation.

3. The Unit of Evaluation profile is characterised by evidence of average performance at world standard presented by the suite of indicators used for evaluation.

2. The Unit of Evaluation profile is characterised by evidence of performance below world standard presented by the suite of indicators used for evaluation.

1. The Unit of Evaluation profile is characterised by evidence of performance well below world standard presented by the suite of indicators used for evaluation.

NA. Not assessed due to low volume. The number of research outputs does not meet the volume threshold standard for evaluation in ERA.

This is an important and bold initiative in which input from Australian libraries comprises a vital component supporting the institutional stakeholders. It’s not pretty and it’s not without its critics but we have to take notice.


Searchable database of ERA ranked journals [ http://lamp.infosys.deakin.edu.au/era/index.php ]
Physical, Chemical and Earth Sciences (PCE) and Humanities and Creative Arts (HCA) Clusters Evaluation Guidelines for the 2009 ERA Trial (PDF) [ http://www.arc.gov.au/pdf/ERA_Eval_Guide.pdf ]
Physical, Chemical and Earth Sciences (PCE) and Humanities and Creative Arts (HCA) Clusters ERA Indicator Benchmark Methodology (PDF) [ http://www.arc.gov.au/pdf/ERA_Indicator_Bench.pdf ]

Implementing the UK Open Access policy: The embargoes for Green

Posted on 3 March 2013

The positive achievement of the UK in positioning Open Access front and center of the debate around the future of academic publishing cannot be denied. However, defining a clear path toward policy implementation has been less successful. Here’s why:

Embargoes for Green

Anyone who has been tracking the rapid transition from the recommendations of the Finch Group to the emergence of RCUK’s policy must admit that the horse-trading around OA embargoes caused considerable confusion. The House of Lords Science and Technology Committee report into the policy published on 22 February 2013 produced this graphic to highlight how it should work.

The thing is, it was the first time most people had seen it. Was this the policy tweak we were told would emerge at the end of February? Perhaps. While David Willetts’s position on Green OA verges on the politically hostile, RCUK and HEFCE have tried to hold a more pragmatic line, albeit one which the former appeared at pains to avoid stating plainly. Put simply: although the policy has a preference for Gold, funding is limited and Green will meet the shortfall.

Before the decision tree came to light, the position on embargoes was as follows:

Ideally, a research paper should become Open Access as soon as it is published on-line. However, the Research Councils recognise that embargo periods are currently used by some journals with business models which depend on generating revenue through subscriptions. Therefore, where a publisher does not offer a ‘pay-to-publish’ option the Research Councils will accept a delay between on-line publication and a paper becoming Open Access of no more than six months, except in the case of research papers arising from research funded by the AHRC and the ESRC. Because current funding arrangements make a six month embargo period particularly difficult in the arts, humanities and social sciences, the Research Councils will accept a delay of up to twelve months in the case of research papers arising from research funded wholly or in part by the AHRC and/or the ESRC. However, this is only a transitional arrangement, for a period of five years, and both the AHRC and ESRC are working towards enabling a maximum embargo period of six months for all research papers.

In August 2012, the Publishers Association released a position statement on RCUK policy which contains our now familiar decision tree. We can assume that in the following six months there was considerable lobbying by the PA to get BIS and RCUK to clarify their position, but if you look closely this is a bit more than a simple policy tweak on the time-scales of embargoes. Where a publisher doesn’t offer a paid APC option for a particular journal, the author will be compliant with RCUK’s OA policy if the author’s final draft, post peer-review, is deposited in a repository and released from embargo within 6 to 12 months depending on discipline. Where the publisher DOES offer a paid APC option but there is no money to cover the APC, the embargo gets expanded to 12-24 months.
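
As I read it, the decision logic boils down to something like the sketch below; the function, its labels and the assumption that the longer end of each embargo range applies to AHRC/ESRC-funded work are mine, not RCUK’s.

```python
# Rough sketch of the RCUK embargo decision tree as described above. Labels and the
# discipline split within the 12-24 month range are my own reading, not official policy.
def oa_route(offers_paid_apc: bool, apc_funds_available: bool, ahrc_or_esrc: bool) -> str:
    if offers_paid_apc and apc_funds_available:
        return "Gold: APC paid, paper is Open Access in the journal on publication"
    if not offers_paid_apc:
        # Original position: Green with a 6 month embargo, or 12 months for AHRC/ESRC work.
        months = 12 if ahrc_or_esrc else 6
    else:
        # APC option exists but no funds to pay it: the embargo stretches to 12-24 months.
        months = 24 if ahrc_or_esrc else 12
    return f"Green: deposit the author's final draft, embargo up to {months} months"


print(oa_route(offers_paid_apc=True, apc_funds_available=False, ahrc_or_esrc=True))
```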

Therefore publishers can impose an embargo of 12-24 months on the 55% of published research in year one of the RCUK policy. That’s quite a roll-back on the original position and to whose benefit?

Emerging controls in Elsevier’s scholarly communications ecology

Posted on 4 August 2017


As I’ve written about Elsevier’s service development strategy in the distant past, I’ve decided to shame myself into making some fresh comment and help kick the tires of a very neglected DarkRepository.

Elsevier’s ongoing acquisitions programme opened another chapter on Monday with news that it had acquired bepress (formerly the Berkeley Electronic Press), an academic software firm producing products and services to support scholarly communication. According to the press release, goodness will flow through integration with ‘Elsevier’s suite of research products, such as Scopus, Pure, SSRN and SciVal [and] will enhance the breadth and quality of the reach, promotion and impact services bepress delivers to its customers.’

The bepress suite of products includes Digital Commons, characterised as ‘an institutional repository, a comprehensive publishing platform, and a fully integrated research and impact suite [Expert Gallery Suite] for experts at your institution’. On the academic publishers’ blog The Scholarly Kitchen, Roger C. Schonfeld has described the deal as making Elsevier ‘…a major if not the foremost single player in the institutional repository landscape’. I can’t say if the hyperbole was intended but Schonfeld does reference a UKCoRR discussion from 2015 indicating what the trajectory could mean for research support infrastructure in UK HE.

When it comes to any disruption to its business strategies, Elsevier has been masterful at controlling the language to control the debate. It also has deep pockets from decades of market dominance in academic publishing, a shameless corporate ethos when it comes to political lobbying and a reputation for aggressively defending its acquisition of copyright to the published outputs of predominantly publicly funded research.

There can be little argument that acquiring bepress is further evidence of a strategy to complement rather than supersede its established publication services. With SciVal/Scopus and Plum (not a dessert) it also has a suite of data-driven ‘metric and impact’ indicators supporting national and global assessments of research output and university quality. The attraction of being inside this tent should not be underestimated and it’s the Research Office, not the Library, that will be more likely to make the call.

So what exactly does Elsevier hope to gain from the deal?

Elsevier’s acquisition of the PURE Current Research Information System (CRIS) in 2013 came at a time when the main driver for research support infrastructure development in UK HE was REF2014. As an institutional repository manager, I worked with the platform between 2012 and 2014. PURE does a lot of things brilliantly, particularly research output data management and reporting, but its built-in repository service is dismal. So dismal that I became convinced that the development of basic repository interoperability functions to share or expose content to other services had been parked while Elsevier/SciVal designed its model to accommodate the development of a locked-in scholarly communications ecology. In light of Elsevier’s ongoing service acquisition strategy I don’t believe I was too far off the mark.

With the bepress/Digital Commons acquisition does Elsevier have a piece of the jigsaw to develop a fit-for-purpose institutional repository service integrated with PURE? Perhaps, but only to a point. Working with the Digital Commons platform to extract metadata for harvesting to RIAN.ie (Ireland’s national research portal) was, to say the least, challenging but ultimately successful. That said, I don’t believe that facilitating content aggregation will be a service development priority for Elsevier unless that aggregation is an outcome incorporating controls to manage it as an Elsevier business process.
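
For anyone wondering what that plumbing involves, the sketch below shows a bare-bones OAI-PMH harvest of Dublin Core records, the kind of request an aggregator like RIAN issues against a repository; the endpoint URL is a placeholder, and resumption tokens, sets and error handling are omitted.

```python
# Bare-bones OAI-PMH harvest: fetch one page of Dublin Core records and print the titles.
# The base URL is a placeholder; a real harvest would also follow resumptionTokens.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"


def harvest_titles(base_url: str) -> list:
    query = urllib.parse.urlencode({"verb": "ListRecords", "metadataPrefix": "oai_dc"})
    with urllib.request.urlopen(f"{base_url}?{query}") as response:
        tree = ET.parse(response)
    titles = []
    for record in tree.iter(f"{OAI}record"):
        title = record.find(f".//{DC}title")
        if title is not None and title.text:
            titles.append(title.text.strip())
    return titles


if __name__ == "__main__":
    # Substitute a repository's real OAI-PMH base URL for this placeholder.
    for title in harvest_titles("https://repository.example.ie/oai/request"):
        print(title)
```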

The May 2016 acquisition of the Social Science Research Network (SSRN) was a far bolder move by Elsevier to dominate and dictate the pace of change in academic publishing on its own terms. The service allows the open dissemination of ‘working papers’ thereby supporting a pre-publication declaration of research interest related to a particular topic. It has been operational since 1994 and is widely used across the social sciences. Working papers appearing in SSRN can be considered as early drafts which may subsequently appear as published and peer-reviewed papers in academic journals. While this acquisition has obvious cost-benefits it has the potential to increase control over the decisions authors make regarding the transfer of copyright and licensing early in the publication lifecycle. Don’t be surprised if these controls are extended into Elsevier’s forthcoming ‘institutional repository’ service.

Elsevier’s slice of Big Data pie

Posted on 10 April 2013

One of the more illuminating points made during the Twitter storm that followed Tuesday’s announcement of the Elsevier Mendeley takeover came from @researchremix [ Heather Piwowar ]. To recap: Elsevier (academic publisher) acquired Mendeley (a reference management platform) for up to £65m. Mendeley is a collaboration tool with Open Access (publications and data, as well as the ‘lab notebook’ that facilitates scientific discourse and investigation) at its heart. It allows researchers to share and annotate papers, build bibliographies and create alternative citation metrics. It also has an institutional edition which can be used to manage research information (‘…combining a next generation collaboration platform for the individual researcher with a real time analytics tools for the library’ according to the corporate video). The ‘oodles of workflow data’ that Elsevier have acquired include data relevant to where researchers are publishing and what they are reading. Under-commented at the time, but nonetheless significant in this context, was Elsevier’s acquisition of the Danish company Atira in October 2012. Atira are the development team behind PURE, a Current Research Information System (CRIS) with an optional Institutional Repository service. (A CRIS is a research management system drawing on institutional data to create relationships between researchers and research publications, projects, project funding etc.) The platform has been deployed in 19 UK universities, primarily to support reporting and submission for the REF2014 exercise.

Both acquisitions represent a new focus in Elsevier’s corporate strategy, built around the data generated by institutional research management platforms and those channels of scholarly communication that utilise open content and social media. At its core is a concession that opportunities are emerging for new data-driven business models. Cash-rich after sustaining year-on-year profits of around 33% for the past decade, it can well afford to take a punt.

The core of Elsevier’s business remains an academic publishing model buttressed through rights acquisition and library subscriptions. Looking past the corporate glad-handing that has followed the Mendeley deal, it’s worth bearing in mind the ferocious lobbying undertaken by the company to protect this cash cow. Elsevier will continue to talk out of both sides of its mouth about Open Access while it develops a tangible corporate strategy that guarantees sustained profitability for these data-driven services. Its well-established bibliographic database SciVerse Scopus was already integrated with PURE prior to the acquisition and my feeling is that this integration will be developed further. Currently Scopus data populates PURE with both publication records and citation data, which can be delivered as open content through a publicly accessible research portal. The Institutional Repository acts as an optional service associating full-text, when available, with research records (publications, projects, etc.).

So far Elsevier has not made PURE a Scopus-only CRIS. Why would it? The API to the bibliographic database of its corporate rival Thomson Reuters (as well as PubMed and arXiv) remains available to PURE. This approach allows the institutional publication dataset to be enhanced by records imported from a variety of sources. Checked for accuracy (by the institutional library) and associated with open research outputs, they are then disseminated via the web. But the enhancement of this content doesn’t stop there.

The Gordian Knot of author disambiguation looks like it might now be cut by the ORCID project. Elsevier is a development partner. Authors register with the service and receive a unique identifier. They can then import their research output data (institutional affiliation, publication and patent references, grant information) and make these data open or closed. One of the real impediments to assessing research impact based on citation metrics is the variation of author names in the literature (John Doe, John F. Doe, Doe, John Frederick, etc.). Accurately associating these variations with a specific author will give any commercially available bibliographic database a competitive edge. ORCID’s close integration with the Scopus database allows authors to import their publication references, provided their existing Scopus ID is associated with their ORCID identifier. In PURE, authors can now include their ORCID identifier when creating reference data. If they choose to import publication records from Scopus into PURE, the ORCID identifier can be used to make an accurate association between the imported record and specific institutional authors.
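
A toy sketch of what that matching step could look like inside a CRIS: imported records are linked to institutional authors on the ORCID identifier rather than on name strings. The record structure, identifiers and staff IDs below are invented for illustration, not PURE’s actual data model.

```python
# Illustrative only: associate imported publication records with institutional authors
# by matching on ORCID identifiers instead of ambiguous name strings.
institutional_authors = {
    # ORCID identifier -> local staff identifier (both values invented)
    "0000-0002-1825-0097": "staff-00431",
}

imported_records = [
    {"title": "A paper", "authors": [{"name": "Doe, John F.", "orcid": "0000-0002-1825-0097"}]},
    {"title": "Another paper", "authors": [{"name": "J. Doe", "orcid": None}]},
]

for record in imported_records:
    for author in record["authors"]:
        staff_id = institutional_authors.get(author["orcid"]) if author["orcid"] else None
        if staff_id:
            print(f"\"{record['title']}\" linked to {staff_id} via ORCID")
        else:
            # No ORCID on the imported record: falls back to checking by the library.
            print(f"\"{record['title']}\" ({author['name']}) needs manual disambiguation")
```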

This machine-driven data cleansing and enhancement relies on platform-to-platform interoperability. With the acquisition of PURE and Mendeley and the close integration of Scopus into ORCID, Elsevier have developed a locked-in, platform-specific workflow with real commercial potential. Open in terms of access and re-use. Open on Elsevier’s terms.

Irish Libraries and the Crisis in Scholarly Publishing: What’s the Big Deal?

Posted on 30 March 2013

It’s a rare thing indeed that a parliamentary question gets asked about academic library subscriptions but that’s exactly what happened in Dáil Éireann [principal house of the Irish parliament] on 3rd March 2013. Peter Mathews TD asked the Minister for Education and Skills to make a statement ‘…regarding electronic subscriptions for academic journals’. What follows are some personal observations.

The backdrop to this is of course Ireland’s five years of austerity and fiscal adjustment. Ireland’s university sector is almost entirely state funded and therefore subject to the same regime of cuts imposed throughout the public service. In previous years, university libraries had benefited from increased state investment into research allowing for an expansion of library resources to accommodate the developing ‘knowledge economy’. The state’s chief funding agency for scientific research, Science Foundation Ireland (SFI), was a major beneficiary of this €20 billion decade long public investment. This national spend on R&D was mostly sustained in the two years following the crash of 2008 but began to fall back from 2010. From 2005 SFI invested €35 million in university libraries to guarantee access to the corpus of research literature. The libraries had formed a consortium, IReL, to negotiate a national licence with academic publishers for all seven Irish universities.

Mise en scène

Part of any librarian’s mission is to build collections to support the research activities of their home institutions. The first disruption to this core function arrived when scholarly communications for the sciences were transformed by the huge investment of public funding that arrived into universities after World War II. The accompanying demand to publish would see the development of an almost completely privatised scholarly press. Press baron Robert Maxwell would exploit the potential in German academic publishing at war’s end to help establish Pergamon Press (now an Elsevier imprint). Collection development still involved close co-operation with researchers to service the demand for the latest communications but now there was so much more of it. For researchers, progress in an academic career became even more wedded to publication. Thus publish or perish began to stoke the serials crisis.

The second disruption began with the shift from print to digital during the 1990s. Publishers were able to bypass the library and deliver content directly to the desktops of their readers while the academic journal became more fragmented as individual papers could now be electronically transmitted and shared. The journal no longer had to occupy a physical space on the library shelf and collection development became an exercise in negotiating licensed access to remotely held content. Academic publishing had embarked on a frantic period of mergers and acquisitions allowing for the bundling of multiple electronic journals into subscription packages. If this consolidation allowed for an exponential increase in the availability of titles, it also saw library spending sky-rocket. While libraries reduced the number of their subscriptions by 6% between 1986 and 1999, they spent 170% more on titles. Bundling effectively killed off the quality control aspect of collection development.

Culling the Big Deal

This is the publishing environment Irish libraries now engage with. By negotiating access licences on a national level, the IReL consortium has allowed smaller Irish university libraries to have a range of electronic resources that matches those of the larger institutions. It’s a common model that allows libraries to support core research priorities by providing substantial access to the literature. Licences can be negotiated with publishers on a national level through library consortia or by individual institutions. All straightforward enough, but what happens when budgets are constrained and cuts to resources are on the agenda?

Any decisions around what to cull are complicated by the bundling of multiple titles from a particular publisher. In the same way you can’t unravel your cable TV package and choose just the channels you wish to watch, so it is with electronic journal bundling.

The more substantive challenge is how to assess what parts of the literature constitute essential resources. The crudest measure would be to examine which resources are the most heavily accessed via access logs or download counts. This can be refined by data from services such as the COUNTER initiative, which gathers usage statistics on online databases and journals. IReL is a library consortium member. (While I hope that all available efforts were made to gather full data on usage, I wonder why I’m unaware of services such as the SUSHI harvester, developed on common library protocols and COUNTER-compliant, being deployed in an Irish context.)
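
As a sketch of the kind of refinement I mean, suppose the usage figures have already been exported from a COUNTER-style report into a CSV file; the file name and column headings below are hypothetical placeholders, not the COUNTER specification.

```python
# Sketch: total annual full-text downloads per journal from a COUNTER-style usage export.
# The file name and column headings ("Journal", "Downloads") are hypothetical placeholders.
import csv
from collections import Counter


def usage_by_journal(path: str) -> Counter:
    totals = Counter()
    with open(path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            totals[row["Journal"]] += int(row["Downloads"])
    return totals


if __name__ == "__main__":
    for journal, downloads in usage_by_journal("usage_2012.csv").most_common(10):
        print(f"{journal}: {downloads}")
```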

Usage statistics should form one part of the picture but assessing quality by journal remains challenging. Despite fragmentation, publishers remain very protective of journal brand identity. The Journal Impact Factor is probably the best known metric used to assess journal quality and has been widely endorsed by publishers. Despite being downplayed by agencies tasked with designing national research assessment exercises, such as the Deutsche Forschungsgemeinschaft and the UK’s Research Excellence Framework, it is still commonly used as a yardstick of quality.

In 2008, University of California libraries adopted a new strategy for journal value assessment designed by the California Digital Library.

A key aspect of this new methodology is the use of a Weighted Value Algorithm to assess multiple vectors of value for each journal title under review.  Value is assessed in three overall categories:  Utility, Quality, and Cost Effectiveness.  For example, usage statistics contribute to a journal’s Utility score, impact factor contributes to its Quality score, while both cost per use and cost per impact factor contribute to its Cost Effectiveness score.  A composite score is then assigned to each journal to assess its overall value in comparison to other journals in the same broad subject category.  In addition to the weighted value algorithm, many other metrics are compiled and provided to campus librarians by CDL to ensure the richest possible set of information with which to make important selection decisions. 
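
By way of illustration only, a composite calculation of the shape described above might be sketched as follows; the weights, normalisation and input figures are mine, not the CDL’s.

```python
# Illustrative only: a weighted value calculation in the spirit of the CDL description
# above. The weights, normalisation and input figures are invented, not CDL's algorithm.
def normalise(values: dict) -> dict:
    """Scale raw figures to 0-1 within the comparison group (e.g. one subject category)."""
    top = max(values.values()) or 1.0
    return {title: value / top for title, value in values.items()}


journals = {
    # title: (annual downloads, impact factor, annual subscription cost)
    "Journal A": (5200, 2.1, 4300.0),
    "Journal B": (300, 0.8, 2900.0),
    "Journal C": (2100, 1.4, 1200.0),
}

utility = normalise({t: downloads for t, (downloads, _, _) in journals.items()})
quality = normalise({t: jif for t, (_, jif, _) in journals.items()})
# Cost effectiveness: cheaper per use and per unit of impact factor scores higher.
cost_eff = normalise({t: (downloads / cost) + (jif / cost)
                      for t, (downloads, jif, cost) in journals.items()})

WEIGHTS = {"utility": 0.4, "quality": 0.3, "cost_effectiveness": 0.3}  # illustrative only
composite = {t: WEIGHTS["utility"] * utility[t]
                + WEIGHTS["quality"] * quality[t]
                + WEIGHTS["cost_effectiveness"] * cost_eff[t]
             for t in journals}

for title, score in sorted(composite.items(), key=lambda item: -item[1]):
    print(f"{title}: {score:.2f}")
```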

The CDL approach appears to provide a well-engineered solution to quality assessment but doesn’t mention data relevant to where institutional authors choose to publish. When an Irish library consortium takes the decision to drop a journal subscription, should it not also consider whether that journal contains contributions from Irish academic authors and, if it is not renewed, how access to those Irish research papers is guaranteed?

Open Access: the third disruption

It is worth remembering that academic publishing is dominated by a handful of multinational enterprises. Academic authors write, review and edit for no direct remuneration. They also compromise their rights as authors through copyright transfer agreements or exclusive licensing arrangements. The libraries in their home institutions buy access to their outputs via journal subscriptions. Most of these activities, and the research they underpin, are funded through the public purse.

With the shift to digital, the disbinding of the journal has accelerated, affording a rethink of the entire academic publishing process. Scholarly networks can in theory deliver those essential activities of review and dissemination while bypassing the publisher middleman. One obvious benefit would be the removal of tolled-access barriers to impact. Academic publishers can respond to this new reality or find themselves touting a service platform that is surplus to requirements.

Some publishers have begun to reconstitute the journal to allow for open dissemination and licensing in ways which recalibrate or circumvent the subscription-based business model. Most have agreed a line of compromise whereby research institutions can collect and openly disseminate a version of the published paper by allowing the deposit of the author’s final draft manuscript, post peer-review, into an institutional or subject-based repository.

The UK has adopted an even more radical approach. From 1st April 2013, Research Councils UK will directly fund a proportion of the publications generated through their research grants to be made Open Access in the journal of publication. Many UK research libraries now manage a publication fund as well as an institutional repository. Both approaches (institutional or subject repository deposit and journal-side Open Access) are endorsed by research funder mandates and in some cases by institutional publication policies similar to Trinity College Dublin’s.

Champions of the current RCUK preference for paid journal-side Open Access over repository deposit can claim that this will eventually lead to the dismantling of the library subscription model and the ‘big deal’ bundle. Yet even the most optimistic admit that the current UK approach simply supports a ‘period of transition’. This is reflected in the policy through support for a hybrid publishing model. Library subscriptions will continue while, subject to the availability of funding, the journal will offer authors a paid option for journal-side Open Access. Institutions will pay twice as some publishers transform their journals to a business model sustained by direct publication payments. Critics point out that this ‘double dipping’ by publishers provides no guarantee that it will effect a universal transition away from the subscription model. UK authors may have access to limited publication funding but their international colleagues and research collaborators may not. The economic evidence suggests that a far more effective way to achieve an Open Access tipping point is to support repository deposit. Either way, the Open Access publication fund is here to stay.

Irish researchers, particularly those working in the STEM (Science, Technology, Engineering and Medicine) disciplines, will be familiar with author-pays, journal-side Open Access. In the life sciences, publishers such as PLoS, BioMed Central and Frontiers provide important publishing platforms for Irish research. Those in receipt of research funding from agencies such as the Wellcome Trust will be aware of mission-critical Open Access policies that underwrite publication costs as part of the research project spend. For those without publication funding, strong policies supporting repository deposit as a route to Open Access exist across STEM publishing.

Irish research libraries have created a network of institutional repositories reinforced by funding agency policies which require deposit as a research grant condition. While full compliance remains a challenge, authors do have an option that guarantees access to a peer-reviewed version of their published paper even if subscription to the journal of publication is discontinued.

The IReL selection

It is unclear what criteria informed the IReL decision and why titles from one publisher in particular were selected. Taylor and Francis are well known as a big AHSS (Arts, Humanities and Social Sciences) publisher. The dropped subscriptions are largely titles with a STEM focus. This may appear an arbitrary selection but it is not without precedent. In 2011 University of Virginia Library decided not to renew subscriptions for 1,169 Taylor and Francis titles. Many of these journals are found on the IReL list.

Whatever the reasons, I hope IReL respond to the parliamentary question and are transparent about their assessment methods. More importantly, I hope Irish research libraries recognise that future collection development and management must be fully integrated with the existing repository infrastructure. Irish research deserves nothing less.

Further Reading

Derek J. de Solla Price, _General theory of bibliometric and other cumulative advantage processes_, Journal of the American Society for Information Science 27 (5-6): 292-306, 1976. PDF [ http://garfield.library.upenn.edu/price/pricetheory1976.pdf ]

Carolyn E. Lipscomb, _Mergers in the publishing industry_ Bulletin of the Medical Library Association 89 (3): 307-308 2001 PubMed Central [ http://www.ncbi.nlm.nih.gov/pmc/articles/PMC34566/ ]

Glenn S. McGuigan, _The Business of Academic Publishing: A Strategic Analysis of the Academic Journal Publishing Industry and its Impact on the Future of Scholarly Publishing_ Electronic Journal of Academic and Special Librarianship 9 (3): 2008 [ http://southernlibrarianship.icaap.org/content/v09n03/mcguigan_g01.html ]

Deborah D. Blecic, Stephen E. Wiberley, Joan Fiscella, Sara Bahnmaier-Blaszczak, and Rebecca Lowery _Deal or No Deal? : Evaluating Big Deals and Their Journals_ College & Research Libraries Accepted Manuscript 2011 [ http://crl.acrl.org/content/early/2012/01/09/crl-300.short ]

Jacqueline Wilson, _Journal Value Metrics Assessment_, California Digital Library, 2011 [ http://www.cdlib.org/cdlinfo/2010/03/30/journal-value-metrics-assessment/ ]

_Report of the Research Prioritisation Steering Group_ [ Ireland ], March 2012 [ http://www.forfas.ie/publications/featuredpublications/title,8958,en.php ]

When the pay walls return, ‘de-monetize’ the archive

Posted on 23 August 2009

Following the $3.4bn net loss posted by News Corporation in June it would seem that Rupert Murdoch is about to tear up the free on-line content model and introduce pay access. And where the Digger goes others may well follow. Even The Guardian seems to be contemplating toll access to some types of premium content although a blanket pay wall would seem out of the question according to the paper’s Director of Digital Content, Emily Bell. Last Friday, The Los Angeles Times reported that News Corp had been in discussion with publishers ‘…including New York Times Co., Washington Post Co., Hearst Corp. and Tribune Co…’ to set up a consortium to charge for distributed on-line news content. With APN News & Media (part owned by Independent News and Media) also ‘…examining many options, including paid content, transactional and ‘club’ models’, I think it is safe to assume that this debate is ongoing in Irish publishing also.

Clearly the notion that the public will pay for online news ring-fenced behind aggressive copyright control is now on the boardroom table. This change of direction must also be raising some wry smiles at the BBC. The Newstracker aggregation service was recently overhauled, helping to counter claims of ‘unfair advantage’ by big media. If the links will now resolve at the pay wall, why should the BBC bother at all?

It’s been over a year since The Irish Times removed toll access to online news. The paper’s digital archive ‘from 1859 to the present’ is available yearly for €395 and via ProQuest for institutional subscriptions. This model of monetizing the archive is fairly common among newspapers, with both The Times (€89.95 per annum) and The Guardian and Observer (£49.95 per month) recouping on investments made in digitisation programmes. It’s a pretty crude business model that turns on using the archive (much of which is public domain) as a cash cow. Pay walling this content stifles the possibilities that a more open dissemination model would allow. Never mind the semantic web, this arrangement does not even make the Web 2.0 paddling pool. These archives also appear as stand-alone entities and are not truly interoperable with the newspaper’s main online services.

If we have to pay for online news content, can the argument not be made that the archive should be free to access? If the archive is more fully integrated into the primary service, then a rolling wall of, say, 1 day to 1 week that moves content from toll to free could be achieved and some best practice regarding Web 3.0 development retained.
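
A minimal sketch of the sort of rolling wall I mean; the window length is arbitrary and the whole thing is illustrative rather than a proposal for any particular title.

```python
# Illustrative rolling wall: content older than the toll window drops out to free access.
from datetime import date, timedelta
from typing import Optional

TOLL_WINDOW = timedelta(days=7)  # anywhere from 1 day to 1 week, say


def access_tier(published: date, today: Optional[date] = None) -> str:
    today = today or date.today()
    return "toll" if today - published < TOLL_WINDOW else "free"


print(access_tier(date(2009, 8, 20), today=date(2009, 8, 23)))  # recent news: toll
print(access_tier(date(2009, 1, 15), today=date(2009, 8, 23)))  # archive: free
```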

Achievable and effective political reform through public access

Posted on 30 December 2010

The release of the 1980 State papers in the UK and Ireland under the 30 year rule sees the media’s annual review of events past begin to fill column inches and airtime. Post-WikiLeaks, and in post-IMF/ECB Ireland, this now seems more like an established pageant with infotainment as a primary output. To add context to the released papers, broadcasters will supplement presentation and discussion with content from their own copyright-protected and monetized archives (skillful editing and production can backlight context but getting drawn into the ‘reeling in the years’ vortex may prove irresistible).

While the National Library of Ireland have been canny enough to recognize the marketing opportunities of this particular news cycle and announce the discovery of the Orpen Letters, the more important backstory of public access to the paper trail of the Irish state has been strangely muted. At the beginning of 2010 the crisis in the National Archives of Ireland and the proposed legislation to merge the National Archives of Ireland and the Irish Manuscripts Commission into the National Library of Ireland were highlighted by Peter Crooks in the Irish history blog ‘Pue’s Occurrences‘. This was followed by a symposium in Trinity College Dublin with contributions from Catriona Crowe, Fintan O’Toole, Eunan O’Halpin and Diarmaid Ferriter. The delusion that a crude underfunded programme of rationalisation can somehow produce a fit-for-purpose service is bad enough but it’s only one part of a deeper malaise. The rights of access guaranteed by the National Archives Act (1986) and the Freedom of Information Act (1997) have never been presented as pivotal pieces of legislation acting as cornerstones of Irish participatory democracy. Instead they have been begrudgingly implemented or, as is the case with FoI, amended on the premise of lessening the bureaucratic burden through the introduction of a toll.

While much has been made of the need to radically reform the Irish political system in terms of the legislature and electoral system, little has been said about strengthening the rights of public access to the record of the state. It was telling that at the beginning of the Christmas holidays the first judicial test of the Credit Institutions (Stabilisation) Act 2010 saw the commercial sensitivities clause in section 60 invoked and the hearing on the further injection of €3.7 billion of public funds into Allied Irish Bank held in camera. This draconian piece of legislation was rushed through the Dáil in four hours and rubber-stamped by the Council of State, which to all appearances engaged in a futile piece of political choreography. It’s difficult for a public worn out by political spin and mendacity not to be skeptical about the timing of all this, falling as it did on the quiet news day of 23 December – and, just to make sure, the press were barred from attending. As it stands we may have to wait another 30 years to get to the heart of this matter.

For many Irish citizens desperate to make sense of the country’s tailspin, 2010 marked a shift away from the passive consumption of reporting and opinion forming as practiced by the Irish media establishment toward a data-centric and largely politically neutral analysis presented by economists, political scientists and journalists, built upon open content and distributed via social media. Many found themselves driven to this out of necessity. The continued co-opting of the Irish media into making excuses for, or not properly questioning, one disastrous policy decision after another eventually drove us to seek the light elsewhere. One knew all was not well when analysis from The Financial Times or The New York Times proved time and again more cogent and incisive. My hope for the new government of 2011 is that something like the private members’ bill introduced by the Labour Party on 9-10 November 2010 will form a platform for a complete overhaul of the apparatus of access to the public record. All that’s at stake is the future of an effective and transparent republic.