The question in the title is serious: of the ~US$10 billion we collectively pay publishers annually world-wide to hide publicly funded research behind paywalls, we already know that only some US$200–800 million goes towards actual costs. The rest goes towards profits (~3–4 billion) and paywalls/other inefficiencies (~5 billion). What do we get for overpaying for such services by up to 98%? We get a literature that essentially lacks every basic functionality we’ve come to expect from any digital object:

  • Limited access
  • Link-rot
  • No scientific impact analysis
  • Lousy peer-review
  • No global search
  • No functional hyperlinks
  • Useless data visualization
  • No submission standards
  • (Almost) no statistics
  • No content-mining
  • No effective way to sort, filter and discover
  • No semantic enrichment
  • No networking feature
  • etc.

Moreover, inasmuch as we use the literature (in terms of productivity and/or journal rank) to help us select scientists for promotion and funding, we select the candidates publishing the least reliable science.

Taken together, we pay 10 billion annually for something we could have for 200 million, and what we get in return is a completely antiquated, dysfunctional literature that tricks us into selecting the wrong people. If that isn’t enough to hit the emergency brakes, what is?

We may not be able to buy paradise with 10 billion annually, but with such a low bar, it’s easy to find something that’s at least not equally abysmal. The kind of modern technology we could buy would probably solve the most pressing issues with our literature, cover all our needs in terms of data and make sure we can cite and reuse all scientific code in a version-controlled manner – and still leave a few billion to play around with every year.

With the fruits of our labor firmly in our own control, we would have a flourishing market of services: whenever our digital infrastructure lacked the functionalities we expect or became too expensive, we could either switch service providers or hire our own experts without losing our content.

As an added benefit, cutting the steady stream of obscene amounts of money to a parasitic industry whose interests are orthogonal to scholarship would prevent further buyouts of scholarly ventures such as Mendeley or SSRN, and with them the disappearance of valuable scholarly data into the dark underbelly of international corporations.

One reason often brought up against canceling subscriptions is that faculty would complain about the lack of access such cancellations would entail. However, the already published literature can in principle (although substantial technical hurdles still need to be overcome) be accessed via a clever combination of services such as automated, single-click inter-library loan, LOCKSS and Portico. Moreover, some libraries have seen substantial cost savings by canceling subscriptions and instead supporting individual article downloads. Finally, institutional repositories as well as pre-print archives need to be leveraged whenever the publisher version isn’t available. After all, we have DOAI, and the pre-print versions are almost identical to the final article. With such an effort, most users likely wouldn’t experience more than a few hiccups, but they’re already used to patchy access anyway, so it wouldn’t look vastly different from what people are experiencing now. In fact, if subscriptions were canceled, there would be a substantial incentive to get the most modern access tools and to keep them up to date, so for many institutions this might actually increase the spread and ease of access compared to the current, largely subscription-only model. Thus, it is technically feasible to cancel all subscriptions in a way that most users probably wouldn’t even notice. Essentially, all we’d have to manage is replacing one patchy access system with another patchy access system. While this may not exist out of the box yet, it should not be too complicated to assemble from existing technologies. One could call this a “legal Sci-Hub“. Add to that an information campaign alerting users that while no major disruption is anticipated during the transition, some minor problems may arise, and everyone will support this two-decades-overdue modernization.

Another reason often provided is that the international cooperation between institutions required for such a system-wide cancellation to be effective would be impossible to accomplish. That is a problem less easily dismissed than the supposed lack of access to the literature. After all, some governments explicitly don’t want their institutions to cooperate; they want them to compete, and even to develop “world-class competitiveness” (page 17):

[Screenshot: “competitiveness” excerpt]

In this regard, I recently listened to a podcast interview with Benjamin Peters, author of “How not to network a nation“. His description of the failure to develop the internet in the Soviet Union (compared to the successful developments in the West) reads like an account of how not to make open access a reality:

the American ARPANET took shape thanks to well-managed state subsidies and collaborative research environments and the Soviet network projects stumbled because of unregulated competition among self-interested institutions, bureaucrats, and others.

We need the same collaborative spirit if institutions are to abandon subscriptions – the spirit with which they once cooperated to spend money drawing cables between institutions, even though they were separated by borders and even oceans. If in the 1980s our institutions collaborated across nations and continents to spend money on a research infrastructure nobody yet knew, can’t they collaborate now to save money being wasted on an obviously outdated infrastructure? Has neoliberal worship of competition poisoned our academic institutions to such a degree that within 25 years they went from cooperating even when it cost money to never cooperating even when it would save money? I refuse to believe that, even though that’s what some try to tell me.

Instead of trying to tell scholars to behave ethically in spite of the personal downsides, maybe we ought to tell institutions that our infrastructure is outdated and that we need the functionalities everybody but academics is enjoying. We need to get the same mechanisms going that in the 1980s got our universities to invest in networking hardware and cables despite a functioning telephone and snail-mail system. Canceling subscriptions doesn’t mean losing access, so nobody can tell me that canceling subscriptions is more difficult than installing compatible networking hardware across the globe. I’m now paying my phone bills out of my own budget, while my Skype calls are provided by my university. Maybe rather than trying to convince scholars to choose the ethical over the advantageous, it would be more effective to ask our institutions to provide us with modern technology and have those who still want to use the legacy version pay for it themselves?

Framing our issue as an ethical one (“the public needs access to publicly funded research!”) may work, but it is a slow, ineffective and uncertain approach. Framing it as a mere technical modernization strikes me as potentially quicker, more straightforward and more effective.

UPDATE (May 23, 2016):

Some people have asked for evidence that canceling subscriptions and instead relying on individual downloads can save money. Besides hearsay from several libraries, here is a piece of evidence which should be quite convincing that this can work for some publisher/library combinations:

[Figure: Wiley tokens]

So if libraries were to cooperate in identifying more such opportunities and then cleverly combine this with LOCKSS and/or Portico, as well as smart (single-click) ILL between libraries whose Big Deals have already run out and those which are still running, then, given enough participants, almost all of the already published literature ought to be accessible. It may not be trivial, but it’s definitely feasible, and the technical problems are not the main obstacle – it’s the collaboration that needs to be established. Moreover, this only needs to work for a relatively short time, until most of the journals have run dry of funding and ceased to exist.

Update to the update (June 8, 2016): Similar to the example above, other libraries have canceled Big Deals and come out ahead. Here, too, providing previously subscribed content combined with rapid ILL from collaborating institutions worked just fine. (Link thanks to Micah, in the comments)

Indeed, the OA Tracking Project at Harvard is collecting such examples under their own oa.cancellations tag – quite a nice list of cancellations there. It’s a fact: canceling subscriptions can be done without faculty revolt and with substantial savings.

UPDATE II

Triggered by online discussions, a few hypothetical use cases:

  1. A user requests an older document from a journal that their institution no longer subscribes to, but which was accessible in the past. The link resolver checks whether it can be served via LOCKSS or Portico (it would be strange if LOCKSS/Portico could not serve content that was already purchased!). If not, the resolver extracts the article meta-data (importantly, the DOI, for use with DOAI) and screens the institutional repositories and places such as PubMed Central, arXiv/bioRxiv or ResearchGate for the document. If all that fails, the link resolver checks for ILL availability with an institution whose Big Deal has not expired yet. If none of these services can serve the document, the user is asked whether they want to send a copy request to the author (single click) or download the individual article and pay the fee (faculty get informed about their usage/costs!).
  2. A user requests a brand-new article from a journal that their institution does not subscribe to. The link resolver extracts the article meta-data (importantly, the DOI, for use with DOAI) and checks for availability in repositories. If unavailable, it checks for ILL. If neither is available, the user is asked whether they want to send a copy request to the author (single click) or download the individual article and pay the fee (faculty get informed about their usage/costs!).
  3. A user requests an older document from a journal their institution never subscribed to. The link resolver extracts the article meta-data (importantly, the DOI, for use with DOAI) and checks all relevant repositories, then ILL. If both fail, the user is asked whether they want to send a copy request to the author (single click) or download the individual article and pay the fee (faculty get informed about their usage/costs!).

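All three use cases boil down to the same fallback chain, which a link resolver could implement in a few lines. A minimal sketch in Python follows; every function here is a hypothetical placeholder standing in for one of the services named above (LOCKSS/Portico, repositories via DOAI, inter-library loan), not a real API:

```python
# Hypothetical sketch of the fallback chain shared by the three use cases.
# Each check_* function is a placeholder for querying one real-world service;
# none of these correspond to actual APIs.

from typing import Optional


def check_lockss_portico(doi: str) -> Optional[str]:
    """Placeholder: return a URL if a dark archive (LOCKSS/Portico) holds the article."""
    return None


def check_repositories(doi: str) -> Optional[str]:
    """Placeholder: query DOAI, institutional repositories and pre-print servers."""
    return None


def check_ill(doi: str) -> Optional[str]:
    """Placeholder: check inter-library loan availability at partner institutions."""
    return None


def resolve(doi: str) -> str:
    # Try each free source in order of preference.
    for source in (check_lockss_portico, check_repositories, check_ill):
        url = source(doi)
        if url is not None:
            return url
    # All free routes failed: fall back to asking the user whether to
    # request a copy from the author or pay for an individual download.
    return "ask-user"
```

The point of the sketch is that the ordering encodes policy (free and archived sources first, paid downloads last), so adding a new source is a one-line change to the tuple.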
UPDATE III (Jan 17, 2017):

I’ve collected a short list of ten different ways to access journal articles without a subscription, nine of them completely legal. Clearly, canceling subscriptions (or not renewing them) is no longer the big deal it once was. If we used the saved funds to invest in our digital infrastructure, by the time all subscriptions had run out, we would look back and wonder what took us so long.

UPDATE IV (Feb 16, 2017):

Since the start of 2017, about 60 German institutions have lost all access to the journals of publishing giant Elsevier. According to a news report:

The loss of access to Elsevier content didn’t overly disturb academic routines, researchers say, because they found other ways to get papers they needed

It’s official. It works. We don’t need subscriptions.

UPDATE V (May 2, 2017):

In addition to the long list of reports of successful and painless subscription cancellations, there is now an evaluation of an informal survey of 31 US-based libraries. Twenty-four of the sample had canceled Big Deal subscriptions, and the author’s conclusion was that “relatively few libraries that actually do cancel their Big Deals end up regretting it”. Obviously, more and more libraries are realizing that subscriptions are bad value for money and do just fine without them.

UPDATE VI (December 14, 2017):

SPARC is now also tracking big deal cancellations across the globe. It can be done, the evidence is out there for all to see.
