STM predictions for 2014

Predictions for the information industry and scholarly publishing for 2014

Filed under: Research Productivity

As we enter another New Year, it is the perfect time to look ahead and consider what 2014 might hold for the publishing industry. The last year has been full of industry developments, discussion and change, and the next looks to be similarly eventful: here we identify five of the key topics and trends for 2014.

eBook subscription services

With eBook subscription services seemingly on the verge of taking off in the consumer market, academic publishers will be keeping a very close eye on their successes and failures, and on their potential impact on scholarly eBook publishing.

Current discussion is very much focused on which subscription model will win out, and 2014 looks to be the year in which we find out. The Oyster and Scribd services – continually described as Netflix-like – are all-you-can-eat models paid for by monthly subscription, which provide access to a wide range of mainly back-list titles. Entitle has more recent bestsellers on board and allows fewer downloads a month (2-4 books), but lets users keep the eBooks they have downloaded, even if they cancel their subscription. The Kindle Owners’ Lending Library, meanwhile, is included as part of the paid-for Amazon Prime service, with a catalog of over 350,000 titles available – though only to Kindle owners.

Looking ahead, one can see how some of the unresolved issues still hovering over consumer eBook subscription services are relevant to the academic market. The need for a cross-platform service - compatible with all eReaders plus other devices such as smartphones and tablets - is an obvious area of crossover between the two arenas, as is the question of finding a pricing model that works for the user, the subscription service and the publishers. As always, however, we must remember that the consumer side of eBooks behaves very differently from its scholarly counterpart.

Open Data

Much has been said about Open Access this year, and a closely related topic now garnering its fair share of attention is Open Data. Proponents argue that openly sharing datasets facilitates verification, validation and replication of experimental results, and is therefore an overall good for research.

Examples such as Herndon et al.’s critique of Excel coding errors in the high-profile Growth in a Time of Debt lend strength to this argument; the paper’s methodology was difficult to investigate because the underlying data was not openly available, which may well have delayed the discovery of its flaws.

This trend is not limited to academia. There is also a growing movement encouraging public bodies to be more open with their datasets, and the BBC in the UK is one high-profile example of an organization that has committed to data openness.

The arguments for Open Data are strong, but some have cautioned that openness is not enough in itself. Unless datasets are well-curated, clear and usable, with appropriate metadata, they can be of little worth. Helbig et al.’s whitepaper on The Dynamics of Opening Government Data warns of the negative feedback cycle that can result when datasets fail to meet these criteria: low use leads to little return, making it harder for the Open Data movement to gain momentum.

Mina Bissell further cautions that in scientific research, open access to experimental data may not always be sufficient to replicate results – when complex systems and experimental conditions are involved, in-depth consultation with the original authors could well be required.

Notwithstanding these small notes of caution, there are other causes for celebration in a culture of more Open Data. Given that physical data storage always carries a risk of degradation, some researchers are keen to safeguard against the loss of data by opening information up to interested parties – the settings, software and algorithms used in experiments as well as the results. For example, CERN’s Compact Muon Solenoid (CMS) research group – who work with the Large Hadron Collider – are piloting a scheme to release their 2010 experimental data, ahead of plans for wider public availability.

Data openness has the further advantage of aiding data citation. Altmetrics enthusiasts have long argued that there should be a means of crediting those who produce and store useful datasets. Open hosting gives datasets a stable location from which they can be cited, which is one more step towards normalizing the practice. A group of services (e.g. Figshare) is already operating in this space, pushing the traditional boundaries of how data can be reused and cited.

Mobile

How to make content available on mobile is a very current issue with many complexities. Numerous decisions must be made about how best to deliver content to mobile users: a JISC Collections workshop this year discussed apps vs mobile versions of content portals, the role of responsive design, how to ensure interoperability and accessibility, and the problem of authentication.

One area where mobile has already made an impact is medical publishing, as certain features of this market lend themselves particularly well to a mobile offering. Scholarly Kitchen’s Joseph Esposito argues that because doctors may make use of quick mobile access to publishers’ content when seeing patients, mobile usage represents additive business in this sector. Perhaps in 2014 we will start to see services like this – and mobile access to e-resources in general – really take off.


The Internet of Things

The Internet of Things is a term coined by Kevin Ashton, who in 2009 described it as “computers that [know] everything there [is] to know about things – using data they gathered without any help from us.” He envisaged a future in which a multitude of always-on, internet-connected devices (things) would continuously feed the information read by their sensors back into the cloud. This description encompasses devices from GPS units and personal fitness trackers to the ultimate IoT device: the smartphone.

With a multitude of sensors (light, sound and accelerometer, to name but a few), smartphones are a key component of an Internet of Things, and the past few years’ explosion in their use has provoked increased discussion of the concept, though the term has not quite broken through into the mainstream. However, with all the buzz (e.g. Info Today article) surrounding wearable technology in 2014 (Google Glass and the iWatch being the big names to watch out for), it looks like the Internet of Things could finally be heading for its mainstream moment.

This is another area where the consumer world is ahead of the academic. If wearable technology and the Internet of Things turn out to be as big a trend for 2014 as some are predicting, the question becomes how these tools will affect scholarship and the wider information industry – a challenge to assess, given how much of that industry is already digitally embedded on the internet. Perhaps the most obvious application is at the point of research, where more connected sensors can allow more complex experiments and yield richer, more sophisticated data.

Peer review

Peer review is an integral part of academic publishing, but its processes came under some criticism in 2013. John Bohannon’s submission of a spoof article to a collection of Open Access journals uncovered some bad practice in peer review. However, Bohannon was widely criticized for not testing traditional journals to see how they compared and whether the issues were in fact industry-wide. The end result has been an interesting wider discussion about peer review and how it is evolving – one that was perhaps inevitable in any case, as innovations in peer review have been gathering momentum for some time.

Post-publication peer review is also of growing interest, particularly with the launch of F1000Research. This Open Access journal conducts an elementary internal check before publishing, but the real peer review process takes place post-publication. Given the range of different peer review models currently in place, it will be interesting to see the direction that discussion takes in 2014.

Whatever happened to…?

While we think our five predictions are likely to be among the top talking points of 2014, some hot trends inevitably end up short-lived. Here, for instance, are three topics that generated plenty of buzz leading up to 2013 but never made it big:

Enhanced eBooks

Ever since the eBook first made it big, people have been talking about the potential the format offers for a more immersive, interactive and rich reading experience. Enhanced eBooks, however, have remained unable to capitalize on traditional eBooks’ success. After some hopeful coverage towards the beginning of 2013 that eagerly anticipated the “Year of the Advanced eBook,” the consensus now seems to be that it isn’t happening.

eBook growth

After years of dramatic increases in market share and sales figures, eBooks’ popularity seems finally to be reaching a plateau, with sales falling year-on-year for the first time. Now universally recognized as a significant segment of the overall publishing landscape, the eBook market looks like it may finally have settled somewhat.

MOOCs (Massive Open Online Courses)

After a period of dramatic coverage centered on their potential to completely disrupt higher education, the hype surrounding MOOCs seemed to die down somewhat during 2013. Some commentators questioned whether it was time to call time on the MOOC altogether, citing alarming statistics such as a 4% completion rate at the University of Pennsylvania. However, others have argued that the MOOC is still evolving and that massive take-up is still to come – the hype may be over, but the MOOC itself is not. So perhaps the MOOC is in fact another trend to watch out for in 2014?

What are your predictions for the next 12 months? Leave us your ideas in the comments below.