07.12.11
The Evolution of the Net and Literature
Summary: An opinion and personal perspective on how the information people access is changing over time, and ever more rapidly with the emergence of the Internet
THIS post is not about the general evolution of the Internet or of publishing. It offers a very personal perspective and should be read as anecdote, not historical evidence. To put it less vaguely, this is an attempt to explain how the passage of ideas — including those which one might put in a patent application — can, and often does, change over time.
In the past, people used the printing industry to spread their ideas, which needed to be clustered together into packages that made acquisition and transportation worth the cost. Books were very comprehensive pieces of work and some were compilations of works, a medley of sorts. Books could also be shared between people, so each manufactured copy might circulate for decades if not centuries, occupying days of each reader's time (or several people's time), in exchange for the years of work it took to write (assuming the book was well written and properly researched).
Academic journals are an interesting beast; nowadays they get grouped into sets which are sometimes sold under something like the LNCS banner (Lecture Notes in Computer Science). We no longer depend on travel to conferences, or at least no longer insist on it. We can find a lot of videos on the Web and download particular papers of interest (in abundance), rather than ordering them by snail mail and then waiting a long time for them to arrive (lag); the alternative was to have a subset of these stockpiled in libraries, which still required travelling to them, and which made copying of material (for reference at home) cumbersome, especially if one needed to chase the whole bibliography. This world of journals and conference papers is still somewhat riddled with legacy conventions that make everything slow and extremely time-consuming, yet narrow in scope (page limits constrain writers to publish just a tiny subset of their results, usually just the best ones). These papers, along with the books that are often derived from them (by reuse), are still some of the best literature we have out there because they are written by experts in their fields — not journalists who try to help sell ads (akin to fiction writers and novels) — and they are peer-reviewed, then selected also in part based on reputation. Newspapers offer no references and sometimes also omit the names of those involved in putting together a story. The trust model there is lacking.
Nowadays, blogs are popular, and increasingly — although there are exceptions — people find that they prefer microblogging for publishing (and for digestion) because it is faster. It is also more diverse (more narratives per unit of time) and quality control relies less on grammatical and structural assessment (which depends on repeated proofreading). Along with that there is a growth in social networks and sites where comments are massively shortened or even redacted. We live in a world of “bites” rather than “stories”, and many people now get their information through platforms such as Facebook. This is far from ideal, as it breeds trust in all sorts of junk ‘information’ (superstition, racism, etc.) and leaves accurate reporting only to those who are patient enough (a vanishingly small number).
Speaking for myself, my history on the Web did in some way follow the trends above. Although I built my first Web site when I was 15, I started to get heavily involved in USENET around 2004, which is also the year I started publishing papers and giving lectures (I was 22 at the time). Even though I continued to publish in academic circles in the years to come, I found myself drifting towards blogging, where the audience was larger, the composition process was a lot more rapid, and, most importantly, there was constant feedback from both supporters and sceptics. In 2006 I got more involved when I joined Digg and became ranked 17th on the site (in the same year I joined); later that year I even got a job in the area (Netscape.com). Separately, I got involved in blogging outside my own site (schestowitz.com had published about 1,000 blog posts by that point), notably at “Boycott Novell”. This really took off in 2008, and in 2010 Tim and I started forming an audiocast around our existing reader base (in 2011 we also experimented a little with video, which is very fast to produce). The increased interest in Identi.ca (and later Twitter) was complementary to this because the main function of these sites is linkage to one’s items of interest, sometimes with an additional remark (the 140-character limit is… well, limiting). So here we are in an information cycle where messages are increasingly abbreviated (I have not bothered submitting papers to journals or conferences since 2006, when it was needed for me to get my Ph.D.) and attention moves away from long articles that can take writers days to prepare (which is how real reporting should be done). As for books, nowadays they are not sold but rather “licensed” for digital use by a single person.
Disgusting from the point of view of sharing information, but possibly acceptable from a business person’s point of view (and we all have DRM to thank for that).
What do our readers foresee as the future of information? We assume all information will eventually converge in digital form, even scanned and OCR’d in some cases, but what medium will dominate? Might professors start blogging more often than not? Will Open Access become the norm? Will Open Data become a pre-requisite for publication where results are reproducible and open to audit? Cablegate was a sort of example of Open Data/Open Access and it was fantastic for honest reporting.
At Techrights we continue to value spin-free writing that ignores the PR and really gets to the bottom of issues. █
Needs Sunlight said,
July 13, 2011 at 1:13 am
Open Access will indeed become the norm. Arxiv led the way in physics, mathematics, and computer science, and it is probably already the norm in several other fields. Publishers of paper journals have been raising their prices so much that it is becoming impractical to remain subscribed to them. As soon as alternatives in any field become even remotely viable, people jump ship.
Dr. Roy Schestowitz Reply:
July 13th, 2011 at 2:09 am
My colleagues who still invest a lot of time in printed publications will hopefully realise that they are pursuing yesterday’s medium, which nonetheless is still the ‘currency’ of academia. I see more and more people who just use Google or Wikipedia rather than open up a journal or travel to conferences. There’s a problem due to lack of peer review, but having said that, many people read and fix Wikipedia and in Google you can limit the search to published papers. The future is unknown, but now that Google scans a lot of books, it sure seems like there is centralisation.
Needs Sunlight Reply:
July 13th, 2011 at 6:09 am
I was thinking more about Public Library of Science than Google. PLoS is becoming the high impact publication for more and more fields.
Dr. Roy Schestowitz Reply:
July 13th, 2011 at 7:30 am
I am aware of it, but still unaware of people who put/use stuff there. By the way, the Web Archive is having financial issues, as it publicly signalled this month. If it pulls the plug we might need a web archive for the Web Archive. Governments should put our tax money into such projects rather than some of the other rubbish (like bailing out large banks).
twitter Reply:
July 13th, 2011 at 6:13 pm
Every local library should be doing what Archive.org does and adding their own special collections. The tax money is already there; it’s simply wasted on purchasing and storing paper. The rich and powerful do want control over knowledge, so they will continue to block reasonable reform and public spending where it could do any good.
Print journals are also a waste of time, but they sit on a vast reserve of knowledge that should be liberated rather than consolidated under nasties like Elsevier. Greedy publishers continue to lengthen copyright at everyone else’s expense. This makes it difficult to research and gives the publishers power that no one wants them to have. Their power will dissolve as people who know what they are doing continue to publish in open access journals.
Google centralization is not an issue if everyone is free to do what they do. The problem is a legal framework that can only be overcome by huge companies who will then have an incentive to continue the exclusionary policies of the past.
Dr. Roy Schestowitz Reply:
July 14th, 2011 at 12:24 am
Journals, too, can be managed by governments for the betterment of science (and to widen access).