
Flat Earth News: robots.txt

Having become thoroughly absorbed in Nick Davies's fascinating 'Flat Earth News', I can say that the book has greatly sharpened my critical understanding of news production.

The book, in its author's words, 'names and exposes the national news stories which turn out to be pseudo events manufactured by the PR industry and the global news stories which prove to be fiction generated by a new machinery of international propaganda.' Its main thrust is that growing commercial pressures on media producers have radically changed the role of journalists, reducing them to 'churnalists' who simply no longer have the time to do their jobs properly.

Critical responses to the book have focussed on some errant instances of the author's interpretation of the data he collected with the help of researchers from Cardiff University and, more strongly, on out-and-out refutations from journalists working for the newspapers mentioned.

My reading is that while Mr Davies certainly goes after his targets with ruthless polemic, the sheer volume of evidence collected from a myriad of sources inside the industry suggests that he has correctly identified the main narrative: a genuine crisis in our mass media.

I guess you'll have to read it to believe it... and then?

Robots.txt

And then you read a BBC News Online story linking the new Obama administration's dedication to 'openness' with a change in the robots.txt file (the file that tells search-engine crawlers which parts of a website they may and may not crawl) on the whitehouse.gov server. Auntie goes with the angle: 'By contrast, after eight years of government the Bush administration was stopping huge swathes of data from being searchable.'

Alas, this is a piece of Flat Earth News. This particular news nugget appeared earlier on the BoingBoing blog, where commenters correctly explained that the old robots.txt file merely prevented the indexing of duplicate text-only versions of pages (along with some other technical fixes).
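To see why that explanation is so mundane, it helps to know that a robots.txt file is nothing more than a plain-text list of crawl rules. As a purely illustrative sketch (the paths below are hypothetical, not the actual whitehouse.gov entries), a file that blocks only duplicate text-only versions of pages would look something like this:

```
# Illustrative robots.txt (hypothetical paths).
# The text-only duplicates are excluded so search results don't list
# two copies of every page - the content itself is not being hidden.
User-agent: *
Disallow: /text/
Disallow: /news/text/
```

In other words, a long list of Disallow lines for text-only URLs is routine duplicate-content housekeeping, not censorship.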

It seems that while the most cursory fact-checking would have revealed this, the story simply fit too well with the prevailing narrative about the new administration. Davies describes in particular how BBC News Online journalists are given 15 minutes or so to turn a story into a published article from the moment it appears on the wire services, and how in many cases only a single source is used.

While this particular instance may seem insignificant, Davies explains how the commercial pressures on today's journalists have allowed errors, some with far more serious ramifications than this one, to become commonplace in our mass media.

Posted on Thursday, January 22, 2009 at 04:12PM by Jonathan Perfect (MSc student)
