
Criticism of Google's Book Search

Using this essay by Paul Duguid as a basis, if:book takes a look at quality control problems surrounding Google’s Book Search program. As Ben Vershbow asks, “Does simply digitizing these—books, imprimaturs and all—automatically result in an authoritative bibliographic resource?”

Duguid suggests not. Migrating analog works to the digital environment in a way that respects the originals yet fully integrates them into the networked world is trickier than simply scanning them and dumping the results into a database. The Tristram Shandy study shows in detail how Google’s ambition to organize the world’s books and make them universally accessible and useful (to slightly adapt Google’s mission statement) is being carried out in a hasty, slipshod manner, leading to a serious deficit in quality in what could eventually become, for better or worse, the world’s library.

As is so often the case, the devil is in the details, and it is precisely the details that Google seems to have overlooked, or rather sprinted past. Sloppy scanning and the blithe discarding of organizational and metadata schemes meticulously devised over centuries of librarianship might indeed make the books “universally accessible” (or close to it), but the “and useful” part of the equation could go unrealized.

There are many issues to debate in relation to this project, but if even 5% of the texts are as unreadable as those featured in Duguid’s article, the project will have serious problems.



