Although the internet is being touted as a next-generation library, I don’t think it deserves that title until someone solves the problem of dead web sites.
More often than not, I am faced with the frustration of broken links. They point to web sites that have been shut down, or whose content has changed and no longer contains the information I’m looking for. It’s easier to retrieve that information if the original site has merely moved to a new server, but in many cases, the site is gone altogether.
I suppose it’s not fair to say that information in libraries is perfectly safe: if a library burns down or is otherwise physically destroyed, it will no longer be a reliable information source. Still, it is at least more permanent than information floating around on the web.
I am thinking more about the problem of broken links because news articles are increasingly adding hyperlinks. In a print-oriented world, a reporter would have to describe each element, but now links are taking the place of descriptions, keeping the article compact while enabling the reader to pursue further information of choice by clicking on the links. It certainly works well with technical or medical terms, organizations, websites, and so forth (for instance, when writing about embryonic stem cells, I would just put a link on the term instead of spending a paragraph explaining what it is).
Relying too much on links, however, is dangerous, because when you look through older articles, many of the links no longer work.