A few glaring items came to mind as I finished reading Roszak's "The Cult of Information":
- The second half of this book is much, much better than the first
- I wish I had known about this book when I was writing "The Developers"
- This guy seemed to predict or at least know about so much, but how could he have not seen the usefulness of syncing data online?
Let's start with No. 1. In the first 100 pages or so of the book, it seemed to me that Roszak could have condensed his thoughts to about 30 pages, considering that he seemed to talk in circles regarding information versus knowledge versus intelligence. A few times, he even seemed to come to the opposite conclusion of his original premise, although it's quite possible that I was the one confused in reading it. I'm pleased to report that in the last 100 pages, he backs his theories with more scientific evidence, and not just in a passing manner.
Few would disagree that there is indeed a "radical discrepancy" between how a computer "thinks" and how humans think (p. 125). Roszak uses the example of how a computer might be able to win a game of chess, yet not necessarily know how to keep itself from getting wet during a rainstorm. While this could now be programmed into a machine (if it needed a purpose for being in and out of the rain), it is still important to note that computers, in general, are not built with a "common sense" module, nor would that be an easy thing to add. For instance, a scientist could probably create a robot to simulate a number of my routine activities, but it would take an infinite amount of time to account for the things I do only on an irregular basis. Maybe a computer/robot could be programmed to do everything humans do, and perhaps make some decisions for itself, but as spontaneous as life is, how long would it take to build such a machine?
This brings to mind our latest reading, "Brave New World," and a question that hit me after giving my presentation: Were the characters in the book individuals or robots? The conditioning that ruled the Hatchery is really not that different from the conditional statements used in computer logic. Roszak discusses how simple rules, like for and while loops, handle a good deal of what code does, and the Hatchery's conditioning could be summed up like the following:
    $kid = 0;
    $year = range(1, 365); // one entry per day
    while ($kid < 12) {
        foreach ($year as $day) {
            $iteration = 1;
            while ($iteration <= 150) {
                echo "everybody's happy now";
                $iteration++;
            }
        }
        $kid++; // another year of conditioning
    }
So, until the kid turns 12, every day of the year, he or she will hear the phrase "everybody's happy now" 150 times. True, the World State isn't programming the child to do a task in this instance, but it would not be difficult to write a function that does. It should also be mentioned that most of Roszak's arguments about computer logic come from a procedural point of view (including my example above), while programming as a whole has shifted toward object-oriented programming in recent years. Without getting too involved in the details, this means that instead of just going line by line, from Point A straight to the final result at Point B, more coders now start with the problem (Point B) and model it with objects whose code might live elsewhere in a given program. I could explain this better with real-life examples, but I feel as if I'm really off on a tangent now ...
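Tangent or not, here is a rough sketch of the contrast (the word-count task and the Chapter class are made-up illustrations of mine, not anything from Roszak):

    // Procedural: march line by line from Point A to Point B.
    $text = "O brave new world that has such people in it";
    $words = explode(" ", $text);
    echo "Word count: " . count($words) . "\n";

    // Object-oriented: start with the problem and hand it to an object
    // that knows how to solve it, wherever its code happens to live.
    class Chapter {
        private $text;
        public function __construct($text) {
            $this->text = $text;
        }
        public function wordCount() {
            return count(explode(" ", $this->text));
        }
    }

    $chapter = new Chapter("O brave new world that has such people in it");
    echo "Word count: " . $chapter->wordCount() . "\n";

Same answer either way; the difference is where the logic lives and how easily it can be reused somewhere else in the program.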
I was intrigued by the number of examples regarding governmental agencies and various organizations that have been mulling electronic storage for decades. Probably the biggest piece of information was learning about the 1985 National Security Directive, which gave the NSA complete control over all federal computers and data banks (p. 210). On top of this, the NSA was also given the right to access all third-party data banks linked to government operations. I had no previous knowledge of FORECASTS, which seems to be a data-driven approach to predicting the future in relation to the cultural behavior of various nations (p. 227). This interests me because had I known about it, I might have created a different scenario in my fiction book, "The Developers." In it, a team of web programmers is enlisted by the government to essentially build a new Internet ... one that promises ridiculously fast speeds in exchange for divulging a little extra information about yourself and your family. The idea was that people would be eligible for the new Internet through census questionnaires, and with a tiny change to the U.S. Census code laws, this information could then be shared with commercial companies. Of course, this would be a huge red flag for privacy proponents, but the change would be so minor that it would be difficult to detect.
If I had to do it all over again, I probably would have linked this situation to the existing NSA databases, although it is somewhat humorous to think of all this data sitting around for decades ... how relevant would it be now? I've found that many databases are not even kept up to date, so when a company wants a new system, we have to export the old info and redo bits and pieces just to make it work in the new system. Combine this with the fact that people frequently move and their demographics change every other week, and you are oftentimes left with nearly useless data.
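To give a flavor of what "redoing bits and pieces" looks like (the field names here are invented for illustration, not taken from any real client system):

    // Massage a record from a stale legacy export so it fits a new system's schema.
    function migrateCustomer(array $old) {
        return array(
            "full_name"  => trim($old["fname"] . " " . $old["lname"]),
            "email"      => strtolower(isset($old["email"]) ? $old["email"] : ""),
            // The old system stored dates as MM/DD/YY; the new one wants YYYY-MM-DD.
            "updated_at" => date("Y-m-d", strtotime($old["last_update"])),
            // Flag anything that hasn't been touched in five years as suspect.
            "stale"      => strtotime($old["last_update"]) < strtotime("-5 years"),
        );
    }

    $new = migrateCustomer(array("fname" => "John", "lname" => "Doe", "last_update" => "03/15/02"));

Even after all that, the "stale" flag is about the only thing you can trust; nothing in the record tells you whether the person still lives at the same address.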
Lastly, with all of Roszak's info, I was a bit shocked that he never got into the details of a future Internet, where it is feasible to have an insurmountable amount of data online. In an interview with Thomas Mann, a librarian at the Library of Congress, the supposedly arduous process of electronically categorizing books and information is held up as proof that this could never happen on such a large scale. Yet Roszak seems to think there is some deadline by which all of this has to happen, when in fact, moving words from one medium to another has gone on ever since written language was developed. What would have happened if people had given up on movable type because they said, "We have too many hand-written books ... this will take forever!"
The ironic thing about this particular item is that people have come up with clever ways to do exactly what Mann claimed to be nearly impossible. The reCAPTCHA program enables millions of Internet users to help digitize books without even trying to do so. CAPTCHAs are oftentimes used on websites to prevent spammers from setting up scripts to fill out web submission forms. This particular type of CAPTCHA pairs a word the system already knows with a word scanned from an actual book that has not been digitized yet. Therefore, when a human types in the two words, he or she is not only proving to be a real person, but also supplying a human reading of the unrecognized word, which is then used to build the electronic version of the full text.
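As a very rough sketch of the idea (my own simplification, not the actual reCAPTCHA API; the function and field names are made up):

    // One challenge pairs a known "control" word with a scanned word the OCR
    // software could not read. Getting the control word right proves the user
    // is human; their guess at the scanned word becomes a digitization vote.
    function gradeResponse($challenge, $controlAnswer, $scanAnswer, &$votes) {
        if (strcasecmp($controlAnswer, $challenge["control_word"]) !== 0) {
            return false; // failed the control word: probably a script
        }
        $votes[$challenge["scan_id"]][] = strtolower($scanAnswer);
        return true;
    }

    $votes = array();
    $challenge = array("control_word" => "overlook", "scan_id" => 42);
    gradeResponse($challenge, "overlook", "pneumatic", $votes);

Once enough humans agree on the same reading for a given scan, that word can go into the digital copy with reasonable confidence.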
Will every bit of information and every book be available online? Of course not ... I do not plan to go back and type every note I've ever taken in a class and put it on the web. And who's to say that my notes aren't useful enough information? I think a fun fiction plot would be to determine the most useful piece of information that's not on the Internet. I suppose once it is found, however, we'll probably find a way to bookmark it and share it with our virtual friends.