Sunday, September 16, 2012

A Deep Blue of Philosophy?

Just posted this to a reddit r/Objectivism thread:


I often find that when I come to some point of Objectivism (or anything else) that I critique, I employ the methods of Objectivist thinking and I can't think of any way around those methods, which emphasize the concepts of integration, context, and hierarchy. How does one attack those concepts without self-contradiction? Anyway, if I come upon something in Rand's writings that I think falls short, I critique it on the grounds that it isn't a correct application of her own prescribed methods. This is why I'm more comfortable calling myself a perfectivist than an Objectivist; it commits me to no doctrine or practice other than the relentless accumulation and integration of knowledge, like with Aristotle. There are chess grandmasters, and then there's Kasparov, and Aristotle is philosophy's Kasparov. (Which raises a question: could a philosophical "Deep Blue" be developed? Damn...has that question been asked before?)


Just to clear up a thing or two right away: Deep Blue isn't a conscious entity.  As it is, there are ongoing discussions in philosophy-of-mind circles about whether and how we could determine that a machine with the behavioral characteristics of HAL in 2001: A Space Odyssey is conscious.  HAL displays many if not all of the requisite characteristics of intelligence.  Deep Blue is far from a HAL; like HAL it is quite expert at its task, but that task, playing chess, is a rather limited one.

What led me quite quickly to think of HAL is what the name is short for: Heuristically programmed ALgorithmic computer.  I'm not an expert on what a heuristic algorithm involves, but I gather Deep Blue relies on such a principle.  (The term "heuristics" appears once in the Deep Blue Wikipedia article, so I'm probably onto something.)
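
To make that a bit more concrete, here's a toy sketch in Python of what a heuristic amounts to in a chess engine: a cheap rule of thumb, like counting up material, that stands in for calculating the game out to its end.  This is emphatically not Deep Blue's actual code, and the piece values are just the textbook ones.

    # A toy sketch (not Deep Blue's actual evaluation) of a chess heuristic:
    # score a position with a cheap rule of thumb instead of exhaustive search.

    # Textbook material values; real engines use far richer evaluations.
    PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9, "K": 0}

    def material_heuristic(white_pieces, black_pieces):
        """Return a crude score: positive favors White, negative favors Black."""
        white = sum(PIECE_VALUES[p] for p in white_pieces)
        black = sum(PIECE_VALUES[p] for p in black_pieces)
        return white - black

    # Example: White is up a rook for a knight.
    print(material_heuristic(["K", "Q", "R", "R"], ["K", "Q", "R", "N"]))  # prints 2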

Something else to clear up: Where "the highest responsibility of human philosophers is to serve as the guardians and integrators of human knowledge" (Rand, ITOE), a Deep Blue machine wouldn't technically qualify as a philosophical machine, because knowledge requires a consciousness.  (I'm pretty sure of that necessary connection, but I'll think it through some more.)  What a Deep Blue machine would be doing, in the task of integration, is integrating content without being aware of that content.  (I'm speaking here of the Deep Blue machine as it is now, not an extra-advanced one like HAL.)

But here's the interesting part: Say that scientists could program a machine of Deep Blue's computing power to crawl the web (Google, Wikipedia, etc.), integrate its contents, and generate output for humans to work with.  Would that (not) be pretty awesome?  Would the task involve much greater complexity than that involved in playing a chess game while seeing 18 moves ahead?  Could such an algorithm be developed to home in on what is essential content, and to home in on connections between items of content, such as which terms in a Wikipedia article are hyperlinked?  As has already been discovered, Wikipedia has a hierarchical organization demonstrated by a certain pattern of hyperlinking: keep following the first link of each article, and approximately 95% of Wikipedia entries eventually lead to the Philosophy entry.  (This would come as a surprise to a lot of folks, but not the least bit of a surprise to Miss Rand, who, aside from penning endlessly-caricatured novels, actually wrote things on the nature and role of philosophy in the human endeavor, and topics connected with that.  If this kind of stuff had already been spelled out in philosophy textbooks, I might have noticed.  Seeing as so few people acknowledge the fundamental role of philosophy in human life, I doubt this message, even if contained in textbooks, got through to the readers as it fucking well should have.)

Wikipedia is quite the example of a system of content, enabled by the development of the internet, that, qua mapping of territory, condenses or essentializes a vast array of territorial concretes.  (I think the term "encyclopedic knowledge" involves the same phenomenon, i.e., systematic essentialization, not necessarily expertise in or familiarity with the mind-boggling number of concretes that an essentialized system necessarily contains.  Encyclopedic knowledge isn't so concrete-bound.)
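
To make the hyperlink point concrete, here's a toy sketch of that "first link leads to Philosophy" pattern.  The little link graph below is made up for illustration, not pulled from Wikipedia itself, and a real crawler would obviously have to fetch and parse actual pages.

    # A toy illustration (not a real crawler) of the first-link-to-Philosophy
    # pattern: follow each article's first hyperlink until the chain reaches
    # Philosophy, hits a dead end, or loops.  The graph below is invented.
    FIRST_LINK = {
        "Deep Blue": "Chess",
        "Chess": "Board game",
        "Board game": "Game",
        "Game": "Play (activity)",
        "Play (activity)": "Philosophy",
    }

    def chain_to_philosophy(start, first_link):
        """Return the list of articles visited, stopping at Philosophy or a loop."""
        path, seen = [start], {start}
        while path[-1] != "Philosophy":
            nxt = first_link.get(path[-1])
            if nxt is None or nxt in seen:  # dead end or loop: the other ~5% of cases
                break
            path.append(nxt)
            seen.add(nxt)
        return path

    print(" -> ".join(chain_to_philosophy("Deep Blue", FIRST_LINK)))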

Hell, what might result if such a machine were set to the task of integrating merely the contents of a high-quality dictionary?
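
Here's one crude, made-up picture of what that could even mean: take a tiny four-entry dictionary and link each headword to the other headwords its definition relies on, which already yields a little hierarchy of conceptual dependencies.

    # A minimal sketch, with an invented four-entry dictionary, of one crude way
    # a machine might "integrate" a dictionary: map each headword to the other
    # headwords appearing in its definition.
    TOY_DICTIONARY = {
        "knowledge": "a mental grasp of facts of reality",
        "fact": "something that exists in reality",
        "reality": "that which exists",
        "mental": "of or relating to the mind",
    }

    def definition_graph(dictionary):
        """Map each headword to the other headwords used in its definition."""
        graph = {}
        for word, definition in dictionary.items():
            tokens = {t.strip(".,") for t in definition.lower().split()}
            graph[word] = sorted(t for t in tokens if t in dictionary and t != word)
        return graph

    for word, depends_on in definition_graph(TOY_DICTIONARY).items():
        print(word, "->", depends_on)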

I'll leave the rest to the imagination.