Tuesday, August 7, 2007

An Open Question for AI Researchers

Here's a question that has been bothering me for a couple of days now; it is directed towards anyone who is seriously studying/researching/developing AI:

Does an intelligent system need to be independent of outside influence?

Issue #1: Puppeteers (a la Spinoza)
Perhaps this stems from my lack of knowledge of narrow AI, but do narrow AIs require an external controller of some sort that determines their computational choices? (e.g., a human systems analyst who oversees the AI's processing)

Issue #2: Stimulus (a la Descartes)
Perhaps this is a non-issue if Cartesian dualism does not hold, but could a truly "intelligent" system exist in its own self-contained system, such that the external universe does not affect it? (e.g., Dharmic religious teachings of liberation from material existence... kind of. It's more solipsistic, to be very honest, but nonetheless a question worth asking.)
