Discussion about this post

Rafael Olivas:

ChatGPT, and all other AI innovations so far, including Alexa and Dall-E, can surprise us with their parlor tricks. And those tricks are impressive, especially when it comes to natural language processing. But they are still some distance away from "understanding." There's a brilliant thought experiment called the Chinese Room, by the philosopher John Searle (https://plato.stanford.edu/entries/chinese-room/) that helps to explain why.

It is worth considering what that argument means. In a nutshell, it suggests that operating a syntax does not generate semantics, at least not by itself. Or stated a bit more plainly, a tool cannot understand the problem to which it is applied.
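One way to make the syntax-without-semantics point concrete is a toy sketch of the Room itself: a program that answers by pure pattern matching against a rule book. This is only an illustrative sketch, not Searle's own formulation; the rule table and the token spellings below are invented for the example.

```python
# A toy "Chinese Room": a purely syntactic rule table. The token
# spellings and rules here are invented for illustration; any opaque
# symbols would make the same point.
RULES = {
    ("ni", "hao"): ("ni", "hao"),               # greeting -> greeting back
    ("ni", "hao", "ma"): ("wo", "hen", "hao"),  # "how are you?" -> "I am fine"
}

def room(symbols):
    """Match the incoming symbol sequence against the rule book and
    return the prescribed reply. No step involves meaning: the
    operator (or program) only compares and copies tokens."""
    return RULES.get(tuple(symbols), ("bu", "dong"))  # fallback reply

print(room(["ni", "hao", "ma"]))  # -> ('wo', 'hen', 'hao')
```

To an outside observer the replies can look conversational, yet the function contains no representation of what any symbol refers to, which is exactly the gap the argument is pointing at.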

I bet if you ask ChatGPT to "explain itself," or even to explain one of its assignments, it will fail to express any real understanding. Like the "Chinese Room," it can operate a language syntax, and, augmented with a searchable data store (like the Internet), it can "compile" a readable text that YOU can understand. But, at least at present, ChatGPT cannot understand its own utterances, much less itself.

(A corollary: simply passing the Turing Test does not guarantee that the "device" understands, even if it has intelligently found, translated, and expressed symbols.)

But given enough evolutionary pressure, sensory capabilities, and processing power, it is possible that "understanding" will *emerge* as a consequence. It is unlikely to be "built in" to the device via programming, modeling, or data. "Emergence" seems to be how the level of "semantic intelligence" we witness in some living organisms, especially birds, mammals, and octopuses, came about: it emerged as a consequence of the organism's neural capabilities.

So, "overall," yes, for now anyway, ChatGPT will not replace you. Mostly.

Adam J. Blust:

I’m finding that ChatGPT is good for outlining topics, like for my newsletter. It’s a fascinating starting point. It can be extremely bland, but the Bible story was great.

It reminds me of a story I read in grade school, back in the '70s. One kid in a group had a dad who was a computer scientist, and the dad built a computer that could write papers. The kids just had to input their textbooks, and once they did, the computer wrote the papers for them. But they realized that in the process of inputting the information, they learned the subjects themselves. Pretty similar to what's happening with A.I. these days: you need to know what to ask.
