I read and I responded on Medium. (I have the same user name there as here.)
Whether instructions have meaning for the machine is not important.
What is important is to understand the situation well enough to respond appropriately. For people, "meaning" alone only carries you so far; one has to put in the work to get good. Machines need the same.
On that there is agreement. The question is how deep meaning needs to go.
Deep down, people are just wiring, just as machines are just instructions. We build a lot of finely calibrated world models on top of that.
A sufficiently faithful machine model can do just as well in terms of the meaning and understanding of the specific phenomena needed to handle the tasks at hand intelligently.
On this, there is fair agreement. However, machines can be made to understand everything people understand, if modeled right.
Models are themselves instructions, and machine instructions carry zero meaning for the machines themselves. I've repeated my explanation in many places, and this LinkedIn post is one of them: https://www.linkedin.com/posts/dhsing_anthropomorphism-ai-education-activity-7198133462703169536-REho/
Uh, meaning is understanding.
I've started an email thread as you've requested. Living beings are not "just wiring". That's an extremely shallow conception.
Describing something that an artist is unfamiliar with and providing reference images is standard practice for commissioning art.
This doesn't mean that artists lack comprehension of "concepts"; it means that you have to describe things in ways that people understand.
These AIs don't actually understand anything, but that is irrelevant to why they are useful.
I wasn't talking about whether something is useful. Dumb doorknobs are useful. The point remains that they don't deal with concepts.