'Reasoning and causality fall out of language,' as cognitive scientist John Ball puts it. We all learn our native language, or mother tongue. Studying it, along with the world's languages, unlocks the building blocks of language.
With the help of the linguistic framework Role and Reference Grammar (RRG), we can now exploit its linking algorithm, which connects syntax with semantics, and implement it on machines.
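To make "linking syntax with semantics" a little more concrete, here is a minimal toy sketch in Python of the kind of mapping RRG's linking algorithm performs: the syntactic arguments of a clause are matched to semantic macroroles (Actor and Undergoer) drawn from a lexical logical structure. The lexicon entries, function names, and the two-argument simplification are my own illustrative assumptions; this is not John Ball's actual implementation, just a picture of the idea.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class LogicalStructure:
    # A simplified stand-in for an RRG lexical "logical structure":
    # the predicate plus its arguments, ordered from most actor-like
    # to most undergoer-like.
    predicate: str
    arg_slots: List[str]

# Tiny illustrative lexicon (hypothetical entries, not from any real system).
LEXICON: Dict[str, LogicalStructure] = {
    "sees": LogicalStructure("see'(x, y)", ["x", "y"]),
    "breaks": LogicalStructure("[do'(x)] CAUSE [BECOME broken'(y)]", ["x", "y"]),
}

def link(subject: str, verb: str, obj: str) -> Dict[str, str]:
    # Linking step, RRG-style in spirit: the leftmost argument of the
    # logical structure is assigned the Actor macrorole, the rightmost
    # the Undergoer, and the syntactic subject and object fill those slots.
    ls = LEXICON[verb]
    fillers = {"x": subject, "y": obj}
    return {
        "logical_structure": ls.predicate,
        "Actor": fillers[ls.arg_slots[0]],
        "Undergoer": fillers[ls.arg_slots[-1]],
    }

if __name__ == "__main__":
    # "The child sees the dog" -> Actor: the child, Undergoer: the dog
    print(link("the child", "sees", "the dog"))

In full RRG the assignment follows the Actor-Undergoer Hierarchy and covers intransitives, passives and much more; this sketch only shows the direction of the mapping from form to meaning.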
Ultimately, grounding needs intentionality, which is a power of a conscious mind. Shifting loads around inside a machine isn't going to achieve that. However, there will be more and more performant ways of simulating grounding through various means.
It falls out of language
=====
You memorize a whole bunch of shapes. Then, you memorize the order the shapes are supposed to go in so that if you see a bunch of shapes in a certain order, you would “answer” by picking a bunch of shapes in another prescribed order. Now, did you just learn any meaning behind any language?
=====
That sounds like you're describing sequences like beads on a string, of which today's AI is a glorified version: totally focused on form, or syntax, albeit a powerful one.
When John Ball describes his implementation of RRG linguistics for computers, it's based on the associations all humans acquire via their native language.
If he’s watching here, hopefully he’ll jump in with the scientific response rather than my conceptual one.
https://johnsball.substack.com/