There's no such thing as "machine learning."
The term doesn't mean what most people think it means.

The following is excerpted from my article “Artificial Consciousness Is Impossible” written in 2021.
Machines can only be “Learning Rooms” (A “room” in terms of Searle’s Chinese Room) that appear to learn but never actually learn. Considering this, “machine learning” is a widely misunderstood and arguably oft-abused term.
AI textbooks readily admit that the “learning” in “machine learning” isn’t referring to learning in the usual sense of the word:
“For example, a database system that allows users to update data entries would fit our definition of a learning system: it improves its performance at answering database queries based on the experience gained from database updates. Rather than worry about whether this type of activity falls under the usual informal conversational meaning of the word “learning,” we will simply adopt our technical definition of the class of programs that improve through experience.”
-T. Mitchell, Machine Learning (1997), McGraw-Hill Education (1st ed.)
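To see how thin this technical definition is, here is a minimal sketch (hypothetical code, not from Mitchell's book) of a program that fully qualifies as a "learning system" under it — performance at answering queries improves with the "experience" of updates — while obviously involving nothing like understanding:

```python
class QueryStore:
    """Satisfies Mitchell's technical definition of a learning system:
    its performance at answering queries (the task) improves through
    the "experience" of data updates -- with no comprehension involved."""

    def __init__(self):
        self._data = {}

    def update(self, key, value):
        # Each update is "experience" in the technical sense only.
        self._data[key] = value

    def query(self, key):
        # Answers improve (fewer misses) purely as data accumulates.
        return self._data.get(key)


store = QueryStore()
print(store.query("capital_of_france"))  # → None (no "experience" yet)
store.update("capital_of_france", "Paris")
print(store.query("capital_of_france"))  # → Paris ("learning" has occurred)
```

By this definition, any database that answers queries better after being updated has "learned" — which is precisely the gap between the technical term and the ordinary word.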
Note how the term “experience” isn’t used in the usual sense of the word, either, because experience isn’t just data collection. The Knowledge Argument shows how the mind doesn’t merely process information about the physical world.
Possessing only physical information and doing so without comprehension, machines hack the activity of learning by engaging in ways that defy the experiential context of the activity. A good example is how a computer artificially adapts to a video game with brute force instead of learning anything.
In the case of "learning to identify pictures," machines are shown anywhere from a couple hundred thousand to millions of pictures and, through many failures of seeing "gorilla" in bundles of "not gorilla" pixels, eventually come to correctly match bunches of pixels on the screen to the term "gorilla"… except that they don't even do this well all of the time.
Needless to say, "increasing performance at identifying gorilla pixels" through machine intelligence is hardly the same thing as "learning what a gorilla is" through conscious experience. Mitigating this sledgehammer strategy involves artificially prodding the machines into trying only a smaller subset of everything instead of absolutely everything.
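The pixel-matching described above can be caricatured with a toy nearest-neighbor sketch (hypothetical data and labels, not any real vision system): the program gets measurably better at attaching the string "gorilla" to bundles of numbers as examples accumulate, while holding no concept of a gorilla at all.

```python
# Toy illustration: each "image" is a flat list of pixel intensities.
# The matcher labels a new bundle of pixels with the label of the
# closest stored bundle -- pure numeric proximity, zero comprehension.

def distance(a, b):
    # Squared Euclidean distance between two pixel lists.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(pixels, examples):
    # examples: list of (pixel_list, label) pairs seen so far.
    return min(examples, key=lambda ex: distance(pixels, ex[0]))[1]

examples = [
    ([0.9, 0.8, 0.9], "gorilla"),       # hypothetical labeled bundle
    ([0.1, 0.2, 0.1], "not gorilla"),   # hypothetical labeled bundle
]
print(classify([0.85, 0.9, 0.8], examples))  # → gorilla
```

Adding more labeled bundles raises the hit rate, which is all that "improving through experience" means here; the string "gorilla" never refers to anything for the program.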
“Learning machines” are “Learning Rooms” that only take on the appearance of learning. Machines mimic certain theoretical mechanisms of learning as well as simulate the result of learning but never replicate the experiential activity of learning. Actual learning requires connecting referents with conscious experiences.
In today's world, much of the confusion surrounding modern technologies results from ignoring precision in language: the misuse of terminology for new innovations leads directly to widespread misunderstanding.