Human reasoning, as it exists today, is the result of tens of thousands of years of intuition slowly distilled down to efficient abstract concepts like "numbers", "zero", "angles", "cause", "effect", "energy", "true", "false", ...
I don't know what reasoning from scratch would look like without training on examples from other reasoning beings, the way human children learn.
Emergent tool use from multi-agent interaction is a good example - https://openai.com/index/emergent-tool-use/
In your particular case the prompt would look something like: <pubmed dump> what are the plants that aren't poisonous to most people?
A general reasoner would recover language and a relevant world model from the PubMed dump, and then proceed to reason about it to perform the task.
It doesn't look like a particularly efficient process.
In an axiomatic system, those solutions are checkable, but how discoverable are they when your search space starts from infinity? How much do you lose by disregarding the gritty reality and foam of human experience? It provides inspirational texture that helps mathematicians in the search at least.
Reality is a massive corpus of cause and effect that can be modeled mathematically. I think you're throwing the baby out with the bathwater if you expect to be able to do math in a vacuum. Maybe there is a self-optimizing spider that can crawl up the axioms and solve all of math. But I think you'll find that you can generate new math infinitely, and reality grounds it and provides the gravity to direct efforts toward things that are useful, meaningful, and interesting to us.
At the end of the day, all theory must be empirically verified, and contextually useful reasoning simply cannot develop in a vacuum.
On the contrary, when reasoning about the real world, one must reason starting from assumptions that are uncertain (at best) or even "clearly wrong but still probably useful for this particular question" (at worst). Any long and logic-heavy proof would make the results highly dubious.
Unless the brain is using physics that we don’t understand or can’t replicate, it seems that, at least theoretically, there should be a way to model what it’s doing with silicon and code.
States like inspiration and creativity seem to correlate in an interesting way with ‘temperature’, ‘top p’, and other LLM inputs. By turning up the randomness and accepting a wider range of output, you get more nonsense, but you also potentially get more novel insights and connections. Human creativity seems to work in a somewhat similar way.
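To make the analogy concrete, here is a minimal illustrative sketch (not any particular LLM's implementation) of how temperature and top-p shape sampling: raising the temperature flattens the distribution over tokens, while top-p restricts sampling to the smallest set of tokens whose cumulative probability reaches the threshold. The function name and toy logits are hypothetical.

```python
import math
import random

def sample_with_temperature_top_p(logits, temperature=1.0, top_p=1.0, rng=random):
    """Sample an index from `logits` using temperature scaling and top-p filtering."""
    # Temperature scaling: higher temperature flattens the distribution,
    # giving low-probability ("more creative") tokens a better chance.
    scaled = [l / temperature for l in logits]

    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-p (nucleus) filtering: keep the smallest set of tokens whose
    # cumulative probability reaches top_p, then sample within that set.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break

    # Sample from the renormalized kept set.
    norm = sum(probs[i] for i in kept)
    r = rng.random() * norm
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]
```

With a very low temperature the sampler almost always returns the highest-logit token (deterministic, "uninspired"); with a high temperature and top-p near 1, it wanders much more widely, which is the trade-off the comment describes.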
To your point, experience is the training. Without language/data to represent human experience and knowledge to train a model, how would you give it 'experience'?
They were pre-designed to learn what they always learn. Their minds are structured so that, as puppies, they readily make the same connections that dogs have always needed to survive.
Not for real reasoning, which by its nature, does not have a limit.
It's easy to train the same things to a degree, but it's amazing to watch different dogs individually learn and reason through things completely differently, even within a breed or even a litter.
Reasoning ability is always limited by the capacity of the thinker to frame the concepts and interactions. It's always limited by definition; we only push that limit farther than other species, and AGI may eventually push it past our abilities.
- humans experience reality at a slower pace than AI could theoretically experience a simulated reality
- humans have to transfer knowledge to the next generation every 80 years (in a manner that's very lossy), and around half of each human lifespan is spent learning things that the previous generation already knew
Reasoning could very well have originally been an emergent property of a group of beings.
The animal kingdom is full of examples of groups being more intelligent than individuals, including in human animals as of today.
It’s entirely possible that reasoning emerged as a property of a group before it emerged in any individual first.
What I wonder instead is whether reasoning is a property that is either there or not there, with a sharp boundary of existence.
Do this continually through generations until you arrive at modern society.