I agree, and I think Goodman & Tenenbaum [2] is a great place to see other issues that may come up on the road to AGI. LLMs are great, but they do too much at once. I think moving toward AGI requires some form of symbolic reasoning, possibly combined with LLMs, which could play the role of intuition and kitchen-sink memory.