That notion is almost certainly wrong, and it's usually stated by people who fetishize the lambda calculus but know little of its long evolution. It's an ordinary case of hubris, where people aesthetically drawn to something describe it as inevitable or even a law of nature. And I know it's wrong in part because of the following quote:
> We do not attach any character of uniqueness or absolute truth to any particular system of logic. The entities of formal logic are abstractions, invented because of their use in describing and systematizing facts of experience or observation, and their properties, determined in rough outline by this intended use, depend for their exact character on the arbitrary choice of the inventor.
This quote is from the 1932 paper A Set of Postulates for the Foundation of Logic by the American logician Alonzo Church (1903-1995), and it appears as an introduction to the invention Church first described in that paper: the (untyped) lambda calculus [1].
The simpler explanation, which has the added benefit of also being true, or at least supported by plentiful evidence, is that the lambda calculus was invented as a step in a long line of research, tradition and aesthetics, and so others exposed to it could have (and did) invent similar things.
If you're interested in the real history of the evolution of formal logic and computation (and algebra), you can find the above quote, and many others, in a 300-page anthology of (mostly) primary sources that I compiled about a year and a half ago [2]. They describe the meticulous, intentional invention of various formalisms over the centuries, as well as the aesthetic concerns that have led some to prefer one formalism over another.
[1]: Actually, in that paper, what would become the lambda calculus is presented as the proof calculus for a logic that was later shown to be inconsistent. The calculus itself was then extracted and used in Church's more famous 1936 paper, An Unsolvable Problem of Elementary Number Theory, in an almost-successful attempt to capture the essence of computation. That feat was finally achieved by Turing a few months later.