It's true that replication is evolution's specialty, but maybe the initial construction of grey goo would require some specialized environment that doesn't naturally occur. A nuclear bomb isn't so complicated that evolution couldn't figure it out, for example, yet it didn't and won't anytime soon.
That's kinda true, but there was a natural nuclear reactor: http://en.wikipedia.org/wiki/Natural_nuclear_fission_reactor
Also, the Sun seems to work pretty well.
Of course, neither of these evolved through biological mechanisms, but they did arise in a natural environment. I find it difficult to imagine circumstances under which evolution would "figure out" a nuclear bomb - i.e., I'm agreeing with your point that evolution usually requires a series of gradual refinements rather than a big jump forward.
Maybe we would have multiple species of replicators trying to outcompete each other.
Things like this always make me think of Sam's Archive's "Geocide" page, which, by the way, points out that grey goo would not in fact bring about the destruction of the Earth: http://qntm.org/destroy
like a microorganism or device designed to impregnate every woman on the planet
or maybe people will start modifying their offspring to worship them
my personal favorite is tiny robots that interfere with the unjust use of force worldwide, causing the collapse of most governments
I suppose their memory holds both code and data?
A good reminder for businesses: if your only goal is to dominate the market, it is not a worthy goal.
>In particular, it turns out that developing manufacturing systems that use tiny, self-replicating machines would be needlessly inefficient and complicated. The simpler, more efficient, and more obviously safe approach is to make nanoscale tools and put them together in factories big enough to make what you want.
(note that he explicitly acknowledges the safety risk) and
>The popular version of the grey-goo idea seems to be that nanotechnology is dangerous because it means building tiny self-replicating robots that could accidentally run away, multiply and eat the world. But there’s no need to build anything remotely resembling a runaway replicator, which would be a pointless and difficult engineering task. I worry instead about simpler, more dangerous things that powerful groups might build deliberately - products like cheap, abundant, high-performance weapons with a billion processors in the guidance systems.
This does nothing to diminish the risk of replicators if they are, in fact, created. And there are all sorts of possible problems for which replicators would be essential. For example, we may want to release replicators into the environment to clean up certain kinds of pollution that can't be easily brought to a central facility.
Drexler thinks we underestimate the difficulty of building runaway replicators. Nature has had 4 billion years and hasn't managed it. Yes, I'm aware of the wheel argument.