today's sum-and-squash (sometimes not even squash) graph networks were just a curiosity before gpus turned them into a very successful new computational paradigm. maybe we'll see something similar with these high-element-count optical spiking graphs, even if they aren't great approximations of the real biology.
i like to think that a new analog computational substrate (or mixed analog and digital system) will be what drives the next leap in machine computation.
They are not meant to. This is not "brain simulation" or similar - which exists, but is a different matter. This context is instead about neuromorphic computing: hardware implementations of components for Artificial Neural Networks. And the results seem remarkable:
> They calculated that the synapses are capable of spike rates exceeding 10 million hertz while consuming roughly 33 attojoules of power per synaptic event (an attojoule is 10⁻¹⁸ of a joule)
The comparison with biological neuro-transmission is just indicative - for trivia, for curiosity.
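As a sanity check on those two figures, multiplying them gives the continuous power draw of one synapse firing flat out (a back-of-the-envelope estimate, not a number from the article):

```python
# Rough per-synapse power implied by the quoted figures.
spike_rate_hz = 10e6           # >10 million spikes per second
energy_per_event_j = 33e-18    # ~33 attojoules per synaptic event

power_w = spike_rate_hz * energy_per_event_j
print(power_w)  # 3.3e-10 W, i.e. ~0.33 nanowatts per synapse at full rate
```

So even a billion such synapses all firing at the maximum quoted rate would draw on the order of a third of a watt, before counting cooling and support electronics.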
--
Edit:
on the contrary, these devices aim to be, in a way, simpler than the neurons of ANNs (far from aiming to be as complex as cerebral neurons):
> By only rarely firing spikes, these devices shuffle around much less data than typical artificial neural networks and, in principle, require much less power and communication bandwidth
That is because the underlying aim is to achieve communication using a single photon, with an immediate potential practical use in ANNs.
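A toy illustration of why rare spiking shuffles around less data: a dense layer must communicate every activation every step, while an event-driven layer only communicates the indices of the neurons that fired. The rates and encodings below are assumptions for the sake of the sketch, not figures from the article:

```python
# Toy comparison: bytes moved per step by a dense ANN layer
# vs. a rarely-spiking event-driven layer of the same size.
n_neurons = 1000

# Dense: one float32 activation per neuron, every step.
dense_bytes = n_neurons * 4

# Spiking: assume only ~1% of neurons fire per step, and each
# spike is communicated as a 16-bit neuron index.
spike_prob = 0.01
spikes = int(n_neurons * spike_prob)
spiking_bytes = spikes * 2

print(dense_bytes, spiking_bytes)  # 4000 vs. 20
```

Under these assumed rates the spiking layer moves two orders of magnitude less data, which is the bandwidth/power argument the quote is making.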
This whole AI field keeps on failing because people like to overthink things. Did Michelangelo need to know molecular chemistry to make sculptures? Why do people pretend there is no artistic component to building AI? Rant finished.
Would it be the equivalent of edges communicating between each other in artificial neural networks?
The brain then sends a signal for the foot to react, but the time it takes for the foot to move will be the same as normal, since the signal takes the same time to reach the foot. I imagine at 30,000x you would notice that it's taking very, very long for the foot to move. To notice it, you would need feedback from the skin inside your socks/pants that the foot/leg is moving against them, and the brain would probably think that feedback is taking forever after it sent the signal.
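To put a rough number on that: assuming a ~1 m brain-to-foot path and a typical motor nerve conduction velocity of ~60 m/s (both round assumptions, not measured values), a sped-up mind would perceive the one-way signal delay as minutes:

```python
# Perceived nerve latency for a mind running 30,000x faster than real time.
distance_m = 1.0            # assumed brain-to-foot path length
conduction_m_per_s = 60.0   # assumed motor nerve conduction velocity

latency_s = distance_m / conduction_m_per_s    # ~0.017 s of real time
subjective_s = latency_s * 30000               # ~500 s of perceived time

print(subjective_s / 60)  # ~8.3 perceived minutes of waiting for the foot
```

So a delay that is normally imperceptible would feel like waiting the better part of ten minutes for your own foot to respond.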
Yes, evolution builds upon what went before rather than starting fresh, but nature never calls it quits. It's a process, not a thinking entity. In any stable population there will be variances that have neither a benefit nor a cost until environmental pressures force it to "select" the most appropriate. You have to look at a longer timescale to see the adaptations take hold.
You could say species dying out is calling it quits in a way, but evolution encompasses everything, not just the extinct - though I don't think that's what you meant.
Hence calling it quits: nature will do what it needs to do until things work well enough, and then calls it quits for that particular feature set until such time as what used to be good enough isn't good enough anymore.
Great, so we're just bags of meat sashaying around with loads of technical debt baggage via the slapdash coding delivered by nature in a multi-billion year project. The ongoing result of always picking good and cheap over fast.
How densely is it possible to structure them and still provide adequate cooling?