(Source: "The Mind Is Flat" by Nick Chater)
Physical analog chemical circuits are circuits whose physical structure directly is the network, using chemistry and physics directly for the computations. For example, a sum is represented as the number of physical ions present within a volume, not by an ALU that takes in two binary numbers, each with some large number of bits, and shifts electrons to and from buckets through a bunch of clocked logic operations.
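To make the contrast concrete, here's a toy Python sketch (my own illustration, not a model of any real hardware): the "analog" answer is a single physical quantity read out once, while the digital version pays one full adder's worth of gate operations per bit.

    # Toy contrast: an "analog" sum is one physical quantity; a digital
    # ALU performs many clocked bit-level operations to get the same answer.

    def analog_sum(ion_counts):
        # The answer *is* the total ion count in the compartment; reading
        # it out is one physical measurement, not a sequence of operations.
        return sum(ion_counts)

    def ripple_carry_add(a, b, bits=32):
        # Add two integers the way clocked digital logic does: one full
        # adder per bit position, propagating the carry along.
        result, carry, gate_ops = 0, 0, 0
        for i in range(bits):
            x, y = (a >> i) & 1, (b >> i) & 1
            s = x ^ y ^ carry                    # sum bit
            carry = (x & y) | (carry & (x ^ y))  # carry-out
            result |= s << i
            gate_ops += 5                        # rough gate count per bit
        return result, gate_ops

    print(analog_sum([1234, 5678]))      # 6912
    print(ripple_carry_add(1234, 5678))  # (6912, 160): 160 gate ops for one add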
There are a few companies working on more "direct" implementations of inference, like Etched AI [1] and IBM [2], for massive power savings.
[1] https://en.wikipedia.org/wiki/Etched_(company)
[2] https://spectrum.ieee.org/neuromorphic-computing-ibm-northpo...
My armchair take would be that watt usage probably isn't a good proxy for computational complexity in biological systems. A good piece of evidence for this comes from C. elegans research, which has found that the configuration of ions within a neuron (not just the electrical charge on the membrane) records computationally relevant information about a stimulus. There are probably many more hacks like this that allow the brain to handle enormous complexity without it showing up in our measurements of its power consumption.
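As a hypothetical toy (my own sketch, not the actual C. elegans model), here's a unit whose state is not just a fast "voltage" V but also a slow internal "calcium" variable Ca: after a stimulus, V relaxes quickly, but Ca keeps carrying stimulus information that a voltage-only readout would miss.

    # Hypothetical toy: state = fast voltage V plus slow internal Ca store.

    def simulate(stim, dt=0.1, tau_v=1.0, tau_ca=50.0):
        V, Ca, trace = 0.0, 0.0, []
        for s in stim:
            V += dt * (-V / tau_v + s)               # fast membrane dynamics
            Ca += dt * (-Ca / tau_ca + max(V, 0.0))  # slow internal store
            trace.append((V, Ca))
        return trace

    # After a brief pulse, V decays within a few time constants, but Ca
    # decays ~50x more slowly: the stimulus is still "recorded" in a state
    # variable that a voltage-only measurement would never see.
    trace = simulate([1.0] * 10 + [0.0] * 200)
    print(trace[100])  # V is near zero; Ca is still well above baseline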
Jaxley: Differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics [1]
They basically created software to simulate real neurons and ran some realistic models to replicate typical AI learning tasks:
"The model had nine different channels in the apical and basal dendrite, the soma, and the axon [39], with a total of 19 free parameters, including maximal channel conductances and dynamics of the calcium pumps."
So yeah, real neurons are a bit more complex than ReLU or sigmoid.
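For a sense of scale, here's a single-compartment Hodgkin-Huxley neuron in plain NumPy (textbook parameters, a sketch rather than Jaxley's actual API): even this minimal biophysical unit needs four coupled state variables and around ten parameters, before you layer on the nine channel types and dendritic compartments the paper describes.

    import numpy as np

    # Single-compartment Hodgkin-Huxley neuron: state = (V, m, h, n).
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3   # uF/cm^2 and mS/cm^2
    ENa, EK, EL = 50.0, -77.0, -54.4         # reversal potentials, mV

    def rates(V):
        # Voltage-dependent gating rates (standard HH formulation).
        am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
        bm = 4.0 * np.exp(-(V + 65) / 18)
        ah = 0.07 * np.exp(-(V + 65) / 20)
        bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
        an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
        bn = 0.125 * np.exp(-(V + 65) / 80)
        return am, bm, ah, bh, an, bn

    def step(V, m, h, n, I, dt=0.01):
        am, bm, ah, bh, an, bn = rates(V)
        m += dt * (am * (1 - m) - bm * m)    # Na+ activation gate
        h += dt * (ah * (1 - h) - bh * h)    # Na+ inactivation gate
        n += dt * (an * (1 - n) - bn * n)    # K+ activation gate
        I_ion = (gNa * m**3 * h * (V - ENa)
                 + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        V += dt * (I - I_ion) / C            # membrane equation
        return V, m, h, n

    V, m, h, n = -65.0, 0.05, 0.6, 0.32      # approximate resting state
    spikes, prev = 0, V
    for _ in range(50000):                    # 500 ms at dt = 0.01 ms
        V, m, h, n = step(V, m, h, n, I=10.0)
        if prev < 0.0 <= V:                   # count upward zero crossings
            spikes += 1
        prev = V
    print(spikes)  # tonic spiking under constant 10 uA/cm^2 drive

Compare that with ReLU, which is the single expression max(0, x) with no internal state at all.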
[1] https://www.biorxiv.org/content/10.1101/2024.08.21.608979v2....