A subset of the AI group at MIT, along with some non-MIT people (e.g., R. Beer, L. Steels), produced many [similar] papers during the 1980s that I really enjoyed.
Really, ALL of the autonomous mobile robot people working with subsumption were pretty fantastic.
(Also, check out the Roomba vacuum lineage)
http://people.csail.mit.edu/brooks/papers/ -- "Elephants..." is a simple one.
Edit: (adding) If anyone in the Seattle area wants to chat about this, I can talk your ear off. I'm up for coffee/beers.
"We have no idea how human brain actually works, there are billions of neurons. So instead we study brains of simpler organisms. Flatworm's brain has 52 neurons. We still have no idea how it works."
Does this still hold true, or have we moved forward since?
Stripped down to the bare essentials, the simplest photovore circuits work by setting up a voltage divider between two photoresistors positioned like eyes on either side of the robot's body. The output is fed into an inverter-based oscillation circuit biased by the side of the robot the light is coming from. This runs through some more inverters in parallel to step up the current high enough for two tiny motors with wheels. The result is a robot that waddles toward the brightest light in the room.
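For intuition, the steering behavior (not the analog inverter circuit itself) can be sketched as a Braitenberg-style kinematic toy model. Everything here is made up for illustration: the constants, the inverse-square light falloff, and the "crossed wiring" where each eye speeds up the opposite wheel so the robot turns toward the brighter side.

```python
import math

def simulate_photovore(light, steps=400, dt=0.05):
    """Toy two-eye light seeker: each photosensor speeds up the
    *opposite* wheel, steering the robot toward the brighter side.
    Returns the closest approach to the light over the run."""
    x, y, heading = 0.0, 0.0, 0.0
    base, gain, eye_angle, wheel_base = 0.5, 2.0, math.pi / 6, 0.2

    def brightness(sx, sy):
        # Roughly inverse-square falloff from a point light source.
        return 1.0 / (1.0 + (light[0] - sx) ** 2 + (light[1] - sy) ** 2)

    min_dist = math.hypot(light[0] - x, light[1] - y)
    for _ in range(steps):
        left_eye = brightness(x + 0.1 * math.cos(heading + eye_angle),
                              y + 0.1 * math.sin(heading + eye_angle))
        right_eye = brightness(x + 0.1 * math.cos(heading - eye_angle),
                               y + 0.1 * math.sin(heading - eye_angle))
        left_speed = base + gain * right_eye    # crossed wiring: the right
        right_speed = base + gain * left_eye    # eye drives the left wheel
        v = (left_speed + right_speed) / 2.0    # forward speed
        heading += (right_speed - left_speed) / wheel_base * dt  # turn rate
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        min_dist = min(min_dist, math.hypot(light[0] - x, light[1] - y))
    return min_dist

start_dist = math.hypot(4.0, -2.0)
min_dist = simulate_photovore((4.0, -2.0))  # light down and to the right
```

With uncrossed wiring (each eye driving its own wheel) the same model turns away from the light instead, which is the classic Braitenberg "fear" vs. "aggression" distinction.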
The simplicity of the circuit makes the various emergent robot behaviors that much more surprising and interesting. The frequency of the oscillations gets faster in brighter light and slower in dim conditions, almost as if the robot is searching harder for the light in a dark room. If one wheel gets stuck, the stalled motor's torque backfeeds into the control circuit, and the robot can sometimes jolt or vibrate itself unstuck. If you power the whole thing with a solar panel that charges up a capacitor for bursts of runtime, the robot really seems to come alive as it wiggles its way toward the light imperative that keeps its circuits fed. There is something magical about watching a robot struggle to stay alive.
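That capacitor-charge-then-burst pattern is the classic BEAM "solar engine," and its charge/discharge cycle is easy to model discretely. This is a toy model with invented numbers (charge rate, trigger and cutoff voltages), not the behavior of any real trigger part:

```python
# Toy model of a BEAM-style solar engine: the panel trickle-charges a
# capacitor; when the voltage crosses a trigger threshold, the charge
# dumps into the motor until the cap is depleted, then charging resumes.
# All constants are made up for illustration.
def solar_engine_bursts(charge_rate=0.02, drain_rate=0.15,
                        v_trigger=2.7, v_cutoff=1.2, steps=1000):
    volts, running, bursts = 0.0, False, 0
    for _ in range(steps):
        volts += charge_rate                      # panel always charging
        if not running and volts >= v_trigger:
            running, bursts = True, bursts + 1    # trigger fires: motor on
        if running:
            volts -= drain_rate                   # motor drains far faster
            if volts <= v_cutoff:
                running = False                   # cap depleted: wait again
    return bursts

bursts = solar_engine_bursts()  # motor runs in repeated short bursts
```

The long charge phase followed by a short frantic burst of motion is exactly what gives these robots their "struggling to stay alive" feel.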
" Opteran has successfully reverse–engineered the algorithm honeybees use for optical flow estimation (the apparent motion of objects in a scene caused by relative motion of the observer). This algorithm can do optical flow processing at 10 kHz for under a Watt, running on a small FPGA. "
I wonder if we may accidentally carry over evolutionary behaviors.