Intelligent Plastic Machines

The World Within Us

The word science comes from the Latin word "scientia," meaning "knowledge". The practice of science is a search for the truth about reality. Scientific discoveries have created revolutions in our understanding of the reality of the world around us:

"... there have been within the experience of people now living at least three serious crises... There have been two such crises in physics - namely, the conceptual soul-searching connected with the discovery of relativity and the conceptual difficulties connected with discoveries in quantum theory... The third crisis was in mathematics. It was a very serious conceptual crisis, dealing with rigor and the proper way to carry out a correct mathematical proof. In view of the earlier notions of the absolute rigor of mathematics, it is surprising that such a thing could have happened, and even more surprising that it could have happened in these latter days when miracles are not supposed to take place. Yet it did happen." - John von Neumann (1963), writing about Gödel's (1931) Incompleteness Theorem.

René Descartes's concept of the mind as a spirit which communicates with the brain via the eyes.

Today, we stand on the threshold of a revolution in our understanding of the reality of the world within us; one that has been slowly creeping up over the horizon of our comprehension, as discoveries in biochemistry, botany, microbiology, medicine, psychiatry, psychology, neuroscience and zoology shine new light upon the intricate mechanisms of all life, including our own. We are ordinarily completely unaware of what is really going on inside us, for our eyes look outwards, not inwards. We experience our own thoughts and feelings, but we cannot dissect and inspect them ourselves (although we can dwell upon our imaginings by introspection and rumination). But with knives and microscopes, scientists can slice up other people's heads and bodies and look inside them. And what they have discovered is that, contrary to what René Descartes (1641) thought, all the evidence suggests that the mind is not situated outside the body in an ethereal metaphysical realm, but in it, in the real physical realm of substance. There is no mind-brain duality – what we call the mind is our conscious perception of processes occurring in the brain.

But the revelation doesn’t stop there. People, one of the newest kids on the block of evolution, have designed and manufactured some pretty amazing information-processing machines. Only fairly recently - the last 100 years or so - have we been able to use some of these machines to look deep inside the tissues of living organisms and start to really unravel the complexity of what they are made of and how they work. And only very recently - the last 10 years or so - have those investigations revealed that all living creatures are related to one another much more closely than ever before imagined.

What all living creatures have in common is what they are made of, what they do and how they do it.

What they do is import things, process them inside themselves to perform various functions, including manufacturing new things from raw materials, and export things. These things - light waves, sound waves, molecules and forces - are perturbations of the energy field of the Universe. They are information. Human cognitive processes such as sensations, perceptions, thoughts and feelings, are physical processes. They are information processes. They are biocomputations.

Anything that processes information in any kind of way is a computer. Plants, as well as animals, have senses and perform computational functions that process the inputs of their senses. Houseplant gardeners notice that their plants tend to grow towards the window, where the light comes from. This is no accident – cells in a plant's growing tip contain light sensors called phototropins, which initiate a cascade of biocomputations resulting in differential expansion of the cells lining its stem so that it takes on a curved shape. Plants can also sense the direction of gravity – stems grow up whilst roots grow down from a seed, no matter which way up the seed is.
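The logic of that biocomputation can be caricatured in a few lines of code. This is a toy sketch, not a biological model - the growth factors and function name are invented for illustration:

```python
# Toy sketch of phototropism as a computation: cells on the shaded
# flank of a stem expand more than cells on the lit flank, so the
# stem curves towards the light. The numbers are invented.

def bend_towards_light(light_side: str, growth_shaded: float = 1.2,
                       growth_lit: float = 1.0) -> str:
    """Return the direction the stem tip curves, given differential
    cell expansion on its two flanks."""
    # More expansion on one flank pushes the tip the other way.
    if growth_shaded > growth_lit:
        return light_side            # tip curves towards the light
    elif growth_shaded < growth_lit:
        return "away from " + light_side
    return "straight"

print(bend_towards_light("left"))    # the stem bends left, towards the light
```

The point is not the arithmetic but the shape of the process: sensor input in, differential growth out - a computation realised in plant tissue.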

Phototropic plants, cloned sheep, IVF babies and all forms of life, including you and me, are information processors. A contemporary version of Descartes' celebrated statement: “I think, therefore I am”, might, 369 years later, go something like this:

“I live, therefore I compute”

Basic structure of an amino acid. Proteins are strings of amino acids linked by peptide bonds

The computational mechanism of an artificial computing machine is mostly made of silicon and metals. It represents its software (program and data) in the form of digital electrical signals; its computations are performed by electronic mechanisms that operate on these signals. The computational mechanism of a natural computing machine is mostly made of organic molecules called amino acids, organic compounds that can link up to form long polymer chains called proteins. A living machine represents its software in the form of chemical and electrical signals; its computations are performed by organelles, cells and organs that operate upon these signals and can be changed by them - whereas an electronic computer can change its software, but not (as yet) its hardware, a living machine can modify the structures that perform its computations and even make new ones by computational processes we call growth and reproduction.

The physics of the bonds of proteins enables them to bend and twist into different shapes without destroying their chemical integrity. The plasticity of proteins enables muscles to pulse, skin to stretch, and tendrils, roots and neurons to grow towards light, water and their neighbours. Just as silicon doesn’t do much by itself, neither do proteins. But a complex assembly of proteins, together with a few other kinds of molecules, can do a remarkable thing: it can change its physical structure. The physical plasticity of proteins is one of the keys to the behavioural plasticity of organisms made of them: the miraculous ability of living machines to manufacture their own componentry out of raw materials to grow, reproduce and, in response to the constantly changing flux of environmental conditions, to intelligently adapt the physical structures that realise the material being of their computational functions.

Octane, a hydrocarbon found in crude oil; black spheres are carbon, white ones are hydrogen

After death, the natural polymers and other chemical components of living machines accumulate in the subsoil and, by the processes of decomposition over long periods of time, transmute into the constituents of coal and crude oil (such as octane, found in crude oil), which can be dug up and reprocessed to produce synthetic polymers.

That Which We Call a Rose

When you do a web search, you ask the search engine server farm to search its indexed database for matches. Input a key-string to it and it outputs a bunch of URLs. That's a computation.
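In miniature, that computation looks like this. The index and URLs below are invented for illustration; a real search engine uses an inverted index at vastly larger scale:

```python
# A minimal sketch of search-as-computation: a key-string goes in,
# a bunch of URLs comes out. The index contents are made up.

index = {
    "phototropism": ["https://example.org/plants", "https://example.org/light"],
    "kinesin":      ["https://example.org/motors"],
}

def search(key: str) -> list[str]:
    """Input: a key-string. Output: the matching URLs. That's a computation."""
    return index.get(key.lower(), [])

print(search("Kinesin"))   # ['https://example.org/motors']
```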

In Japan and India, the traditional way of finding a spouse is to employ a Matchmaker - a person supposedly skilled at finding compatible partners. Matchmaking is a computation, so the Matchmaker is a computer.

When someone catches your eye (or, even more likely, your vomeronasal organ is tickled by what it senses), and you fancy them, you are your own matchmaker. You are the computer. Your information input is their beauty (or pheromones) and your information process - your reaction - places you in a new state of mind. And body! The concept of computation as a state transition was first proposed by Turing (1936), although the idea of building a machine to perform computations dates back (at least) to the neolithic era. Stonehenge, for example, is a computer made of stone whose only moving parts are the sun and the moon: it accurately computes the alignments of the solstices and lunar standstills (Hawkins, 1963). When such an alignment occurs, the apparatus of Stonehenge changes state as the two heavenly orbs appear framed by the viewfinder created by the angles and elevations of the henges, when viewed from a particular spot - there is more to Stonehenge than a bunch of arches standing passively in a circle.
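The idea of computation as state transition can be sketched as a toy machine: a set of states plus a rule mapping (state, input) to the next state. This little machine is invented for illustration - it "notices" when both sun and moon have been sighted, a bit like Stonehenge framing the two orbs:

```python
# A toy finite-state machine, in the spirit of Turing's notion of
# computation as state transition. States and inputs are invented.

TRANSITIONS = {
    ("waiting", "sun"):   "sun_seen",
    ("waiting", "moon"):  "moon_seen",
    ("sun_seen", "moon"): "aligned",
    ("moon_seen", "sun"): "aligned",
}

def run(inputs):
    state = "waiting"
    for symbol in inputs:
        # Unknown (state, input) pairs leave the state unchanged.
        state = TRANSITIONS.get((state, symbol), state)
    return state

print(run(["sun", "moon"]))   # 'aligned'
```

Stonehenge's "program" is fixed in stone; only the heavenly inputs change its state.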

Computers are processors of information. Stonehenge processes the information of the trajectories of the sun and moon across the sky, across the seasons. But what, exactly, is information?

"When I use a word," Humpty Dumpty said, in rather a scornful tone, "it means just what I choose it to mean - neither more nor less." - Lewis Carroll, Through the Looking-Glass.

Half Answer : Information is something said or written that means something.

This is what most people would probably say, and in one sense it is a perfectly reasonable operational definition. But it is a purely subjective one, as whether something means something to somebody depends completely upon the recipient. For example, an encrypted message seeks to conceal its meaning from all except those who know how to decrypt it. Yet the information in the encrypted message is still there, it's just that it's indecipherable to those who don't know what computations they should do with the message to work out its meaning.

And the same principle is true of unencrypted messages in a language you don’t understand because your brain hasn’t learned to process it. For example, below is this paragraph written in Swahili (according to Google)...

“Na huyo wa kanuni ni kweli unencrypted ujumbe katika lugha huelewi sababu ubongo wako haina programu zinazohitajika mchakato yake. Kwa mfano, chini ya aya hii imeandikwa katika Kiswahili (kulingana na Google)”

...which is unintelligible to anyone who doesn’t speak Swahili. But it’s not just foreign languages that can be indecipherable to us. Suppose you say something you believe to be deep and meaningful to someone, and they reply: "That's nonsense!". To them, what you said was devoid of information because they didn’t have the same software as you in their heads. Yet to you, it was a pearl of wisdom. The information - what you said (as distinct from what you meant) - once spoken, takes upon itself an existence independent of either you or them.

So to say that information is something that means something is only half the answer - it really is something that would mean something, if only you had the wit to understand it.

Full Answer : Information is distinctive variation in a physical field - a signal that can be output, transmitted, input, and processed.

We may be tempted to think of information as something abstract - as something without a physical form, because the same information can be represented by many different forms. But information has no existence without a physical form to depict it. Whether that form is a sound wave, ink on paper, a TV signal, a smell, or a molecule of adrenaline circulating in someone’s bloodstream, it is a physical form.

Some examples of information fields are sound, light, the chemical landscape, pressure and voltage. The process of taking information in one form and producing the same information in another form is called transduction. Microphones, loudspeakers and modems are examples of engineered transducers. In describing the actions of biological cells, signal transduction is the general term used to describe the response by a cell to a stimulus received from its environment. This stimulus could be a chemical messenger, or other forms of information such as heat, salinity, etc.

The information content of a signal can only be discerned from the general hubbub of white noise in the background field if the recipient has some mechanism for noticing it (whether they understand it or not is another matter). The discernment of a signal requires it to be different from the background in a distinctive way. Think of a cyclic sound (like the sound of your own breathing) - only when it changes its rhythm or pitch or amplitude do you notice it. Samuel Morse's colleague Alfred Vail realised that message encoding could be done most efficiently if the most likely message constituents were encoded (signalled) with the shortest distinct signatures. We all know this intuitively, as in when we say "ouch!" to urgently attract someone's aid/sympathy, rather than embark upon a long discourse as to the nature of our request for help. And when we write in dark ink on light paper or vice-versa.
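Vail's principle - give the most frequent symbols the shortest signatures - is the essence of what we now call Huffman coding. Here is a minimal Python sketch; the letter frequencies are invented for illustration:

```python
# Build a Huffman code: repeatedly merge the two least frequent
# groups, prefixing one group's codes with "0" and the other's
# with "1". Frequent symbols end up with short codes.

import heapq

def huffman(freqs: dict[str, int]) -> dict[str, str]:
    # Heap entries: (total weight, tiebreaker, {symbol: code-so-far})
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged |= {s: "1" + code for s, code in c2.items()}
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# 'e' is far more common than 'z', so it gets a shorter code.
codes = huffman({"e": 120, "t": 90, "a": 80, "z": 1})
print(codes)
```

The same intuition is behind the "ouch!": the most urgent message gets the shortest encoding.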

A purposeful sender of information intends it to have a certain meaning, but the information itself has an existence independent of that intention. And information can come in forms other than words. For example, a smell. Or a picture. Or a tap on the shoulder. Or a flutter of the eyelids, a blush in the cheeks, a gleam in the eye, a tremble of the knee, a bead of perspiration on the upper lip, or a smile, a grimace, a wave, a turn of the head, a rustle in the undergrowth, a dark cloud on the horizon.

A tiger sneaking up on its prey tries hard not to transmit information about its presence, but if it steps on a twig that snaps with a sharp cracking sound, all the animals and birds within earshot will bolt for cover. At other times, that same tiger will deliberately announce its presence – not to prey but to other tigers, to lay claim to a territory - by leaving chemical messages (in its urine) all over the place. A tiger will even urinate right on top of another tiger’s urination to erase its message and leave its own in its place. Domestic cats claim ownership of their human guardians by rubbing the scent glands in their necks against the owner’s legs. People imagine that this is a sign of affection, but it’s actually their way of branding you as their property.

For organisms that have the ability to interpret it, light plays a very important role as a carrier of information about the object off which it last bounced, which enables them to see where they are and what is around them. The visual field is very complex and the processing of its information occupies a significant proportion of the human brain.

One of our most familiar forms of information media is the newspaper. Its stories inform (or misinform) us about what is happening in the world. Newspapers are made of paper, which provides a convenient monochrome background field on which information can be represented in the form of differently-coloured ink placed on the paper in such a way as to create distinctive shapes: pictures of symbols such as letters and numbers.

If a picture is worth a thousand words, then a thousand words can give you the picture. And words can be written as sequences of symbols. And symbols can be represented by little pictures. It sounds circular, but it's not: what makes symbols informational is their distinctiveness from each other. The property of distinctiveness is the key thing that separates information from noise. If someone tells you something you already know, they are not being very informative. If the sky is full of clouds, another one just like the others doesn’t mean much. But if the sky is full of fluffy white clouds, and an unusual black one appears on the horizon, it gets your attention. Because it’s information. It’s information because it’s distinctive.

Ancient Egyptians, Mesopotamians and Greeks, staring up at the heavens on a clear night, studied the twinkling lights that pepper the darkness of space. They noticed that the lights were not randomly scattered, but retained their relative positions within a great wheel that seemed to slowly revolve around the Earth, with the exception of wanderers the Greeks called planets and a particularly big one that the Mesopotamians called Sin - the one we call the moon. Because of their distinctiveness against the dark background, the ancient peoples of Earth could discern the planets and stars individually and track their progress across the sky. And so the planets, the stars and their constellations became heavenly symbols and they soon acquired anthropocentric earthly meanings in the imaginations of astrologers.

Computation = Symbol Processing

Symbols are signals that designate information values. What computers do – compute - is process symbols.

In the 17th century, a computer was a person who, with pencil and paper, worked out the arithmetic calculations needed to keep track of the financial position of a business. The title was later conferred on people who worked out other numerical calculations, such as the values to put in mathematical tables that listed the values of various functions, including the "Sight Reduction Tables" that were used by navigators for calculating their geographical position, based on their visual observation of the height of a known star and the date and time of the sighting. The original meaning of the word computer fell into disuse once the job ceased to exist when electronic machines were invented that could do numerical calculations quicker, cheaper, and more reliably, which was very important to the military - being able to reliably calculate where artillery shells would land (Weik, 1961) made the machines of war more effective.

Although reliable number-crunching was the design purpose of the early inventors of mechanical and electronic computers, we have since found that their machines can represent, transduce and process other forms of information too. All kinds - words, pictures, sounds, music, videos, speech and more. Electronic digital data-processing computers use just two symbols (digits). The English word digit comes from the Latin word digitus, meaning “thing to point out with” - from the Latin verb dicere meaning “to say, to point out” from which the English word diction is also derived. Fingers are handy for pointing, so digitus became the Roman word for finger too - and to this day, people often wag a finger when making a forceful statement. And as fingers are also handy for counting, the English word digit later became synonymous with the word numeral (which is a symbol designating a number).

Roman numerals were based on a mixture of multiples of 5 and 10 (we have 5 fingers on one hand). Our modern decimal notation, based on a single base of 10, came to Europe from the Arab world (and ultimately from India), somewhere around the 13th century. In this notation, numbers are written as denoting the sum of multiples of successive powers of 10, reading from right to left. But we don’t need a base of 10 (and 10 different symbols) in order to be able to write a string of numerals to denote a number. We could (and digital computers do) use a base of 2 (and just 2 symbols) instead. A binary digit (or bit, for short) represents one of 2 values (0 and 1). So, for example, the value represented in decimal notation as the string of symbols “23” is represented in binary notation as “10111”, which denotes 1x16 + 0x8 + 1x4 + 1x2 + 1x1.
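A quick way to check the binary arithmetic is to let a computer do it. This illustrative snippet writes 23 in binary and reads it back as a sum of powers of 2:

```python
# 23 in binary, and back again. format(n, "b") writes a number
# as a string of binary digits.

n = 23
bits = format(n, "b")
print(bits)   # '10111'

# Reading right to left, each digit weights a successive power of 2.
value = sum(int(d) * 2**i for i, d in enumerate(reversed(bits)))
print(value)  # 23
```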

When we write a decimal numeral on a piece of paper, we don’t write the symbol itself, but a picture of it - for example, the third decimal digit is commonly drawn as two curves, one on top of the other. Digital computers “draw” their own pictures of our symbols as different sequences of 16 bits using an encoding convention called Unicode. The third decimal numeral, “3”, is Unicode code point 51, whose 16-bit pattern is 0000000000110011. With 16 bits, Unicode can encode 2^16 (65,536) different pictures, enough to distinguish the symbols of virtually all the alphabets of human languages.
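Python exposes this mapping directly: ord() gives a character's Unicode code point, and format() draws that number as a 16-bit pattern. A small illustrative check:

```python
# From symbol to bits and back: the character "3" and its
# Unicode code point.

ch = "3"
point = ord(ch)                # the symbol's position in the Unicode "alphabet"
print(point)                   # 51
print(format(point, "016b"))   # '0000000000110011'

print(chr(point))              # '3' - the mapping runs both ways
```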

The only meaning a symbol has – by itself – is its uniqueness, which is informationally equivalent to its position in the alphabet, a position distinct from all the other symbols. And yet, with only this humble property of distinctiveness, and just the simplest form of structure – the one-dimensional sequence - symbols can represent all the wealth of meaning contained in a poem or a description of the laws of physics.

Living computers have their own physical symbol alphabet, one as old as life itself – about 4 billion years old. It is the alphabet of chemical signals by which biological cells manage their internal processes and communicate with each other.

Life = Computation

Artificial electronic computers are made of processors (hardware) and processes (software). The software tells the hardware how to remember and process its data. But in a natural living computer, there is no separation of hardware and software, nor is there a simple distinction between process and memory.

All living computers, from the smallest bacterium to the largest blue whale, are made of cells. Each experience causes the cell to modify itself in some way, according to its recipe of life, inherited from its ancestors and encoded in the form of DNA, contained within each and every cell. The inputs to a cell are chemical signals sent to it by other cells or (in the case of sensor cells) by the environment, and substances it eats by endocytosis; its computational processes are its metabolic pathways; and its outputs are its movements and the products it exports by exocytosis:

CellProcess(signals, food) = <products, movements>
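The formula above can be sketched as an ordinary function. Everything here - the signal names and the rules - is invented for illustration; real metabolic pathways are incomparably more intricate:

```python
# A caricature of CellProcess: signals and food in, products and
# movements out. The "pathways" below are toy rules, not biology.

def cell_process(signals: list[str], food: list[str]) -> tuple[list[str], list[str]]:
    products, movements = [], []
    for s in signals:
        if s == "adrenaline":
            movements.append("contract")   # e.g. a muscle cell responding
    for f in food:
        if f == "glucose":
            products.append("ATP")         # respiration yields energy currency
    return products, movements

print(cell_process(["adrenaline"], ["glucose"]))   # (['ATP'], ['contract'])
```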


Metabolic pathways are characterised as anabolic (building up), which synthesise large molecules (peptides, proteins, polysaccharides, lipids, and nucleic acids) from small ones (such as amino acids), consuming energy in the process; and catabolic (breaking down), which decompose large molecules into small ones, transferring the energy of their chemical bonds to the universal energy-carrier of life, adenosine triphosphate (ATP), which anabolic pathways use in their reactions.

Power stations convert the energy in coal to electricity, which drives the computations of electronic computers. Living cells have their own built-in power stations to provide energy for their computations. Cell respiration is a cascade of metabolic reactions (glycolysis, the Krebs cycle and the electron transport chain) that transfer the energy locked in the chemical bonds of glucose into the more immediately utilisable ones of ATP.

Cell Respiration

When a cell needs energy, it breaks the bond between the second and third phosphate groups of ATP to release its energy, forming adenosine diphosphate (ADP) and a free phosphate group. When a cell has excess energy, it stores it by forming ATP from ADP and free phosphate. In this way, ATP is a kind of portable battery, able to transfer energy between spatially-separate metabolic reactions. For this reason, it is often called life’s energy currency. Originally, the energy comes from the sun. Plants capture it during photosynthesis and, with phosphorus absorbed from the soil, convert it to chemical energy in the form of ATP. Using this energy, plants produce carbohydrates, fats and proteins which are eaten by animals, which digest the plant’s fabric and use its constituent compounds to make their own ATP. When the animal dies, its phosphorus eventually goes back into the soil and the cycle of life continues.

Entropy is a measure of the disorder (unpredictability) of a system. A jumble of bricks has more entropy than those same bricks organised into a physical structure like a house. And a random collection of bits has more entropy than one organised into a computer program. “Entropy increases” says the second law of thermodynamics. Pour cold milk into hot coffee in a warm room and the heat energy of the coffee will flow into the milk until it has all more or less reached the same temperature. The thermal information of the different temperatures of the milk and coffee has evaporated – the entropy of the system inside the cup has increased.
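Shannon made this notion of unpredictability precise: the entropy of a message is the sum of p x log2(1/p) over its symbol probabilities, measured in bits. A small Python sketch (the strings are toy examples) shows that a jumbled string carries more entropy than a perfectly repetitive one:

```python
# Shannon entropy of a string, in bits per symbol:
# H = sum over symbols of p * log2(1/p).

from collections import Counter
from math import log2

def entropy(s: str) -> float:
    counts = Counter(s)
    n = len(s)
    return sum((c / n) * log2(n / c) for c in counts.values())

ordered = "aaaaaaaa"      # perfectly predictable
jumbled = "abcdefgh"      # every symbol different

print(entropy(ordered))   # 0.0
print(entropy(jumbled))   # 3.0  (8 equally likely symbols = 3 bits each)
```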

But if you drink it, something remarkable happens: its entropy decreases (Schrödinger, 1944) - its atoms stop bouncing around in a Brownian motion way and become more organised (until you die, that is). As they enter your digestive system, the atoms of milky coffee become data to the information processors of the stomach, intestine, bloodstream and from there to all cells of your body and mind, where they become building blocks for metabolic processes that create cell structures, movements, and chemical signals.

For example, the atoms that made up the cholesterol in milk can wind up becoming part of a steroid hormone manufactured by your adrenal gland; far from having a Brownian motion in a cup-sized sea, the steroid flows purposefully through the bloodstream looking for cells to talk to through their adrenoceptors. This is as it should be; you were designed by nature to process and repackage the goodness in milk, even if nature had your mother’s milk in mind rather than Daisy the cow’s, which has about half as much cholesterol as human milk.

However, you aren’t programmed to process the data of caffeine, which sneaks across your blood-brain barrier and binds to the adenosine receptors of your neurons, which can’t tell the difference between adenosine and caffeine. But caffeine doesn’t activate the receptor the way adenosine does; it simply antagonises (blocks) the receptor’s normal function, which is to tell its neuron to slow down. So the neuron doesn’t slow down. With this happening all over the place, the overall effect is that your neurons - and you - become more active than usual. Even though this may make your behaviour more chaotic than usual, the entropy of the caffeine molecules themselves has decreased because they have become part of your organised system.

The processes of life reduce the entropy of the atoms of their raw materials by organising them into cells and systems of cells. Each cell is like a little factory, taking in raw materials and putting out manufactured products. But it’s not a factory made of inert materials rooted to the spot on an industrial estate – it can move. Plants may not have wings or legs, and are rooted in the ground, but they are anything but stationary. Their roots are constantly on the move, seeking out sources of water. Their stems climb upwards, seeking the light; their branches spread their leaves in the sun, breathing in carbon dioxide and breathing out oxygen.

Fish don’t have wings or legs, but they can swim circles around a hawk or cheetah, simply by flexing their muscles. Muscle cells expand and contract, but they are not the only kind of cells that move – all cells, even bone cells and teeth cells, pulse with the vibrancy of life. Inside every cell are microtubules, miniature monorails along which motorised transporter proteins called kinesins truck cargoes of macromolecules manufactured within the cell by its ribosomes. A kinesin has two heads connected via a short, flexible neck to a long tail. Its cargo binds to its tail while its twin heads alternately bind to the molecules along the microtubule in a kind of stepping motion. Power for the kinesin’s work is provided by the ubiquitous energy-carrying molecule of life, ATP.

Common Sense

The Earth's garden of plenty is full of odours to delight or appall our senses. Smell is one of our most elemental and immediate senses, capable of provoking the most intense feeling of intimacy, contentment and pleasure, or the strongest reaction of disgust and abhorrence. The processes that determine our reactions to different smells are the result of complex computations performed by the cells of the brain and body.

Odorant molecules are the symbols of the alphabet of smells. The olfactory epithelium has millions of tiny hair-like receptor cells packed into an area about the size of a postage stamp. The membrane of the tip of a receptor cell is studded with receptor molecules of several different kinds. Each receptor molecule has a particular physical and chemical structure that enables it to bind with (recognise) only one kind of odorant molecule (Peterlin et al, 2008). A binding event causes the receptor molecule to change its shape inside the receptor cell, which initiates a series of chemical changes (called a signalling pathway) inside the cell, giving rise to an electrical signal that is sent to the olfactory bulb, which integrates signals from many receptors and passes on the information it has computed to the brain. Axel and Buck (1991) found that mice have about 1000 different types of olfactory receptor; humans have only about 400, but we can differentiate about 10,000 odorant combinations that we recognise as distinct smells, such as the aroma of coffee, which is a combination of about 19 of its 800 different aromatic compounds (ref).

But there is more to smell than meets the nose: information in the form of chemical signals is the fundamental mechanism by which all cells (including neurons) communicate with each other. Every cell in every body, from the smallest bacterium to the largest blue whale, is an autonomous agent that reads and processes chemical messages passed to it. The human body is made of 100 trillion cells of 210 different types, but all of them – skin, bone, muscle, heart, stomach, kidneys, intestines and all the rest, including the neurons of the brain and nervous system - communicate with each other by the same basic mechanism of sending and receiving chemical signals.

The origin of life and its early evolution is still uncertain, but all known lifeforms are made of cells constructed of the same basic structural and functional chemical building blocks: carbohydrates (such as glucose), lipids (such as cholesterol), proteins, and nucleic acids. There are two main types of cell: prokaryotes (Archaea and Bacteria) and eukaryotes (plants and animals). Prokaryote cells are much smaller (and much less complicated) than eukaryote ones. Prokaryotes are ancient lineages, some of which (called extremophiles) can still be found living in places like the hot springs of Yellowstone Park and inside thermal vents miles under the sea, as well as many others living in less harsh environments.

Lifeform sizes

The pace of progress in computer hardware miniaturisation is truly remarkable - what just a few decades ago would have required an entire room full of electronics can now be packed into a tiny chip smaller than the nail on your little finger. The feature size (the width of an electrically conducting or insulating path) of the latest microprocessor is about 0.045μm – several hundred transistors made from this technology could fit on a human red blood cell about 8 μm in diameter (Intel, 2009). The microelectronics industry is undoubtedly a modern Achilles, flying across the sea of progress on the winged heels of technology, whereas Nature is only the lumbering tortoise of evolution, but she did have a 4 billion year head start on us...

...the nucleoid of a bacterial cell contains its genome, a multiply-coiled string of nearly 2 million nucleotides, of which there are just 4 kinds: Adenine, Cytosine, Thymine and Guanine. Thus the genome is like a book written in a language having just 4 symbols {A,C,T,G}. Because there are 4 symbols, each symbol carries two bits of information (because there are four combinations of two bits). The latest technology electronic flash memory is capable of storing 3 bits per floating gate transistor, so it would require about 1.33 million such transistors to store 4 million bits – which would require a space the size of over 5,000 red blood cells (assuming we could fit 250 transistors on one of them). But Nature fits that same information into a bacterium which is about 1/100 the size of a red blood cell – so her information storage is some 500,000 times more compact than man’s.
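The back-of-envelope comparison can be checked as arithmetic. All the numbers below are the rough assumptions from the text (genome length, bits per nucleotide, bits per transistor, transistors per red blood cell, relative bacterium size), not measurements:

```python
# Nature vs flash memory: a rough density comparison, with all
# quantities in red-blood-cell-sized units.

genome_nucleotides = 2_000_000
genome_bits = genome_nucleotides * 2           # 2 bits per nucleotide
transistors = genome_bits / 3                  # 3 bits per flash transistor
rbc_footprints = transistors / 250             # 250 transistors per red blood cell
bacterium_size = 1 / 100                       # a bacterium vs a red blood cell

advantage = rbc_footprints / bacterium_size    # Nature's storage-density edge
print(genome_bits)                             # 4000000
print(round(advantage))                        # 533333
```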

Animal cells employ a host of chemical signals, each having its own special meaning in terms of the effect it has. The terminology of chemical signals is rather inconsistent, being derived sometimes from their functional roles and sometimes from their chemical structures: those of neuron synapses are called neurotransmitters, those of the endocrine system are called peptide hormones and steroids, paracrine signals are called growth factors and clotting factors and those of the immune system are called eicosanoids (prostaglandins and leukotrienes) and cytokines (interleukins, interferons and growth factors).

Major endocrine glands: 1. Pineal gland 2. Pituitary gland 3. Thyroid gland 4. Thymus 5. Adrenal gland 6. Pancreas 7. Ovary 8. Testes

Endocrine hormones are broadcast throughout an animal via the highway of the body – the bloodstream – which functions as a medium for the transportation and distribution of chemical signals as well as the raw materials (such as oxygen and nutrients) and outputs (manufactured products and waste products) of cell processes. There are many different hormones, but each one targets only cells that can read its message. Peptide hormone messages are read by receptors embedded in the cell membrane; the smaller steroids, being lipid-soluble, diffuse through the cell membrane and are read by receptors in the cytoplasm or nuclear membrane.

The heart, lungs, kidneys, liver, thymus, skin, placenta and gonads release hormones, as do several glands specialised for hormone production:

The pineal gland, located in the middle of the brain, secretes melatonin, a hormone that regulates the wake-sleep cycle.

The hypothalamus, located in the lower central part of the brain, is the primary link between the endocrine and nervous systems. Factors such as emotions and environmental conditions such as temperature and light influence the hypothalamus to produce chemical messages that regulate the activity of the pituitary gland.

About the size of a pea, the pituitary gland is often called the "master gland" because it makes hormones that control other glands: growth hormone stimulates the growth of bone and other body tissues and plays a role in the body's handling of nutrients and minerals; vasopressin, also called antidiuretic hormone, regulates body water balance through its effect on the kidneys and urine output; thyrotropin, or thyroid stimulating hormone (TSH), stimulates the thyroid gland to produce thyroid hormones; adrenocorticotropin hormone (ACTH) stimulates the adrenal gland; endorphins act on the nervous system to reduce sensitivity to pain; prolactin activates milk production in breastfeeding women and has many other functions related to reproduction (Bancroft, 2005); oxytocin has many effects associated with mating, childbirth (it triggers the contractions of the uterus that occur during labor), child-rearing, and pair-bonding.

Follicles in the thyroid produce thyroglobulin, a storage form of thyroid hormone. TSH signals from the pituitary gland cause conversion of thyroglobulin into thyroxine and triiodothyronine, which control the rate at which cells burn fuels from food to produce energy. Calcitonin is also secreted in the thyroid, by its large parafollicular cells; it plays a role in the regulation of calcium, which is used in bone and teeth formation, nerve functioning, muscle contraction, and blood clotting. Attached to the thyroid are the parathyroids - four tiny glands that function together and release parathyroid hormone, which, together with calcitonin, regulates the level of calcium in the blood.

The adrenal glands, one on top of each kidney, have two parts. The outer part, called the adrenal cortex, takes its instructions from the pituitary hormone ACTH and produces three kinds of corticosteroid: mineralocorticoids, glucocorticoids, and androgens. The mineralocorticoid aldosterone regulates electrolyte balance of sodium and potassium ions in the body. It does this by influencing the kidney; when aldosterone is at a high concentration in the blood, the kidney retains more sodium and loses more potassium than normal. Glucocorticoids such as cortisol produce a long-term, slow response to stress by raising blood glucose levels through the breakdown of fats and proteins; they also suppress the immune response and inhibit the inflammatory response. The androgens dehydroepiandrosterone and androstenedione are converted to testosterone by hair follicles, the sebaceous glands, the prostate, and the external genitalia and are involved in estrogen synthesis in adipose tissue.

The inner adrenal gland, the adrenal medulla, produces catecholamines, such as adrenaline (called epinephrine in the USA) and noradrenaline, which underlie the fight-or-flight response: increasing heart rate, triggering the release of glucose from energy stores, increasing blood flow to skeletal muscle, dilating the pupils and the air passages in the lungs, and narrowing blood vessels in non-essential organs. Glucose, which provides the energy needed for cell metabolism, is manufactured by the liver when so commanded by an adrenaline signal. Adrenoceptors are present in the muscle within the walls of blood vessels of the extremities; noradrenaline causes this muscle to contract, narrowing the blood vessels in the extremities and redirecting blood to essential organs such as the heart and brain. It also produces greater resistance for the heart to beat against, which increases blood pressure. With dilated blood vessels and air passages, the body is able to pass more blood to the muscles and get more oxygen into the lungs, increasing physical performance for short bursts of time to cope with dangerous and unexpected situations.

After a meal, blood glucose levels rise, prompting the pancreas to release insulin, which causes cells to take up glucose, and liver and skeletal muscle cells to form the carbohydrate glycogen. As glucose levels in the blood fall, insulin production is inhibited and glucagon is released, which causes the breakdown of glycogen into glucose, which in turn is released into the blood to maintain glucose levels within a homeostatic range.
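This insulin/glucagon loop is a classic negative feedback system. A toy simulation (with made-up rate constants, not physiological values) shows how it pulls blood glucose back towards a setpoint:

```python
# Toy negative-feedback model of blood glucose regulation.
# All rates are illustrative assumptions, not physiological data.
SETPOINT = 90.0   # target blood glucose (mg/dL)
K_INSULIN = 0.2   # assumed fractional correction per step when glucose is high
K_GLUCAGON = 0.2  # assumed fractional correction per step when glucose is low

def step(glucose):
    error = glucose - SETPOINT
    if error > 0:   # pancreas releases insulin: cells take up glucose
        return glucose - K_INSULIN * error
    else:           # pancreas releases glucagon: liver breaks glycogen into glucose
        return glucose - K_GLUCAGON * error

glucose = 180.0  # just after a meal
for _ in range(30):
    glucose = step(glucose)
print(round(glucose, 1))  # settles close to the 90 mg/dL setpoint
```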

Male gonads, the testes, secrete androgens, including testosterone, which regulates the production of sperm and drives secondary characteristics such as deepening of the voice, growth of facial and pubic hair, and muscle growth. Female gonads, the ovaries, secrete estrogen and progesterone, which are involved in pregnancy and the regulation of the menstrual cycle and drive secondary characteristics such as breast growth, the accumulation of body fat around the hips and thighs, and the growth spurt that occurs during puberty.

Hormone levels are managed by negative feedback systems. For example, the pituitary gland senses the level of thyroid hormone in the bloodstream and adjusts its release of thyrotropin, the pituitary hormone that stimulates the thyroid gland to produce thyroid hormones. When the blood calcium level rises, the parathyroid glands sense the change and decrease their secretion of parathyroid hormone.

The hypothalamic-pituitary-adrenal axis (HPA) has a well-characterized circadian pattern that is under the control of the suprachiasmatic nuclei (SCN) of the hypothalamus, whose activation, in turn, is regulated by light. The periventricular hypothalamic network plays a role in coordinating neuroendocrine, autonomic and behavioural responses to circadian, immune, and psychogenic stimuli. HPA feedback loops occur at different time domains, referred to as slow (in response to chronic exposure to glucocorticoids), intermediate and fast feedback (both in response to stress and to circadian events). The control of HPA circadian rhythmicity depends on hypothalamic–pituitary activity driven by corticotropin-releasing hormone (CRH) and arginine vasopressin, ACTH, and adrenal responsiveness to ACTH. This activity is ultimately driven by light (at the SCN) and by food (at the ventromedial hypothalamus, which regulates the expression of the CRH gene). Stressful stimuli, either physical or psychological, can disrupt HPA axis homeostasis. Such stimuli can originate from extra-hypothalamic sites, such as the catecholaminergic cell groups throughout the brainstem, the spinohypothalamic-spinothalamic-spinoreticulothalamic pain pathways, pro-inflammatory cytokines originating in the immune system, and psychogenic inputs from the medial prefrontal cortex and from the hippocampus. (Paz-Filho et al, 2003).

Gonadotropin-releasing hormone (GnRH) triggers a cascade of hormones that prime the body for sex and procreation. Humans, as well as birds, mice and sheep, release gonadotropin-inhibitory hormone (GnIH), which puts a brake on the cascade. GnIH receptors are found on GnRH neurons in the hypothalamus, in the pituitary gland, and in the gonads. Its overall effect is to inhibit reproduction, but at different levels of the reproductive axis. Stress affects levels of GnIH, explaining the drop in fertility attributable to stress. Because reproductive hormones often promote the growth of cancer cells, GnIH might have therapeutic value as an anti-cancer agent. Treatment of hormone-responsive cancers involves GnRH antagonists or very high doses of GnRH, which cause side effects; GnIH therapy may be more effective. The human GnIH gene produces a precursor protein comprising mature peptides of 12 and 8 amino acids. One of the peptides in sheep has the same amino acid sequence as the human peptide; the human hormone has been shown to inhibit the release of gonadotropin in sheep (Ubuka et al, 2009).

Reading the Message - Signal Transduction

When a messenger molecule arrives at a cell, it triggers a cascade of biocomputations inside the cell. The cytoplasm of a cell is enclosed within a membrane of phospholipid, a kind of liquid crystal waterproof plastic, which contains and protects the life machinery of the cell. The membrane is impermeable to most water-soluble molecules, but is studded with receptor proteins that can bind to particular messenger molecules. We can think of a messenger molecule as a key and the receptor as a lock. If the key fits the lock, the message is received. This is called a binding event because the molecules concerned temporarily bind together - for this reason, messenger molecules are called ligands. Later on, a feedback system within the cell undoes the binding and de-activates the messenger, freeing up the receptor to receive new messages in the future. The deactivated messenger is recycled by other processes.

A binding event between a messenger molecule and a receptor protein causes a conformational change in the receptor, which ripples through to the part of it exposed inside the cell. Its new shape allows it to bind with another protein (called a transducer) inside the cell cytoplasm. The process is called signal transduction because the information represented by the messenger on the outside is transduced into information represented by the changed shape of the receptor inside the cell.

For example, adrenaline is a hormone secreted into the bloodstream by the adrenal gland when the brain decides urgent action is required. When it arrives at a cell capable of reading its message, the following signal transduction takes place (, 2004):

The message, a molecule of adrenaline (1), binds to its specific receptor (2). Thereby, the transducer, a G protein (3), is activated. This in turn stimulates the amplifier adenylate cyclase (4) to produce a second messenger, cyclic AMP (5), with the help of ATP (6). This provokes a cascade of enzymatic reactions (7) which include the phosphorylation of glycogen (8), which is transformed to glucose (9), which the cell uses to generate ATP (10). Glycogen phosphorylation can also alter the proteins of gated ion channels (11). Intracellular ion concentrations (mostly potassium and calcium ions) affect what cells do. Muscle cells, for example, require a certain concentration of calcium ions in the cell's cytoplasm before contracting. Ion concentration also regulates exocytosis (the export of material out of a cell) of waste products and signalling molecules. Each ion channel conducts a specific species of ion, such as sodium or potassium. In free space, electrical forces drive charges from high energy pockets to low ones, just as variations in atmospheric pressure create winds of air flowing from high to low pressure areas. The cell membrane and the gating of ion channels manage this natural tendency, like the steel jacket and governor of a steam engine control the pressure inside its boiler.
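One reason such a cascade is useful is amplification: each stage turns one activated molecule into many. A sketch of the idea, with purely illustrative fan-out numbers (not measured values):

```python
# Sketch of amplification in a second-messenger cascade like the
# adrenaline pathway above. Fan-out factors are illustrative assumptions.
STAGES = [
    ("adrenaline -> active G proteins", 10),
    ("G protein -> cyclic AMP (via adenylate cyclase)", 100),
    ("cyclic AMP -> active enzymes in the kinase cascade", 10),
    ("enzyme -> glucose released from glycogen", 1000),
]

molecules = 1  # one adrenaline molecule binding one receptor
for label, fan_out in STAGES:
    molecules *= fan_out
    print(f"{label}: {molecules:,}")
```

With these assumed factors, a single messenger molecule ends up liberating millions of glucose molecules, which is why hormone concentrations in the blood can be vanishingly small yet still effective.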

Large molecule messengers such as adrenaline bind to receptors in the cell membrane, but small ones such as steroid hormones slip through the membrane and bind to receptors within the cell cytoplasm or embedded in the membrane of its nucleus. There are five kinds of human steroid: progesterone regulates events during pregnancy; glucocorticoids suppress inflammation reactions and regulate sugar metabolism; mineralocorticoids regulate mineral balance; androgens promote male sex development and maintain male sex characteristics; estrogens promote female sex development and maintain female sex characteristics.

In plants, auxin is a hormone (an indole derivative rather than a steroid) that regulates longitudinal cell growth so as to allow bending of the stalk or stem in the phototropic response. Most steroid hormones are neither basic nor acidic, with the exception of estradiol, which is slightly acidic due to a phenol component.

All cellular functions involve, directly or indirectly, the genetic information contained in the nucleotides (the building blocks of nucleic acids) of a cell's genome, contained inside its nucleus.

When messenger molecules bind with receptors in the cell or its membrane, they cause cascades of anabolic and catabolic biocomputations which perform cell functions, such as the manufacture of protein products from the raw materials of amino acids according to the recipe of the cell’s DNA.

Plant cells have strong walls, but they are not shut off from the rest of the world. The wall is composed of layers of cellulose microfibres embedded in a matrix of pectin and hemicellulose. The matrix is porous, which allows nutrients and chemical signals made of small molecules to seep through.

The growing tip of a plant has light sensitive receptors which, when activated, induce (Yamamura and Hasegawa, 2001) the release of substances on the side of the plant towards the light that antagonize (inhibit the action of) auxin, a plant hormone. Auxin's normal function is to activate a proton pump in the cell membrane that expels H+ ions into the cell wall, which in turn activates enzymes called expansins that break bonds in the cell wall structure, making the walls less rigid and allowing the cell to swell from its internal turgor pressure. Inhibiting auxin function in the cells on the lit side of the stem means they do not swell, whereupon the different amounts of swelling on each side of the stem cause it to curve towards the light.
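The geometry of the resulting bend can be sketched with a toy model in which the shaded side of a stem segment elongates more than the lit side; all numbers here are illustrative assumptions, not measurements:

```python
import math

# Toy model of phototropic bending: the shaded side elongates more than
# the lit side (where auxin action is inhibited), so the stem curves
# towards the light. Dimensions and growth factors are illustrative.
STEM_WIDTH = 1.0        # distance between the two cell files (arbitrary units)
SEGMENT_LENGTH = 10.0   # initial length of the growing segment

shaded_growth = 1.05    # assumed relative elongation on the shaded side
lit_growth = 1.00       # assumed relative elongation on the lit side

# Differential elongation across a fixed width turns the segment into an
# arc; the bend angle (radians) is the length difference over the width.
bend_radians = SEGMENT_LENGTH * (shaded_growth - lit_growth) / STEM_WIDTH
print(f"bend towards light: {math.degrees(bend_radians):.0f} degrees")
```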

Some leaves have a specialized organ called a pulvinus at their base, made of thick-walled water-conducting vascular tissue surrounded by thin-walled motor cells which can undergo visible swelling and shrinking. In an analogy with the muscles that move animal joints, the motor cells on the lower side of the pulvinus are called flexor cells and those on the upper side extensor cells. Upon a signal from phytochrome, a light-sensing pigment, extensor cells open potassium ion channels, which increases the cell's turgor pressure, causing it to expand. Upon darkness, K+ channels in the extensor cells close but open in the flexor cells, which lose turgor pressure and shrink; the pulvinus joint loses its rigidity and lets the leaf droop.

Stomatal pores in the epidermis of aerial parts of plants facilitate gas exchange between plants and the atmosphere. Stomatal pores are surrounded by pairs of guard cells that mediate stomatal pore opening and closing. Guard cells respond to diverse stimuli, including blue light, CO2 concentrations, drought, pathogen attack, and plant hormones, including abscisic acid (ABA), mediated by ion transport across the plasma membrane and vacuolar membrane of guard cells and by organic solute content changes (Mori et al, 2006).

Now where did all this marvellous computational machinery come from?

The term used to describe the sensible behaviour of motile bacteria navigating in a field of molecules is chemotaxis: it expresses the observation that bacteria move toward concentrations of molecules they can use, and away from those that are toxic to them. Bacteria move by means of flagella, which whip around in circles by means of the motor apparatus shown here, which is one of very few biological examples of a wheel motor.

The multiple flagella of E. coli are helices coiled together; when they all rotate anticlockwise, they act as little turning screws that move the fluid they live in, in just the way that Archimedes rediscovered a couple of billion years later, when looking for a way to lift water from a well. As the fluid moves across its body, so the bacterium is propelled forwards, like a helicopter or a screw-propelled ship. Just as Leonardo realised some seventeen hundred years after Archimedes, and John Ericsson (who patented the ship propeller) three hundred years after him. Who says bacteria aren't intelligent!

Whereas anticlockwise rotation of its flagella drives the bacterium forwards, their clockwise rotation creates a turbulence that causes the bacterium to tumble over and point in a new direction. With just these two kinds of motion – propelling itself straight ahead or tumbling over – flagellate bacteria navigate their way in the world.

We can think of E. coli as a tiny robot programmed by instructions in the form of the functional properties of its inner assemblies and the proteins inside it, as shown in the flow diagram of its processes, which shows four different transmembrane receptors working together.

Each receptor can respond to a molecule of a particular range of chemicals from its environment, by becoming bound to it. When it does so, the receptor protein changes in such a way that the part of it on the inside of the bacterium changes the activity of CheA proteins docked there, waiting for their chance to do something.

The signal transduction chain results in two end products: (1) a signal (in the form of a phosphorylated CheY protein) is sent to the flagellum motor complex, where it can bind to a protein called FliM which affects the rotation of a flagellum, and (2) a feedback signal in which the proteins CheR and CheB adjust the methylation of the transmembrane receptors, antagonising their response to bound ligands. The feedback only lasts a few seconds - in the long run, this combines with the two possible modes of flagellar rotation to have the overall effect of causing the bacterium to climb concentration gradients of things it likes and drift away from those it doesn't.
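This gradient-climbing from nothing but runs and tumbles can be illustrated with a toy one-dimensional simulation; the tumble probabilities and attractant profile are illustrative assumptions, not measured values:

```python
import random

random.seed(1)

def concentration(x):
    """Illustrative attractant field, peaking at x = 50."""
    return -abs(x - 50.0)

x, direction = 0.0, 1
previous = concentration(x)
for _ in range(2000):
    x += direction                 # "run": flagella rotate anticlockwise
    current = concentration(x)
    # Crude short-term memory: tumble rarely when the concentration is
    # improving, often when it is getting worse.
    p_tumble = 0.05 if current > previous else 0.5
    if random.random() < p_tumble:
        direction = random.choice([-1, 1])   # "tumble": pick a new heading
    previous = current

print(round(x))  # the biased random walk ends up near the peak
```

Neither running nor tumbling is aimed at anything; the bias in *when* to tumble is enough to produce apparently purposeful navigation.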

Bacteria often live in colonies and communicate with each other by sending and sensing small signalling molecules called autoinducers which identify them to their neighbours - each species has its own signature. In addition, there is one special autoinducer that all bacteria can produce and sense. In a process called quorum sensing (Waters and Bassler, 2005), bacteria adapt their behaviour according to the numbers they sense of their own and other species in their neighbourhood. As each individual in a species group has the same decision-making machinery, the entire group tends to behave in the same way. This gives them the survival advantage of strength in numbers in competitive situations, such as when they are under attack from an animal’s immune system.
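The essence of quorum sensing is a population-dependent switch. A minimal sketch, with an assumed secretion rate and threshold:

```python
# Toy quorum-sensing model: each bacterium secretes autoinducer, and the
# group behaviour switches on only once the sensed concentration passes
# a threshold. All numbers are illustrative assumptions.
AUTOINDUCER_PER_CELL = 1.0
THRESHOLD = 100.0  # assumed quorum threshold

def quorum_reached(population):
    concentration = population * AUTOINDUCER_PER_CELL
    return concentration >= THRESHOLD

for population in (10, 50, 100, 500):
    print(population, quorum_reached(population))
```

Because every cell runs the same decision rule against the same shared signal, the whole colony flips its behaviour together, which is exactly the strength-in-numbers effect described above.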

Although bacterial signal transduction is a different set of reactions from those of eukaryotic cells, there is evidence (Marlovits et al, 2002) that a bacterial protein called FeoB may be a primordial archetype of the G proteins. It took a long time - over 3 billion years - for evolution to get from stromatolites to Lucy, the earliest known hominid (Johanson & Edey, 1981), and just another thousandth of that time to get from her to us, but it would seem that the bacteria and we can thank our common ancestor for giving our cells their sense of smell, so they can work together, so we can exist.

Seeing is Believing

Many living things engage in mutualistic relationships, in which each party benefits in some way from the interaction. Such relationships involve communication, so that each side knows who and where the other one is. For example, flowers look pretty to us, but that is not their purpose in life - they are there to attract insects like bees.

Plants are mostly made of stems and leaves. Stems are usually dull-coloured and leaves are usually green. But flowers are strikingly painted in all the bright colours of the rainbow. Among marine life, bright colours usually signal toxicity. This deters predators. An aversion to eating brightly-coloured things seems to have carried across from the marine world (where life probably began) into the world of land animals. There are very few herbivores that eat brightly-coloured flowers - just as well, for if herbivores saw flowers as food, plant reproduction would be devastated and the entire food web of the ecosystem would fall apart.

Flowers advertise themselves using special ultraviolet colours that bees can clearly differentiate from the surrounding vegetation. As we are not bees, we don't know exactly what a flower looks like to a bee - but we can imagine what it might look like by frequency shifting the image that is reflected by a flower when it is bathed only in ultraviolet light. Bees are programmed by their instincts to associate glowing ultraviolet colours with the presence of nectar; the colours advertise the promise of nectar that bees need for their energetic lives.

Plant nectar is usually at the bottom of a deep well. To get to it, the bee has to crawl past the flower's stigma (the entrance to its ovaries) and anthers. The bee doesn't manage to get away until its hairy body has brushed against the flower's anthers, so that their sticky sacs of pollen get stuck to it, ready for transport to another flower later on when the bee feels hungry again.

Once the anthers have been emptied, the stigma opens so that any visitors carrying pollen from another flower will have it scraped off into the stigma, down the style and into the ovaries. By keeping the stigma closed until the anthers have been emptied, the flower avoids self-pollination, so that its offspring will reap the evolutionary benefits of a mixed marriage (over many generations, inbred lines tend to develop disadvantageous characteristics).

Flowers are computers - the process by which a flower recognises that its anthers have been emptied and decides only then to open its stigma is a computation. And so are bees - a bee has to process the information advertised by the plant to know where to go for that lovely nectar. Pooh Bear finds the honey produced by bees from nectar by its smell; as bees have olfactory receptors on their antennae, it is very likely that they can recognise the heady perfumes of flowers as well as identify their ultraviolet displays.

Some flowers have a more seductive – and deceitful - method of advertising themselves to insects. The Hammer Orchid of Australia tricks male Thynnid wasps into visiting it by mimicking the shape and smell of a female wasp. As a male wasp attempts to mate with what it thinks is a female, a plastic hinge on the orchid causes the wasp to sway back and forth, brushing against the orchid's sticky pollen sacs, which stick to its back. When it later performs the same act with another Hammer Orchid, the sacs will be detached and the flower fertilised.

Trickery for the purpose of exploitation is common within the kingdoms of biota, and often the false enticement is sexual allure - there is more to light than meets the gleam in the eye.

But you don't need an eye like ours in order to see what is going on. The brittlestar Ophiocoma wendtii is covered with hundreds of hemispherical calcite bumps on its upper body. Each bump acts as a lens, focussing light on a bundle of nerve fibres underneath. Beside each lens are dark pigmented cells (chromatophores) which migrate to the surface of the lens when the light is bright - the brittlestar puts on its sunglasses! The animal responds to sudden changes of light intensity by running away - even if it doesn't have what we would recognise as a brain, it certainly has a neural network capable of processing light information, making decisions, directing its muscles to move its arms rhythmically so it can scuttle away, and navigating itself to a place of safety.

A change of light intensity - whether across time or across space - conveys information. We don't yet know whether O. wendtii's miniature eyes create an image in its mind's eye, but experiments by Stange and Howard (1979) showed that the compound eyes of a dragonfly provide enough resolution for it to be able to perceive the position and orientation of its environment, and, when performing rapid turns, monitor its position so as to be able to adjust its flight attitude, which is vital for stability.

We may not be as smart as dragonflies - we haven't figured out how to fly like them yet - but we can read a newspaper. When we read, the patterns of light and dark reflected from the shapes of the printed pictures and symbols on the paper fall upon light-sensitive molecules such as rhodopsin in the membranes of photoreceptor neurons at the back of our eyes.

Ordinarily, a receptor neuron constantly emits signalling proteins (neurotransmitters) into its terminal synapses. But when light falls upon its rhodopsin, a chain of chemical events takes place that results in the photoreceptor membrane becoming hyperpolarised, which causes the neuron to stop releasing neurotransmitters.

The behaviour of a photoreceptor neuron is rather like that of an electronic transistor circuit called an inverter, or NOT gate, where a potential on the input (A) switches off the normal flow of the Vdd-Vss current, so that the output of the circuit (Q) is the opposite of its input.

Several transistors can be wired up in a circuit to compute a logical function of two single-bit signals. One logical function that is very useful for building electronic processors is NAND (the opposite of AND), because each and every logical function of two bits - such as the calculation of the sum or carry bit of binary addition - can be computed by a composition of NAND functions alone (even NOT is just a NAND of a signal with itself). Inverters and NAND gates are the standard building blocks of the integrated circuits of digital computer processors. With just these basic building blocks, processors can be built that are capable of being programmed to perform highly complex computations, such as predicting the weather or figuring out how to beat a world chess champion.
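NAND's universality is easy to demonstrate. Here NOT, AND, OR and XOR are built from NAND alone (a standard construction), and from them a half adder computing the sum and carry bits of one-bit binary addition:

```python
# NAND as a universal building block for logic.
def nand(a, b):
    return 1 - (a & b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    """Return (sum bit, carry bit) of a + b."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```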

The principle of performing complex computations by the composition of simple ones is realised in living computers by networks of neuron cells connected by synapses. These are the computational building blocks of brains.

A synapse is a junction between an axon terminal of a transmitter (presynaptic) neuron and a dendrite of a receiver (postsynaptic) neuron. The transmission of information across a synapse operates by the mechanism of chemical signal transduction: a nerve impulse (action potential) in the axon of the transmitter neuron causes signalling molecules (neurotransmitters) to be released from its terminals into the synaptic gap.

The neurotransmitters bind to receptors in the receiving neuron, prompting a signal transduction sequence which results in the opening of ion channels in the receiver dendrite, causing electrical charges to flow into it (excitatory) or out from it (inhibitory).

The combined effect of as many as 10,000 synapses determines whether the electrical potential induced in the receiver neuron's axon hillock reaches the threshold required to trigger a nerve impulse (action potential) along its axon, for passing on to other neurons or effector cells. Often, a single input (a) is not sufficient and several inputs have to co-occur (b) or overlap in time (c).
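This summation-and-threshold behaviour can be sketched as a McCulloch-Pitts-style model neuron; the weights and threshold are illustrative assumptions:

```python
# Threshold neuron: weighted inputs from synapses (positive = excitatory,
# negative = inhibitory) are summed, and the neuron fires only if the
# total reaches its threshold. Weights are illustrative assumptions.
def neuron(inputs, weights, threshold):
    potential = sum(i * w for i, w in zip(inputs, weights))
    return 1 if potential >= threshold else 0

# A single input may be too weak to fire the neuron...
print(neuron([1, 0], [0.6, 0.6], threshold=1.0))  # no pulse: 0.6 < 1.0
# ...but two co-occurring excitatory inputs sum past the threshold:
print(neuron([1, 1], [0.6, 0.6], threshold=1.0))  # pulse: 1.2 >= 1.0
# An inhibitory synapse can veto the pulse:
print(neuron([1, 1, 1], [0.6, 0.6, -0.5], threshold=1.0))  # no pulse: 0.7
```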

The only kind of signal that can be propagated along the axon of a neuron is a pulse of action potential. Axons carry binary signals: the signal induced in a receiver axon is either a pulse or no pulse. Whether a receiver pulses or not depends on the strengths and frequencies of the presynaptic pulses, to the extent that their combined effect determines whether the receiver's potentiation threshold is reached. The amplitude of the pulse induced in the receiver once its threshold has been reached is purely a function of the receiver neuron's particular chemistry. Hence we can think of it as a binary (logical) signal, or as a frequency-coded one - higher pulse frequencies are stronger signals, more likely to induce action potentials in the neurons with which they have synapses.

Either way, a synapse performs a computation that lies at the heart of any logical reasoning system - an inference. A synapse expresses an inference rule: 'IF this THEN that' - it will infer that that is true if it thinks that this is true. An inference rule is an association, a belief that its output can be inferred from its inputs. A spinal reflex is made by a synapse at the spinal cord between a sensory neuron carrying a message from a sensor and a motor neuron carrying a message to a muscle:

IF toohot THEN withdraw

IF kneetap THEN kneejerk
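Rules of this shape are the basis of production systems; a minimal forward-chaining sketch (the facts and the third rule are invented for illustration):

```python
# Minimal forward-chaining inference over IF-THEN rules like the
# spinal reflexes above. Facts and rules are illustrative.
RULES = [
    ({"toohot"}, "withdraw"),
    ({"kneetap"}, "kneejerk"),
    ({"looks_tasty", "smells_good"}, "eat_it"),
]

def infer(facts):
    facts = set(facts)
    changed = True
    while changed:          # keep firing rules until nothing new is inferred
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"toohot"}))                      # infers "withdraw"
print(infer({"looks_tasty", "smells_good"}))  # infers "eat_it"
```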

But even in the case of a simple reflex, a single synapse is not the whole story - the effector neuron also receives messages from the brain and the reflex doesn't always happen (Kennard, 1947). A more complex emotional response, thoughtful assessment, or perceptual impression, is made in the brain as a result of thousands, maybe millions, of synapses making inferences.

We are born with many synapses already in place, put there by the creative miracle of gestation, which also created our biological machinery for making new synapses, for learning through feedback.

There are lots of things to learn. A baby will put anything within reach into its mouth; but very soon, thanks to feedback from experience, the developing child starts to acquire a sense of discrimination, that some things are yukky and others are tasty. It forms beliefs.

Our beliefs are inferences and inference rules; they express our understanding of the relationships between things. For example, that things that look or smell a particular way can be inferred to possess certain properties (such as being nice or nasty).

Once formed, beliefs become powerful filters of new information, so much so that much available information can be ignored or misinterpreted: "Only a part of what is perceived comes through the senses from the object. The remainder always comes from within" (Luckiesh, 1922). We don't see what is in front of our eyes - we see what we think is there. We see what we can believe.

The intuitive semantics of inference rules, which neatly capture the concept of a heuristic (a 'rule of thumb' that works most of the time), led Newell and Simon (1972) to use them as the basis of their theory of the computational basis of human problem solving. Their work was very influential in efforts to produce computer programs capable of demonstrating Artificial Intelligence, particularly those that later came to be labelled 'Expert Systems' (McCorduck, 1979). Although such programs could demonstrate some impressive abilities, they also suffered from a kind of myopia - they could only reason within a limited sphere, and lacked the ability to handle situations beyond this limit. Minsky (2006) observes that human reasoning doesn't suffer this same brittleness, and seems to involve more use of analogy than deduction. However, even though humans have very wide-ranging powers of common sense, they too are capable of abject myopia, turning a blind eye to things that don't fit in with their existing beliefs and maintaining beliefs that are mutually inconsistent.

Part of the reason for this phenomenon may be the physical localisation of brain function. For example, the fusiform gyrus, located just inside the inner surface of the temporal lobes, is where face recognition is performed.

Damage to this area causes prosopagnosia, in which a person can no longer recognize people's faces even though she can still recognize their voices - she can't even recognize herself in the mirror. A rarer syndrome, the Capgras delusion, is when a patient believes a loved one has been replaced by an imposter. Ramachandran and Blakeslee (1997) deduce that this curious disorder could be due to localised damage to the link between the fusiform gyrus and the amygdala, so that the patient does not experience the normal emotional response to seeing a loved one. Lacking this normal feedback, the cortex rationalises that the person they are seeing is not the one they love.

The cortex also has trouble reorganising the circuits that interpret data from the peripheral nervous system when that data ceases to arrive. After a damaged limb has been amputated, the patient can continue to vividly sense its presence - a phantom limb. She knows that the limb is not there, but nevertheless receives compelling sensory experiences of its existence, because although the limb no longer exists, the nerve fibres it was once connected to are still intact and being stimulated by other connections. Anyone who had a paralysed but painful limb amputated can subsequently experience a paralysed phantom arm, with the same pain, because the brain has learned the sensation of having a painfully paralysed arm, and this learned belief remains even after the arm itself has ceased to exist. To help a patient with this condition unlearn his false belief, Ramachandran came up with the idea of giving him a virtual arm through the technically simple device of a box with a mirror in it. The patient puts his normal hand beside the mirror, so that when he looks in the box, it looks to him as if he has two hands. When asked to move his normal hand, he receives the illusion of the imaginary other hand also moving. He knows it's not real; he knows it's just a mirror reflection, but it is a vivid sensory experience. After some weeks of repeated practice, this visual feedback creates new connections in his brain which circumvent the old circuits that had produced the feelings of pain.

Seeing, then, is a matter of fitting perceived images to existing belief structures. But you don't need eyes to see. When a sound signal is sent into a medium such as air or water, part of it will be reflected back if it strikes an object. The distance to the object can be calculated by measuring the time between when the signal is sent out and when the reflected sound, or echo, is received. The size of the reflecting object is indicated by the spread of the echo. Anyone experienced in using an inexpensive "fishfinder" can tell the difference between echoes produced by the seabed and those of a school of fish on a sonar display screen. Whales and dolphins use active sonar to identify underwater objects and to help find food. These marine mammals produce very sophisticated sounds, such as frequency sweeps and chains of clicks, that tell them much about the target when they are reflected back.
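As a concrete illustration of the timing arithmetic, here is a minimal Python sketch; the 1500 m/s speed of sound in seawater is a round-figure assumption, and the function name is my own:

```python
# Illustrative sketch: estimating range from a sonar echo.
SPEED_OF_SOUND_WATER = 1500.0  # metres per second (assumed round figure)

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting object: the echo travels there and back,
    so the one-way distance is half the total path."""
    return SPEED_OF_SOUND_WATER * round_trip_seconds / 2.0

print(echo_distance(0.2))  # an echo heard 0.2 s after the ping -> 150.0 m
```

An echo returning after one full second would put the seabed 750 m down, which is why a fishfinder's repetition rate limits its maximum range.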

Bats emit constant frequency strobed pulses of sound waves. Some emit sounds from their mouth, which they hold open as they fly; others emit sound through their nose. These sounds are reflected by objects in the environment giving rise to a unique “auditory power spectrum signature” for each type of object. By recognising the special echo signatures of particular prey species, the bat can distinguish them from the background auditory profile including such things as reflections from stationary objects such as trees.

A bat's prey are small, quick-moving insects. A bat listens carefully to the echoes that return to it. The bat's brain determines how long it takes a noise to return and figures out how far away an object is. The bat can also determine where the object is, how big it is and in what direction it is moving. The bat can tell if an insect is to the right or left by comparing when the sound reaches its right ear to when the sound reaches its left ear: if the sound of the echo reaches the right ear before it reaches the left ear, the insect is obviously to the right. The bat's ears have a complex collection of folds that help it determine an insect's vertical position. Echoes coming from below will hit the folds of the outer ear at a different point than sounds coming from above, and so will sound different when they reach the bat's inner ear.

A bat can tell how big an insect is based on the intensity of the echo. A smaller object will reflect less of the sound wave, and so will produce a less intense echo. The bat can sense in which direction the insect is moving based on the pitch of the echo. If the insect is moving away from the bat, the returning echo will have a lower pitch than the original sound, while the echo from an insect moving toward the bat will have a higher pitch. This difference is due to the Doppler effect. (Doppler shift occurs when sound is generated by, or reflected off, a moving object.)
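The pitch change can be sketched numerically. For an echo, the reflector first "hears" a shifted frequency and then re-radiates it as a moving source, so the shift is applied twice; the formula and the 340 m/s speed of sound in air are textbook round figures, and the function name is my own:

```python
# Illustrative sketch of the Doppler shift on an echo:
# f_echo = f * (c + u) / (c - u), where u > 0 means the
# target is moving toward the listener.
SPEED_OF_SOUND_AIR = 340.0  # m/s (assumed round figure)

def echo_frequency(f_emitted: float, target_speed_toward: float) -> float:
    c = SPEED_OF_SOUND_AIR
    u = target_speed_toward
    return f_emitted * (c + u) / (c - u)

# A 50 kHz pulse reflected off a moth flying toward the bat at 5 m/s
# returns at a slightly higher pitch; a receding moth returns a lower one.
print(echo_frequency(50_000, 5.0) > 50_000)   # True
print(echo_frequency(50_000, -5.0) < 50_000)  # True
```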

Some species of bat use frequency-modulated (FM) echolocation and others use constant-frequency (CF) echolocation. In both cases, the bat emits pulses of sound and listens for echoes of insects or other suitable prey in the intervals between pulses. After detection of a target, the pulse rate increases. This increases the rate of information update as the bat gets nearer to its prey. Part of the signature recognition processing performed by bats requires taking into account the effect of movement; as the bat moves towards an object, reflected waves are Doppler-shifted. If the object contains moving parts - such as the flapping wings of a moth - these too will affect the Doppler-shifted echo signal. As the bat approaches a moth, it increases the frequency of its pulses to obtain a higher resolution auditory image and avoid interference between transmitted and reflected signals. Precisely what neural processes occur in the bat to perform the very complex computations involved is still unknown, but it has been shown that different areas of the bat's brain are active at different phases of the (approach - capture - depart) sequence.

Of Mice and Microbes

One problem that has interested mathematicians is whether a mathematical formalism could pose and answer any question - the Entscheidungsproblem. Perhaps the most famous contribution to this question of all questions is the mathematical proof of Kurt Godel (1931), which concludes that any consistent formalism powerful enough to express arithmetic must contain true statements that it cannot prove. His formalism led to a version of the self-contradiction paradox, which goes something like this: if a man says "I am telling you a lie", then if his statement is true, it must be a lie, and vice-versa.

Alan Turing (1936) realised there was a parallel between what could be defined by a mathematical formalism and what could be computed by a machine. To do this, he reduced the concept of computation to its simplest elements: input, process, output. He conceived an abstract model of computation, now called a Turing Machine (TM), composed of two basic elements: a long strip of paper on which the machine could read or write one symbol at a time, and a registry of numbered "states of mind" that processed the symbols on the tape.

Each state contained a single transformation rule that read the symbol on the tape, erased it or wrote a symbol in its place (which could be the same one), moved the tape one place left or right (or left it where it was) and specified the next state to be entered. Since each state in the registry is uniquely identified by its state number, we can imagine stringing them together in a tape of their own. So a TM is a string of symbols that operates upon another string of symbols.

Turing was able to show two things with this simple apparatus. First, his imaginary automaton was theoretically complete, in the sense that for every mathematical (recursive) function of the type considered by Godel, there was a TM that could compute it. Second, by recasting the liar paradox in the form of a machine - one which would go into an infinite loop when fed the definition of another machine that would complete its computation and halt - he showed that there are questions no machine can decide: on feeding a copy of the machine to itself, it would halt if and only if it didn't. His thesis provided the foundation for a host of subsequent efforts to reduce the concept of computation to its simplest elements, and the phrase 'Turing-complete' entered the textbooks.
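The state-table idea can be sketched in a few lines of Python. The machine below is a toy of my own devising, not one of Turing's - it simply inverts a tape of bits - but it has all the elements described above: a tape of symbols, a registry of numbered states, and one transformation rule per (state, symbol) pair:

```python
# A minimal Turing Machine sketch: rules map (state, symbol) to
# (symbol to write, head move, next state). "_" is the blank symbol.
def run_tm(rules, tape, state=0, halt="H"):
    tape = list(tape)
    pos = 0
    while state != halt:
        symbol = tape[pos] if pos < len(tape) else "_"
        write, move, state = rules[(state, symbol)]
        if pos < len(tape):
            tape[pos] = write
        else:
            tape.append(write)
        pos += {"L": -1, "R": 1, "S": 0}[move]  # move the head
    return "".join(tape)

invert = {
    (0, "0"): ("1", "R", 0),    # flip 0 -> 1 and move right
    (0, "1"): ("0", "R", 0),    # flip 1 -> 0 and move right
    (0, "_"): ("_", "S", "H"),  # reached the blank: halt
}
print(run_tm(invert, "0110"))  # -> 1001_
```

Note how the rule table `invert` is itself just a string of symbols fed to `run_tm` - a machine operating on the description of a machine, exactly the move Turing's argument exploits.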

A few years later, Turing raised the question: "Can a machine think?" (Turing, 1950). At that time, there was no clear definition of what thinking actually is, but then, as now, it was generally agreed that thinking is something that brains can do. We might turn Turing's question around, and ask instead: can a brain compute? In one sense, this is a trivial question - since the first computers were people, they must be able to! But since mathematicians have gone to so much trouble over the Entscheidungsproblem, we perhaps ought to enquire as to whether biological brains are Turing-complete, if only for the sake of completeness.

Mice have brains made out of neurons that communicate with each other via synapses. The biochemistry of neurons and synapses is complex, but their basic function is simple - in essence, the dendrites of a neuron receive chemical messages through their synapses, which open ion channels, which collectively create an electrical potential in the receiver, which, if it reaches a certain threshold, will induce a pulse (a spike of action potential) in the axon of the receiver neuron. Successive excitations can lead to successive pulses. We could think of the frequency of pulses along an axon as being the signal carried by it (Wilson & Cowan, 1973), but for the present purposes, it is more convenient to consider just the two extrema, 0 and 1.

In the diagrams below, a neuron is abstractly depicted as an oval with a line (its axon) coming out of it. Axons have multiple terminals, each of which can synapse with a different receiver neuron (or with different dendrites of the same receiver). An excitatory synapse is depicted as an arrowhead and an inhibitory one as a little circle.

Assuming a potentiation threshold of 1 in a receiver neuron, we could construct an inverter from two synapses as shown here. The upper input neuron supplies a constant pulse (1) to an excitatory synapse and the lower one provides a variable signal A (which could be either 1 or 0) to an inhibitory synapse. If A=1, the inhibitory synapse will cancel out the effect of the excitatory one, and no pulse will be induced in the receiver neuron. If A=0, there will be no inhibitory effect, and a pulse will be induced in the receiver. Either way, the signal induced in the receiver is ~A, the negation of A.

Synapses could be configured to perform logical computations in various ways; for example, one way to construct an AND (&) function is to make both inputs excitatory and use a receiver potentiation threshold of 2.

Another way to make an AND function could be to have a third constant input to an inhibitory synapse, and use a receiver potentiation threshold of 1; the inhibitory synapse and potentiation threshold combine to prevent an impulse being induced in the receiver in every combination of input signals except when they are both 1.

Any function of two bits can be computed by a composition of NOT and AND functions - shown here is one way to make a NAND function. And as previously mentioned, any function of any number of bits can be constructed from just these basic building blocks.
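The constructions above can be sketched as threshold units in Python. The weights (+1 for an excitatory synapse, -1 for an inhibitory one) and the thresholds follow the text; the function names are my own:

```python
# A receiver neuron fires (outputs 1) when the sum of its weighted
# synaptic inputs reaches its potentiation threshold.
def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def NOT(a):
    # constant excitatory input (1) plus an inhibitory synapse for A,
    # with a potentiation threshold of 1 - the inverter described above
    return neuron([1, a], [+1, -1], threshold=1)

def AND(a, b):
    # two excitatory synapses and a potentiation threshold of 2
    return neuron([a, b], [+1, +1], threshold=2)

def NAND(a, b):
    # composition of the two building blocks
    return NOT(AND(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), NAND(a, b))
```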

Hence we can conclude that any computable function can be computed by a neural network. In principle, of course! In practice, the functionality of a physical network is limited by its size. The human brain contains about 100,000,000,000 neurons, a bee's brain has about 950,000, and Aplysia, a sea slug, manages to get along just fine with only 20,000 (Chudler, 2008).

Mice and sea slugs have brains, but bacteria don't. They don't have neurons and synapses; they are a hundred times smaller than a single neuron. It is tempting to imagine they are completely stupid, incapable of anything remotely resembling computation. But this image is false.

Bacteria compute - they navigate through chemical fields like sensible robots, reacting sensibly to the smell of the chemical information that arrives at the receptors in their membranes. And they can adapt as well. Bacteria subjected to high temperatures respond by manufacturing extra chaperone proteins (McCallum and Inniss, 1990) which protect them against the stress of future strange encounters of the thermal kind.

A protein is a ribbon of amino acids - of which there are 20 different kinds - chained together, that folds into a particular shape under the attractive and repulsive electromagnetic forces of its constituent atoms. Protein folding can be adversely affected by heat; chaperone proteins act to prevent misfolding in a new protein being formed. Just as this sentence is a string of letters and other symbols, so a protein is a chemical message expressed by a string of amino acids - the 20 different amino acids are the 20 different symbols of which protein messages are made.

Proteins are manufactured by ribosomes to a prescription given to them in the form of ribonucleic acid (RNA) - the formula of all life as we know it. RNA is a chain (polymer) of 4 different nucleotides: adenine, cytosine, guanine and uracil, better known by their initials A, C, G and U. A triplet of nucleotides is called a codon because it encodes the prescription of a particular amino acid.

The encoding is unambiguous - each triplet encodes one and only one amino acid - but it is redundant, in that there can be more than one code for a particular amino acid. For example, the sequences CAU and CAC both encode an amino acid called histidine. There are 4 × 4 × 4 = 64 possible triplets of the four nucleotides, which between them encode the 20 amino acids that occur in proteins. How the code originated is still unknown, but it probably evolved from simpler forms.
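A fragment of the standard genetic code, written out in Python, makes the counting and the redundancy concrete (the table is a small excerpt, not all 64 codons):

```python
# An excerpt of the standard genetic code: codon -> amino acid.
CODON_TABLE = {
    "CAU": "His", "CAC": "His",                   # two spellings of histidine
    "UUU": "Phe", "UUC": "Phe",                   # two spellings of phenylalanine
    "AUG": "Met",                                 # methionine, also the start signal
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",  # the three stop codons
}
print(4 ** 3)                                    # 64 possible codons in total
print(CODON_TABLE["CAU"] == CODON_TABLE["CAC"])  # True: the code is redundant
```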

The eukaryotic cell's genome is stored in chromosomes, each of which is a single molecule of deoxyribonucleic acid (DNA), a double-sided chain of paired-up nucleotides, probably evolved from RNA (Leon, 1998). DNA uses a methylated form of uracil, called thymine (T); whereas uracil will pair rather promiscuously with the other nucleotides, T only bonds to (pairs with) A. And C only pairs with G. The pairing up, the deoxygenation of its sugar-phosphate backbone, and the methylation make DNA hydrophobic and unresponsive to the attentions of most enzymes, enabling it to serve as a long term memory for the information encoded by its sequence of nucleotides.

A change of temperature, or some other cause such as the arrival of a molecule at its cell membrane, prompts the genome to express a gene by transcribing the information content of the appropriate part of itself into RNA (called messenger RNA, or mRNA for short), containing the recipe of the protein to be made. There are two general types of gene in the human genome: non-coding RNA genes and protein-coding genes.

Non-coding RNA genes represent 2-5 per cent of the total and encode functional RNA molecules. Many of these RNAs are involved in the control of gene expression, particularly protein synthesis. They have no overall conserved structure.

The job of translating mRNA into a protein is performed by a ribosome, which walks along the mRNA message, matching each codon (nucleotide triplet) on it against a stock of aminoacyl-tRNA molecules, each carrying the anticodon complementary to a codon together with its corresponding amino acid. The matching process causes the amino acid to become divorced from its tRNA partner and married to the end of the polypeptide chain (protein) being assembled. The matching of tRNA and mRNA doesn't result in a lasting marriage, though: the spent tRNA is released for recycling and recharging with a fresh amino acid.

A ribosome is itself made of RNA and protein. Hence a ribosome is also a string of symbols - rather a long one, made of 2.64 million atoms (lanl, 2005). So, a ribosome is a machine made of a string of symbols (RNA and protein) that reads a string of nucleotide symbols (mRNA) and writes a string of amino acid symbols (a protein) - just like a Turing Machine.
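The analogy can be made concrete with a minimal sketch: a loop that reads the mRNA tape three symbols (one codon) at a time and writes one amino-acid symbol to the growing protein, halting at a stop codon. The four-entry codon table is a tiny excerpt of the real 64-entry code, and the function name is my own:

```python
# Sketch of the ribosome-as-machine analogy: read the mRNA tape codon
# by codon, write one amino-acid symbol per codon, halt at a stop codon.
CODONS = {"AUG": "Met", "CAU": "His", "UUU": "Phe", "UAA": "STOP"}

def translate(mrna):
    protein = []
    for i in range(0, len(mrna) - 2, 3):  # step along the tape in triplets
        amino = CODONS[mrna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("AUGCAUUUUUAA"))  # -> ['Met', 'His', 'Phe']
```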

Shapiro (1999) realised that "the basic operations of certain biomolecular machines within living cells - recognition of molecular building blocks, cleavage and ligation of biopolymer molecules, and movement along a polymer - could all be used, in principle, to construct a universal computer based on Turing’s conceptual machine". He has taken this idea along the road of practical application: "We have already succeeded in creating a biological automaton made of DNA and proteins able to diagnose in a test tube the molecular symptoms of certain cancers and “treat” the disease by releasing a therapeutic molecule." (Shapiro and Benenson, 2006).

In theory, a ribosome-like machine could be made to compute any function that a Turing Machine can; that is, any function at all that can be defined by any machine - by any computer, even the most powerful supercomputer. In principle, a ribosome computer could be made that could beat the world chess champion, write stories as entrancing as Harry Potter, compose symphonies and plot clever ways of making money on the stockmarket - all at the same time!

But only in theory. In practice, what a ribosome can and can't compute is limited by its physical size. But that's also true of brains. The human brain has 100 billion neurons, each at least 100 times bigger than a bacterium which is 100 times bigger than a ribosome. Given all the remarkable computations just one ribosome can do, just imagine what 100 x 100 x 100 billion ribosomes could compute if they all worked together ....

At first glance, stromatolites and people appear to be very different things. Stromatolites - prokaryote colonies - are the oldest known form of life on Earth, dating back over 3 billion years (Allwood et al, 2006), whereas hominids are the newest eukaryote kids on the block of evolution who did not emerge as a distinct species until some 2.995 billion years later (Kumar, 2005) – viewed from this perspective, our most ancient hominid ancestor did fall out of her tree only yesterday!

Despite the huge stretch of time from the birth of the stromatolites to our arrival on this planet, we and they have one key thing in common: our life recipes are encoded by the information content of deoxyribonucleic acid, or DNA, packaged within the single chromosome of a prokaryote bacterium and within the multiple chromosomes within the nucleus of every eukaryotic cell of our bodies.

Left alone, DNA just sits there, doing nothing. The nucleotides of its twin spirals are pair-bonded, rendering the molecule inert and largely unresponsive to the attentions of other molecules passing by. In this way, it is able to function as a permanent memory that does not degrade, even over millions of years, although it can be damaged by mutagens such as oxidizing agents, alkylating agents and high-energy electromagnetic radiation such as ultraviolet light and X-rays. In human cells, the different chromosomes occupy separate areas in the nucleus called "chromosome territories".

Steroid hormones cause changes within a cell by first passing through the cell membrane of the target cell. Steroid hormones, unlike non-steroid hormones, can do this because they are fat-soluble. Cell membranes are composed of a phospholipid bilayer which prevents fat-insoluble molecules from diffusing into the cell.

Once inside the cell the steroid hormone binds with a specific receptor found only in the cytoplasm of the target cell. The receptor-bound steroid hormone then travels into the nucleus and binds to another specific receptor on the chromatin. Once bound to the chromatin, this steroid hormone-receptor complex calls for the production of messenger RNA (mRNA) molecules through a process called transcription. The mRNA molecules are then modified and transported to the cytoplasm. The mRNA molecules code for the production of proteins through a process called translation.

Steroid hormones are derivatives of cholesterol and include products of the adrenal cortex, ovaries, and testes as well as the related molecule, vitamin D. Unlike protein/polypeptide hormones, steroid hormones are not stored in large amounts. When needed they are rapidly synthesized from cholesterol by a series of enzymatic reactions. Most of the cholesterol needed for rapid steroid hormone synthesis is stored intracellularly in the tissue of origin. In response to appropriate signals, the precursor is moved to organelles (mitochondria and smooth endoplasmic reticulum), where a series of enzymes (eg, isomerases, dehydrogenases) rapidly convert the molecule to the appropriate steroid hormone. The identity of the final steroidal product is thus dictated by the set of enzymes expressed in that tissue.


Mice are genetically closer to humans than you might imagine - “Indeed, practically every human gene appears to have a counterpart in the mouse genome” (Human Genome Project, 1996). If we imagine that the genes of the 23 pairs of human chromosomes were reorganised into smaller blocks, those pieces could be reassembled to produce a model of the mouse genome, shown in Figure 5-12.

Because of the mouse-human genomic homology, a gene on a human chromosome can often lead to a confident prediction of where a closely related gene will be found in the mouse - and vice versa. For example, a crippling heritable muscle disorder in mice maps to a location on the mouse X chromosome that is closely analogous to the map location for the X-linked human Duchenne muscular dystrophy gene - these two similar diseases are caused by the mouse and human versions of the same gene; the mouse and human genes produce proteins that function in very similar ways and that are clearly required for normal muscle development and function in the corresponding species. Likewise, the discovery of a mouse gene associated with pigmentation, reproduction, and blood cell defects, was the crucial key to uncovering the basis for a human disease known as the piebald trait. Owing to such close human-mouse relationships as these, together with the benefits of transgenic technologies, the mouse offers enormous potential in identifying new human genes, deciphering their complex functions, and even treating genetic diseases. (Human Genome Project, 1996).

The eukaryote genome is represented in chromosomes, which come in homologous pairs. Each member of the pair contains information on how to build the same protein products. One member of each pair comes from the mother and one comes from the father. Humans have 22 pairs of autosomal chromosomes and 1 pair of sex chromosomes: XX for females and XY for males. The genotype of an individual is defined by what alleles the individual has at a locus of a chromosome - the phenotype is the observable expression of the genotype. For example, the phenotype for a genotype with one A allele and one O allele at the ABO (blood group) locus would be blood type A. There are approximately 30,000 genetic loci on the 23 pairs of chromosomes; many loci are involved in specifying the form of a protein. For example, the locus of the beta gene for the haemoglobin molecule is near the tip of the short arm of chromosome 11 and the locus of the alpha gene is near the tip of the short arm of chromosome 16. Similarly, the expression of a character depends upon many loci in different chromosomes. For example, the blood group locus is on chromosome 9 whereas the Rhesus (Rh) locus is on chromosome 1 - blood type is a function of both factors.
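The blood-type example can be written out as a small genotype-to-phenotype function. This is a sketch assuming the standard dominance rules (A and B codominant, O recessive); the function name is my own:

```python
# Genotype -> phenotype at the ABO locus: A and B are codominant, O is recessive.
def abo_phenotype(allele1, allele2):
    alleles = {allele1, allele2}
    if alleles == {"A", "B"}:
        return "AB"      # both codominant alleles are expressed
    if "A" in alleles:
        return "A"       # A/A or A/O
    if "B" in alleles:
        return "B"       # B/B or B/O
    return "O"           # O/O: only the recessive allele

print(abo_phenotype("A", "O"))  # -> A (the example in the text)
print(abo_phenotype("O", "O"))  # -> O
```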


Prokaryotes reproduce by binary fission and exchange genes by conjugation, transformation and transduction. During binary fission, the single DNA molecule replicates and both copies attach to the cell membrane. The cell membrane begins to grow between the two DNA molecules. Once the bacterium has doubled in size, its cell membrane begins to pinch inward. A cell wall then forms between the two DNA molecules, dividing the original cell into two identical daughter cells.

In conjugation, one bacterium connects itself to another through a protein tube structure called a pilus. Genes - typically carried on plasmids - are laterally transferred from one bacterium to the other through this tube.

Some bacteria are capable of taking up DNA from their environment. These DNA remnants most commonly come from dead bacterial cells. During transformation, the bacterium binds the DNA and transports it across the bacterial cell membrane. The new DNA is then incorporated into the bacterial cell's DNA.

Transduction is a type of recombination that involves the exchanging of bacterial DNA through bacteriophages - viruses that infect bacteria. Once a bacteriophage attaches to a bacterium, it inserts its genome into the bacterium. The viral genome, enzymes, and other components are then replicated and assembled within the host bacterium. The newly formed bacteriophages then lyse (split open) the bacterium, releasing the new viruses. During the assembly process, some of the host's bacterial DNA may become encased in the viral capsid instead of the viral genome. When this bacteriophage infects another bacterium, it injects the DNA fragment from the previous bacterium. This DNA fragment then becomes inserted into the DNA of the new bacterium. This type of transduction is called generalized transduction. In specialized transduction, fragments of the host bacterium's DNA become incorporated into the viral genomes of the new bacteriophages. The DNA fragments can then be transferred to any new bacteria that these bacteriophages infect.

Homeotic Genes

Homeotic genes govern the development of bodies. Nüsslein-Volhard and Wieschaus (1980) identified and classified 15 genes of key importance in determining the body plan and the formation of body segments of the fruit fly Drosophila melanogaster. Embryonic development is controlled by three functional groups of homeotic genes: (1) gap genes lay the foundations of a rough body plan along the head-to-tail axis, (2) pair-rule genes govern formation of every second body segment, (3) segment-polarity genes refine the head-to-tail polarity of each individual segment, meaning that the head-end and the tail-end of a segment look different.

The homeotic genes of the fly are homologous to homeotic genes in other animals, including man. It is actually possible to transfer a homeotic gene from man to the embryo of a fruit fly, where it can perform some of the functions that the corresponding Drosophila gene normally executes.

In humans, a common genetic variation in the promoter region of the serotonin transporter (the SERT-long and -short allele: 5-HTTLPR) has been shown to affect the development of several regions of the thalamus in adults. People who inherit two short alleles (SERT-ss) have more neurons and a larger volume in the pulvinar and possibly the limbic regions of the thalamus. Enlargement of the thalamus provides an anatomical basis for why people who inherit two SERT-ss alleles are more vulnerable to major depression, posttraumatic stress disorder, and suicide.

Psychosomasis

The brain encodes a mathematical model of what's going on in the world around it, and what's going on in the world of the body it lives in. This model is physically organised into three subsystems: the forebrain, midbrain and hindbrain, each of which performs a particular class of function. It is likely that these major divisions, layered one on top of another, are evolutionary developments (MacLean).

The hindbrain (shared with reptiles) consists of the structures of the brain stem - medulla, pons, cerebellum, mesencephalon, globus pallidus and the olfactory bulbs. The oldest of the three, it controls the basic mechanisms for staying alive - the body's vital functions such as heart rate, breathing, body temperature and balance. The midbrain, or limbic system, comprising the hippocampus, amygdala, and hypothalamus, is found in mammals. It is responsible for what are called emotions in human beings and is the seat of the value judgments that we make, often unconsciously, that exert such a strong influence on our behaviour. The forebrain, or cortex, is the site of language, rationality, imagination and self-awareness. The cortex is divided into left and right hemispheres connected by the 200,000 nerve fibres of the corpus callosum. The left half of the cortex controls the right side of the body and the right half controls the left side of the body. The right brain is more spatial, abstract, musical and artistic, while the left brain is more linear, rational, and verbal, although either side can assume the functions of the other in someone who has had a hemispherectomy.

The ache in the heart, the knot in the stomach, the tingling in the loins, the shine in the eye, the shiver of excitement - these somatic sensations are the brain's perceptions of what is going on in the body, based on signals sent to it along the afferent neural highways of the spinal cord.

At the core of the brainstem, reticular formation nuclei receive input from the body’s sensory systems and other parts of the brain such as the cerebellum and cerebral hemispheres and route them to the thalamus. The ascending reticular activating system (RAS) pathway originates from a group of neurons around the fourth ventricle in the rostral pons (near midbrain). Most of these neurons are acetylcholinergic, and project to the thalamus, controlling whether the gate is open or closed. The key is in the action of acetylcholine. Acetylcholine cannot, by itself, activate or shut down the neurons of the thalamus. Instead it sensitizes them. By slightly depolarizing the thalamic neurons (it does this by closing a hyperpolarizing potassium channel), the ascending system can make the thalamus more sensitive to sensory input. This situation would correspond to an awake, alert state.

The RAS is vital to learning, self-control, inhibition, attention and motivation. When functioning normally, it provides the neural connections that are needed for the processing and learning of information, and the ability to pay attention to the correct task. The frontal lobes help us to pay attention to tasks, focus concentration, make good decisions, plan ahead, learn and remember what we have learned. Glutamate pathways from the pre-frontal cortex to other areas of the brain are mediated by dopamine and norepinephrine. It has been said that 70% of the brain is there to inhibit the other 30%; the inhibitory mechanisms of the cortex keep us from being hyperactive, from saying things out of turn, and from getting mad at inappropriate times. RAS norepinephrine deficiency is correlated with the symptoms of Attention Deficit Hyperactivity Disorder (ADHD), whose subjects show more theta and alpha brainwave activity, indicating a lack of control in the cortex of the brain; subjects exhibit symptoms such as impulsive behaviors, quick temper, poor decision making and hyperactivity. An abnormally high level of synchronized discharge of large numbers of neurons creates a condition known as epilepsy, which manifests as jerking movements and/or loss of awareness (petit mal) or can be more intense, resulting in convulsions and loss of consciousness (grand mal). The hippocampus has the lowest seizure threshold of all cortical regions, and a loss of neurons in the hippocampus is commonly seen in epileptics.

Information about the external environment, perceived by all the senses, with the significant exception of the olfactory system, is routed from the RAS to the cortex via the thalamus, perched on top of the brainstem. The thalamus also receives signals from the cortical sites to which it relays information, which suggests it performs some filtering or preprocessing of new data based on what is already being perceived.

Projections from the solitary tract nucleus to the limbic system (hypothalamus, insula, and amygdala) are believed to account for the behavioural relation between tastes and emotions.

Sensory inputs relayed via the thalamus

| Input | From | Via nucleus | To |
|---|---|---|---|
| Visual | eye | lateral geniculate | occipital lobe V1 |
| Auditory | ear | medial geniculate | auditory cortex |
| Proprioceptive | sacral, lumbar and lower thoracic segments | gracile | postcentral gyrus |
| Proprioceptive | upper thoracic and cervical segments | cuneate | postcentral gyrus |
| Gustatory | tongue | solitary tract | limbic system |
| Tactile | face | posterior medial | |
| Temperature, pain | face | ventro-posterior medial | |
| Temperature, pain | spine | posterior, ventro-posterior lateral, intralaminar, reticular formation | periaqueductal gray |

The thalamus behaves differently during different states of consciousness. When awake, glutamate receptors of the dorsal lateral geniculate nucleus (LGN) receive signals from the optic nerve and relay them to the visual cortex V1, mediated by cortical feedback. But during sleep, the relaying function is suppressed and the LGN exhibits rhythmic activity. The awake/asleep “switches” on thalamic neurons are acetylcholine and nitric oxide (NO) receptors that receive signals from the parabrachial complex (PBR). Activation of the cholinergic component of the PBR projection shifts relay cells from burst to tonic mode of firing. Neuronal nitric oxide synthase (nNOS), the calcium-dependent enzyme that generates NO, is contained within the presynaptic terminal fields of the PBR. Higher levels of NO are released during waking and rapid eye movement (REM) sleep compared with slow wave sleep (Alexander et al, 2006).


In studies with mice and rats, Gillum et al (2008) found that a chemical messenger called N-acylphosphatidylethanolamine (NAPE) is made in the small intestine after the animals eat a fatty meal. NAPE crosses the blood-brain barrier and becomes concentrated in the hypothalamus, the region of the brain that governs hunger. Rats treated with extra NAPE for five days ate less and lost weight, hinting that studying NAPE could help researchers design better appetite suppressants or obesity drugs.


Beauregard (2007) studied magnetic resonance images of the brain activity of people experiencing emotions and concluded that “conscious and unconscious mental processes/events, which are neurally grounded, are selectively translated, based on a specific code, into neural processes/events at the various levels of brain organization (biophysical, molecular, chemical, neural circuits). In turn, the resulting neural processes/events are translated into processes and events in other physiological systems, such as the immune or endocrine system”. Other neurons monitor what goes on in the body, creating a feedback system.

The “feel” of a feeling is a combination of the cognitive reaction to an initial perception and the perception of the significance of somatic feedback signals. In the case of phantom limbs, the afferent neurons that were connected to the limb form new endpoints in the stump; these endpoints sense stimuli and transmit them to the brain, which interprets the signals as still coming from the now non-existent limb. One experimenter found that stimulating various areas of the face made a limbless patient feel that her phantom fingers were moving – her cortical neurons that used to process limb information had joined up with nearby afferent neurons. Experiments growing neurons on top of silicon chips found that growing neurons seek out electromagnetic activity to join up with. So in an amputee's brain, when neurons that used to be stimulated by limb afferents no longer receive input (because the limb has gone), they turn to other neighbours to get some action – a kind of unguided learning.

Causes of Animal Behavior

In humans, the reproductive instinct manifests itself at the conscious level by the emotions we call desire and love. But what are desire and love and how do they arise? This deceptively simple question does not have a simple answer.

Based on his studies of a freshwater fish called the three-spined stickleback, Tinbergen (1951) discovered the fundamental causes of animal behaviours, diagrammed here.

All behaviours, including physical movements, dilation of pupils, and the vibrations of the vocal cords that produce sounds, are the external outputs of a complex cascade of computations between cells of the body. The configuration of cells and signalling mechanisms of an animal derives from its ontogeny (growth), which at its heart is a process of cell division dictated by the instructions expressed by the animal's genome, in the context of proximate influences.

What we call feelings are the brain's perceptions of the body's chain of reactions to external or internal stimuli. For example, oxytocin levels are elevated when one is feeling love, but the feeling of love itself is somatosensory. Love and unrequited love are psychosomatic states. Poets wax lyrical about the pain of a broken heart. This metaphor may reflect real events in the mammalian brain; areas that are activated during the distress caused by social exclusion are also those activated during physical pain (Panksepp, 2003).

Studies of people experiencing emotions have found correlations between the kind of emotion being felt and the concentration of certain hormones in the bloodstream, which is the chemical communications network of the endocrine system as well as being the pathway for carrying nutrients and waste products to and from cells. For example, excitement is correlated with adrenaline levels and feelings of love are correlated with oxytocin levels. Adrenaline, secreted by the adrenal gland, signals liver cells to synthesise glucose and release it into the bloodstream for delivery to cells throughout the body, which use it as an energy source for their metabolic reactions. Oxytocin is made in the hypothalamus and has a wide variety of physical effects associated with bonding and trust behaviours.

In the peripheral nervous system, acetylcholine activates muscles, and is a major neurotransmitter in the autonomic nervous system. When acetylcholine binds to acetylcholine receptors on skeletal muscle fibers, it opens ligand gated sodium channels in the cell membrane. Sodium ions then enter the muscle cell, stimulating muscle contraction. Acetylcholine, while inducing contraction of skeletal muscles, instead inhibits contraction in cardiac muscle fibers. This distinction is attributed to differences in receptor structure between skeletal and cardiac fibers.

In the brain, acetylcholine pathways form the cholinergic system, which extends from the brainstem and basal forebrain to the midbrain and cortex. It is involved in the regulation of memory and learning – it is this system that degenerates in Alzheimer's disease. Acetylcholine has been shown to enhance the amplitude of synaptic potentials following long-term potentiation in many regions, including the dentate gyrus, CA1, the piriform cortex, and the neocortex. It is also involved in arousal and reward, and has an important role in the enhancement of sensory perception when we wake up and in sustaining attention.

There are two main classes of acetylcholine receptor (AChR): nicotinic acetylcholine receptors (nAChR) and muscarinic acetylcholine receptors (mAChR), named for the ligands used to activate them. Nicotinic AChRs are ionotropic receptors permeable to sodium, potassium, and calcium ions. They are stimulated by nicotine and acetylcholine. They are of two main types, muscle type and neuronal type; the former can be selectively blocked by curare and the latter by hexamethonium. Nicotinic AChRs are found mainly on muscle end plates, in autonomic ganglia (both sympathetic and parasympathetic), and in the CNS.

Although nicotine can interact with a variety of receptors in numerous tissues, it is its interaction with specific receptors in the brain that creates the dependence associated with smoking. Within the midbrain, nicotine interacts with the α4β2 (alpha 4 beta 2) nicotinic acetylcholine receptor. Acetylcholine is the natural ligand for these receptors; however, nicotine, also an acetylcholine receptor agonist, has a higher affinity for them. Located on postsynaptic neurons, these receptors are composed of two α4 and three β2 subunits that together form a channel for transporting ions through the membrane.

When two molecules of nicotine or another ligand engage binding sites within the ionotropic receptor, the ion channel is activated: the closed channel opens, allowing the passage of calcium, sodium and potassium ions. This triggers action potentials that travel to the reward areas of the brain, where the impulses stimulate the release of neurotransmitters including dopamine. Dopamine triggers additional signaling events that stimulate the reward circuit, generating short-lived feelings of well-being, increased attention and improved mood. Every time tobacco is used, dopamine levels surge. However, nicotine is eliminated rather rapidly, causing dopamine levels to decline. The result: a craving for more nicotine.
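
The two-molecule gating just described can be sketched as a minimal state machine. The class and its methods are invented for illustration; real receptor kinetics are far richer.

```python
# Illustrative sketch (names hypothetical): an ionotropic receptor whose
# ion channel opens only when both of its two ligand binding sites are occupied.

class NicotinicReceptor:
    BINDING_SITES = 2

    def __init__(self):
        self.bound = 0

    def bind(self, molecules):
        # Occupy up to the available binding sites.
        self.bound = min(self.BINDING_SITES, self.bound + molecules)

    def unbind_all(self):
        self.bound = 0

    @property
    def channel_open(self):
        # Both sites must be occupied for the pore to open.
        return self.bound == self.BINDING_SITES

r = NicotinicReceptor()
r.bind(1)
assert not r.channel_open   # one bound molecule is not enough
r.bind(1)
assert r.channel_open       # two bound molecules open the ion channel
```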

With continued use, α4β2 (alpha 4 beta 2) nicotinic receptors undergo complex adaptive changes, including upregulation and desensitization. Over time, these and other downstream changes contribute to a stronger need for nicotine stimulation to achieve the rewards of smoking.


Behavioral scientists have explored the biological roots of love as well as the psychological factors that determine love feelings and behaviors. The action of neurotransmitters has been linked to major thinking and emotional disorders. Many medications used to treat conditions such as depression or schizophrenia act at the synapse on these chemicals. The indications are that the broad actions of specific neurotransmitters are involved in feelings and thoughts of love.

Dopamine is thought to increase attention, motivation, and goal directed behavior. In excess it can produce feelings of exhilaration, increased energy, hyperactivity, sleeplessness, loss of appetite, symptoms of addiction, anxiety, and fear. To the extent that the feeling of ecstasy in romantic love involves these symptoms, it may be the result of dopamine. Norepinephrine can also cause exhilaration, excessive energy, and loss of appetite. Serotonin has been associated with depression when levels at the synapse are too low. Some anti-depression medications, such as Prozac, work to increase the level of serotonin at the synapses. Decreased levels of serotonin also have been linked to obsessive-compulsive disorder. Lovers often become obsessed in thinking about their loved one. They cannot turn off their racing thoughts. While chemical action in love presents an appealing hypothesis, it remains unproven.

The anatomy of love rests in nerve centers in the brain. These connect with the limbic system, controlling motivation and emotion. The brain's pain and pleasure centers are also part of this complex neurological mechanism. These centers are responsible for sexual drive and feelings. Another important role in the love process may be played by endorphins, chemicals produced by the brain, similar to morphine, that increase feelings of pleasure and reduce pain. Other brain chemicals, similar to amphetamines, are thought to control the "high" experience when we are in love and the "crash" when love fails.

Helen Fisher at Stony Brook University studied people who had just fallen madly in love. The subjects were shown photographs of their loved ones as well as love letters, music tapes and colognes that reminded them of the person they loved. The subjects completed questionnaires and turned a dial to indicate the strength of their romantic feelings. These were compared with their reactions to photographs of other persons. Their reactions confirmed that photos and other objects could elicit feelings of passion; objects unrelated to their love partner did not arouse such feelings.

MRI scans were made of the subjects under loved-one stimulus and non-stimulus situations. Many brain areas became active as the love-struck subjects viewed photographs of their loved one, but two brain regions were predominant. The first was the caudate nucleus, a C-shaped area near the center of the brain that is among its most primitive regions. This area has been found to be part of the reward mechanism of the brain: it is necessary for detecting and discriminating among rewards and for providing sensations of pleasure, and it produces the motivation and movements to obtain preferred rewards. The more passionate the subject felt, the greater the activity of the caudate nucleus. The second active area is the ventral tegmental area (VTA), also an important part of the reward circuitry of the brain and, more importantly, the center for dopamine-making cells. From this area, nerve fibers distribute dopamine to many other brain areas; dopamine is responsible for feelings of euphoria. While many other areas may also be active, Fisher's findings highlight the importance of brain networks and brain chemistry in generating feelings of romantic love.

Serotonin is synthesized from the amino acid L-tryptophan by a short metabolic pathway consisting of two enzymes: tryptophan hydroxylase (TPH) and amino acid decarboxylase (DDC). Ovarian hormones can affect the expression of TPH in various species, suggesting a possible mechanism for postpartum depression and premenstrual stress syndrome.

About 10% of bound neurotransmitter molecules are destroyed when they are released from their receptors; the other 90% are released intact and taken back up, via monoamine transporters, by the presynaptic neuron.

Serotonin transmission from both the caudal raphe nuclei and rostral raphe nuclei is reduced in subjects with depression compared with non-depressed controls.

Reducing serotonin reuptake is one of the therapeutic approaches to treating depression. Selective Serotonin Reuptake Inhibitors (SSRIs) cause serotonin to stay in the synaptic gap, where it is recognized again and again by the receptors of the recipient cell. SSRIs are frequently prescribed for anxiety disorders such as social anxiety, panic disorder, obsessive–compulsive disorder, eating disorders and chronic pain, and occasionally for posttraumatic stress disorder, irritable bowel syndrome, lichen simplex chronicus and premature ejaculation.

But preventing reuptake also floods the presynaptic neuron’s autoreceptors, which provide it with feedback. The neuron gradually adapts to the heightened feedback by reducing the sensitivity of its autoreceptors, which in turn reduces presynaptic serotonin production and hence creates a dependence on the SSRI. SSRIs have also been shown to reduce fetal growth.

Mentality edit

Our memories are associative. One thing leads to another and one thought leads to another. Without associations, we wouldn't enjoy flowing thoughts or dialogue; we wouldn't be reminded of things.

Experience is a great teacher - feedback of the consequences of an action in a given situation modifies existing belief structures and creates a new behaviour when faced with the same situation again. Being able to learn from experience is one of the signs of intelligence. Our understanding of the world, and of ourselves, is a consequence of personal experience. That experience includes subjective observation and the absorption of beliefs transmitted to us through various media such as conversation, written texts, television and so forth.

Belief formation is a learning process, and beliefs once formed become memories. In the brain, memories are expressed by configurations of neural networks. When we remember an experience, our brains make physical changes to networks: "short-term forms [of memory]... are expressed as alterations in the effectiveness of pre-existing connections... the long-term form often is associated with the growth of new synaptic connections" (Hawkins et al, 2006). Physical changes create a functional change in the operational rules implemented by neurons and their synapses. The rules express how information is interpreted, which in turn determines which actions (such as muscle movements) are undertaken, thereby creating behaviours: “Rules are derived from contingencies, which specify discriminative stimuli, responses, and consequences” (Skinner, 1984).
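
The notion of memory as an alteration in the effectiveness of pre-existing connections can be illustrated with a minimal Hebbian update, in which a synaptic weight strengthens whenever the pre- and postsynaptic neurons are active together. The learning rate and weight values here are arbitrary.

```python
# A minimal Hebbian sketch: a synaptic weight grows when the pre- and
# postsynaptic neurons are active together, saturating at 1.0.
# Learning rate and values are illustrative, not physiological.

LEARNING_RATE = 0.1

def hebbian_update(weight, pre_active, post_active):
    if pre_active and post_active:
        weight += LEARNING_RATE * (1.0 - weight)  # strengthen, saturating at 1.0
    return weight

w = 0.2
for _ in range(5):                 # five paired activations
    w = hebbian_update(w, True, True)
print(round(w, 3))                 # → 0.528
```

Repeated co-activation asymptotically saturates the weight, a crude stand-in for a connection becoming reliably effective.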

The limbic system is responsible for things like emotions, instincts/basic drives, motivation, mood, pleasure/pain, smell and memory. The hippocampus (so named because in section its shape reminded early anatomists of a seahorse) is believed to mediate the construction of episodic and spatial memory which enables you to “play the scene back” by reactivating memories formed in the cortex. It is well known that experiences which involve an emotional response are more memorable; it could be that the hippocampus is activated by emotions and functions as a gatekeeper which decides whether short-term cortical memories should be reinforced to turn them into long-term ones. The hippocampus receives modulatory input from the serotonin, norepinephrine, and dopamine systems, the nucleus reuniens of the thalamus, and cholinergic and GABAergic signals from the medial septal area.

The entorhinal cortex serves as the main "interface" between the hippocampus and other parts of the brain. Within the hippocampus, the flow of information is largely unidirectional, as shown in the figure.

The hippocampus shows two modes of activity. Theta rhythm, characterised by regular waves, appears during active behavior and REM (dreaming) sleep. During non-REM sleep and when an animal is resting or otherwise not engaged with its surroundings, the hippocampus shows a pattern of irregular slow waves, occasionally interrupted by large surges called sharp waves. Wilson and McNaughton (1994) found that when hippocampal place cells have overlapping spatial firing fields (and therefore often fire in near-simultaneity), they tend to show correlated activity during sleep following the behavioral session. The "two-stage memory" theory (Buzsáki, 1989) proposes that memories are stored within the hippocampus during behavior, and then later transferred to the neocortex during sleep. However, it is also possible that memories are formed within the cortex and that hippocampus acts as a memory reinforcer through its sharp waves.

Procedural memory, such as knowing how to ride a bike, does not appear to involve the hippocampus, but appears to be associated with modifications in the cerebellum, the basal ganglia, and the motor cortex, all of which are involved in motor control (Dubuc, 2002).


Little E. coli, squirming her way through her world of chemical fields, doesn't have a brain of neural networks, but she does have a memory, one inherited from her mother, who died in the act of producing twin daughters made of her own body and materials she had made from the food she had eaten during her 20-minute lifetime. Her daughters' knowledge of which chemicals are good to eat and which are not is stored in the genome, the magic recipe of life, which tells each of them how to make all her body parts, including those involved in the processing of signals from the external environment.

And the genome also tells her how and when to make daughters of her own, with their own membranes and receptors and transducers. She will pass this knowledge on to them, the same way her mother taught her, by providing each of them with her genome. The genome is the master memory of a bacterium, responsible for building all of its apparatus. Daughters of bacteria are as identical as identical twins can be, but they are not exactly identical: "The variability in genome content among closely related strains of prokaryotes has been one of the most remarkable discoveries of genomics" (Cuadros-Orellana et al, 2007).

It used to be thought that variation arises as a result of random mutations during reproduction, but Nature is not random. To every thing, there is a reason. Some diseases and deformities of bacteria arise from imperfect cloning during reproduction, but this is caused by environmental conditions (and perhaps "flaws" in their genomes); others occur because different mothers have different learning experiences, some of which can change their genomes: "Specialized enzymes under regulatory control can remove DNA segments, move segments from one site to another, reverse a segment’s orientation, or insert foreign segments" (McAdams and Arkin, 1998). This could explain how successive generations of bacteria learn to defend themselves against antibiotics; far from being sheer luck through random mutation, it is the consequence of some mothers being able to develop protection mechanisms, and passing this ability on to their daughters through an altered genome.

Populations of bacteria and viruses undergo genetic drift, as the descendants of individuals which have learned to defend themselves against antigens manufactured by eukaryotic immune systems and medical laboratories survive these onslaughts and come to dominate the population.

Making Decisions edit

One evening, or so the story goes, a stressed-out business executive comes home.

"You look tired, darling", says his wife, "would you like a cup of tea?"

"For heaven's sake!" explodes the executive, "Do I have to make all the decisions around here!!"

To tea, or not to tea, that is the question. Back in 18xx, Jeremy Bentham pondered this kind of issue and proposed a principle that has since come to be called Utility Theory. His idea was that the expected net value (utility) of the outcomes of each alternative to the decision maker could be measured or calculated and compared arithmetically, the one with the greatest utility becoming the preferred choice. This general approach became the foundation of Operations Research.

Let u_t be the calculated utility of having tea, and let u_n be the calculated utility of not having tea. Then, if u_t > u_n, the choice is tea.

The function calculated by an axon hillock lends itself readily to this kind of arithmetic judgement. In the example, suppose that the utilities can somehow be calculated, and transmitted along two neurons, t and n. Neuron t makes an excitatory synapse with neuron d, and neuron n an inhibitory one. An impulse will be induced in d if the degree of excitation from t sufficiently exceeds the degree of inhibition from n to reach the potentiation threshold of d.
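
A sketch of this judgement in code, assuming the two utilities arrive as plain numbers (the threshold value is arbitrary):

```python
# Toy "axon hillock" decision: neuron d fires only if the excitation from
# neuron t exceeds the inhibition from neuron n by at least its threshold.

def d_fires(u_t, u_n, threshold=0.5):
    # Net input = excitatory minus inhibitory, compared with the potentiation threshold.
    return (u_t - u_n) >= threshold

assert d_fires(3.0, 1.0)       # tea wins comfortably: an impulse is induced
assert not d_fires(1.2, 1.0)   # difference too small to reach threshold
```

The threshold makes the neuron a comparator with a dead zone: small utility differences produce no decision at all, which is itself a plausible feature of real deliberation.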

Only our ability to learn and change our minds protects us from living in the darkness of prejudice. However, one of the mysteries of human nature is how organisms that are built to learn can stubbornly refuse to do so, clinging to beliefs that fly in the face of all the available evidence, or continuing behaviours like smoking that provide short-term gratification at the expense of foreseeable long-term demise. The same kind of phenomenon is observable in people’s rationally inconsistent decisions about time-shifted alternatives (Gilbert, 2005). One possible reason may be that the conscious reasoning mind is not the one in control, but is merely an advisor to an emotion-ridden subconsciousness – what Freud (1923) called the “id”.

What is consciousness? Although it is the least scientifically understood biological phenomenon, it is the one subject on which non-scientists have the fewest doubts. They just know it is something they have – and most are equally convinced that it is something unique to humans, as convinced as religious adherents of the 19th century were that humans were qualitatively different from all other creatures, even when Darwin came along and suggested otherwise. The very same egocentrism that led men to believe they were closer to God than the beasts of the field drives people these days to assert, unequivocally, that whereas computers might be able to beat them at chess, and translate languages better than they, and calculate faster than they, computing artifacts could never reach the mystical heights of self-awareness, simply because they cannot possibly imagine how such a thing could be possible.

But such a mindset is anthropocentrism in its purest form. We can take a different view. However, different views can take a long time to become accepted in the face of opposition from individual egos and social forces. For example, around 400BC, Pythagoreans deduced from the regularity of day and night that the Earth orbited a fire. We don’t know whether they realised this fire was the same Sun they saw rising at dawn and setting at dusk. Their theory was hotly contested by Plato and his followers, who were devout geocentrists. Such was their social dominance that even when, 100 years later, Aristarchus – perhaps having seen an eclipse – formulated the first mathematical theory of heliocentrism, he was dismissed to obscurity as a crank. Another 100 years after that, the empire of Rome arose and institutionalised geocentricity as a religious dogma that ostracised science right up to the time of Galileo, 1500 years later. But in the last 200 years or so, science has steadily gained greater social respectability, and it may not require another 2000 years before the currently ubiquitous belief that Man is something more than merely an intelligent plastic machine is itself consigned to the history of myth.

Like many mysteries, consciousness might best be understood by approaching it from a lateral-thinking direction – by considering its counterparts: unconscious and subconscious thought (unfortunately, everyday English does not clearly distinguish the two). A good punch on the head can make a boxer fall to the floor, incapable of voluntary muscular control. He can neither hear nor see, and does not respond to touch, although his heart keeps beating and his electroencephalogram (EEG) doesn’t flatline. He’s not dead, he’s been knocked out. He’s unconscious.

An electric current travelling through a wire produces an electromagnetic field around it. So does an electric potential travelling down an axon. The EEG is an image of the changing electromagnetic radiation produced by neurons, which is picked up by scalp electrodes. With increasing levels of anesthesia, the EEG shows more regularity, and frequencies are decreased (Jordan et al, 2008).

Brain rhythms

Delta: 0.01 – 4 Hz. Delta wave activity occurs most frequently during stage N3 slow-wave sleep, accounting for 20% or more of the EEG record during this stage. These waves are believed to originate in the thalamus in coordination with the reticular formation.

Theta: 4 – 8 Hz. In rats, hippocampal theta is seen when an animal is active or fearful and during REM sleep. In humans, cortical theta appears during drowsy, meditative, or sleeping states, but not during the deepest stages of sleep. It is commonly argued that cholinergic receptors do not respond rapidly enough to be involved in generating theta waves, and therefore that GABAergic signals must play the central role.

Alpha: 8 – 12 Hz. Alpha waves originate from the occipital lobe during wakeful relaxation with closed eyes. They are reduced with open eyes, drowsiness and sleep, and are thought to represent the activity of the visual cortex in an idle state.

Beta: 12 – 30 Hz. Low-amplitude beta waves with multiple and varying frequencies are often associated with active, busy, or anxious thinking and active concentration. Over the motor cortex, bursts of beta activity are associated with motor control and are reduced when there is a change in movement. Beta activity is increased when movement has to be resisted or voluntarily suppressed.

Gamma: 30 – 100+ Hz. Gamma waves are observed as neural synchrony in response to visual cues, in both conscious and subliminal stimuli; this research also sheds light on how neural synchrony may explain stochastic resonance in the nervous system. Gamma waves are also implicated in REM sleep, which involves visualizations, and in anesthesia.
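
These bands can be picked out of a sampled signal numerically. Here is a hedged sketch using a naive discrete Fourier transform (standard library only; the sample signal and rate are invented for illustration):

```python
import cmath, math

# Classify the dominant frequency of a sampled signal into the EEG bands
# listed above, using a naive O(n^2) discrete Fourier transform.

BANDS = [("delta", 0.01, 4), ("theta", 4, 8), ("alpha", 8, 12),
         ("beta", 12, 30), ("gamma", 30, 100)]

def dominant_band(samples, sample_rate):
    n = len(samples)
    # Magnitude of each DFT bin up to (but excluding) the Nyquist frequency.
    mags = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(1, n // 2)]
    peak_hz = (mags.index(max(mags)) + 1) * sample_rate / n
    for name, lo, hi in BANDS:
        if lo <= peak_hz < hi:
            return name
    return None

# A 10 Hz sine wave, one second at 64 samples/s, lands in the alpha band.
signal = [math.sin(2 * math.pi * 10 * t / 64) for t in range(64)]
print(dominant_band(signal, 64))   # → alpha
```

Real EEG analysis uses windowed FFTs and band power rather than a single peak, but the band boundaries are exactly those listed above.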


The brain creates a model of the world around it. That’s how we navigate, how we avoid obstacles and why we refrain from trying to walk through walls. The brain also creates a model of the body it inhabits. It can sense the heart’s beat, the stomach’s ache, the muscle’s cramp, the flush’s heat. The very same mechanisms that create the brain's model of the body can be applied to enable it to create a model of itself – and, recursively, a model of its model of itself, and so on.

... A single cubic millimeter of cortex – the spatial resolution of current fMRI technology – contains an astounding array of energy-regulating equipment: 13,000 pyramidal neurons, 24,000 glial cells, 100 billion synapses, and 100 metres of axons (Pakkenberg et al, 2003).


Emergent Intelligence edit

The combined effects of simple operations can produce complex behaviours.

The figure shows three of the 13 rules that make up the logic of a virtual spider (Krink and Vollrath, 1998) that creates a virtual web similar to a garden spider's. These rules control the building of the Capture Spiral (CS). Each rule consists of a precondition (upper box) and an action (lower box). The directions of the arrows represent the activity flow. Each rule activation and execution starts with a query to the sensory system (grey arrows) and results in the execution of a behaviour pattern that performs a conditional activity loop (black arrows).

The first rule (1) directs the spider along the auxiliary spiral (AS) with its inner leg (IL) to the next crossing with a radius. Afterwards, the second rule (2) makes the spider search for outer frame threads and previously constructed CS threads with its outer leg (OL). When the spider finally grasps a thread or stretches its outer leg entirely (3), it attaches a new CS thread on the previously detected radius. The location of the new attachment point is determined by a compromise between the previous distance between the AS and the last CS turn as well as the spider's anticipated mesh size of the CS.
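
This precondition/action architecture can be sketched as a tiny rule interpreter. The rule bodies below are simplified stand-ins for illustration, not the published rule set:

```python
# Toy rule interpreter in the spirit of Krink and Vollrath's precondition/action
# rules. Each cycle: rule 1 moves to a radius, rule 2 searches with the outer
# leg, rule 3 attaches a capture-spiral (CS) thread. Contents are invented.

def follow_aux_spiral(s):
    s["at_radius"] = True

def search_with_outer_leg(s):
    s["thread_found"] = True

def attach_cs_thread(s):
    s["cs_threads"] += 1
    s["at_radius"] = False
    s["thread_found"] = False

RULES = [
    (lambda s: not s["at_radius"], follow_aux_spiral),                           # rule 1
    (lambda s: s["at_radius"] and not s["thread_found"], search_with_outer_leg), # rule 2
    (lambda s: s["at_radius"] and s["thread_found"], attach_cs_thread),          # rule 3
]

def step(state):
    for precondition, action in RULES:
        if precondition(state):    # query the sensory system
            action(state)          # execute the behaviour pattern
            return

state = {"at_radius": False, "thread_found": False, "cs_threads": 0}
for _ in range(9):                 # nine steps = three full rule cycles
    step(state)
print(state["cs_threads"])         # → 3
```

The point is that no rule knows about the web as a whole; a repeating local cycle of sense–act rules is enough to lay down thread after thread.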

It takes an orb weaver spider about 30 to 45 minutes to make its web. The process starts with an initial adhesive silk thread extruded from the silk-spinning organs (spinnerets). Air currents waft the thread across a gap in the foliage until it entangles in leaves or twigs on the other side. When it sticks to a surface, the spider carefully walks over the thread, then moves back and forth across this bridge line, strengthening it by laying down more silk. It then drops from the bridge line's centre to attach a vertical line to the ground. This provides the basic Y-shaped framework to which are then added the supporting outer frame-lines and the radial lines (the 'spokes') on which the spiral lines are laid.

A non-sticky, temporary spiral line is laid down first, starting from the centre and running outwards. This temporary spiral gives the spider a 'scaffolding' from which it then lays down the more closely spaced, permanent, sticky spiral, working from the periphery toward the centre or hub. The spacing between successive turns is proportional to the distance from the tip of the spider's back legs to its spinnerets. The spider removes and rolls up the temporary spiral as it lays down the sticky spiral; the excess silk is eaten and recycled.

Emergent behaviours are displayed by many species: colonies of ants, mounds built by termites, swarms of bees, schools of fish, flocks of birds, and herds of mammals. Individually, ants aren't particularly clever. But collectively, a colony of ants can solve complex problems, such as finding the shortest path to the best food source or allocating workers to different tasks.

Swarm intelligence emerges from collective decision-making and communication. In an ant colony, no-one's in charge. No generals command ant warriors. No managers boss ant workers. The queen plays no administrative role except to lay eggs. Yet even with half a million members, an ant colony functions just fine without hierarchical management. It relies instead upon countless interactions between individual ants, each of which follows simple rules of thumb, reacting according to its innate programming to local stimuli in the form of chemical scents from larvae, other ants, intruders, food and waste. As it moves, it leaves behind a chemical trail, which provides information to other ants. Using distributed decision-making and chemical communication, ant colonies exhibit complex behaviours.

Consider the problem of task allocation. How many workers should go out foraging for food? The number can change, depending on conditions. Ants communicate by touch and smell. When one ant bumps into another, it sniffs with its antennae to find out if the other belongs to the same nest and where it has been working - ants that have been outside smell different from those that have stayed inside. Before they leave the nest each day, foragers normally wait for early morning patrollers to return. As patrollers enter the nest, they touch antennae briefly with foragers. After a forager has contact with several patrollers in a short space of time, it will go out and follow the scent trails laid by the patrollers. Once ants start foraging and bringing back food, other ants join the effort, depending on the rate at which they encounter returning foragers.

A forager won't come back until it finds something. The less food there is, the longer it takes the forager to find it and get back. The more food there is, the faster it comes back. Thus, effectively, the colony decides whether it's a good day to forage, but no single individual is managing the process. That's how swarm intelligence works: simple creatures following simple rules, each one acting on local information. No ant sees the big picture. No ant tells any other ant what to do.
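This encounter-rate decision rule can be captured in a few lines. The sketch below is a deliberately simplified model - all thresholds, rates and window sizes are invented for illustration: a waiting forager counts its contacts with returning ants over a short window and leaves the nest only if they are frequent enough.

```python
import random

def foragers_leaving(encounter_rate, threshold=3, window=10,
                     trials=10000, seed=0):
    """Fraction of waiting foragers that go out, given the rate (per tick)
    at which returning foragers arrive at the nest entrance."""
    rng = random.Random(seed)
    left = 0
    for _ in range(trials):
        # count chance encounters with returning foragers in the window
        meetings = sum(rng.random() < encounter_rate for _ in range(window))
        if meetings >= threshold:
            left += 1
    return left / trials

# Plenty of food -> foragers return quickly -> high encounter rate at the
# nest entrance -> more waiting foragers head out. Scarce food -> few return
# -> few leave. The 'decision' is made by no ant in particular.
print(foragers_leaving(0.6) > foragers_leaving(0.1))  # -> True
```

The colony-level response to food availability falls out of each ant's private tally of antennal contacts; no individual ever measures how much food there is.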

The impressive capabilities of heterarchical organisations have inspired the development of artificial swarm intelligence systems, made up of a population of simple agents interacting locally with one another and with their environment. The technique has been applied to packet routing in telecommunication networks (Dorigo and Stützle, 2004). Software 'ants' - simulation agents - collectively converge upon an optimal solution as they move through a simulated parameter space. The 'ants' record their positions and the quality of their solutions, so that in later iterations, more agents locate better solutions.
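A minimal version of the idea, in the spirit of Dorigo and Stützle's Ant Colony Optimization but with all parameters and the example graph invented for illustration, looks like this: artificial ants walk a weighted graph, biased towards strong pheromone trails, and shorter completed paths receive more pheromone.

```python
import random

def aco_shortest_path(graph, start, goal, n_ants=20, n_iters=30,
                      evaporation=0.5, deposit=1.0, seed=1):
    """graph: {(u, v): length}. Returns (best_path, best_length)."""
    rng = random.Random(seed)
    pheromone = {edge: 1.0 for edge in graph}

    def neighbours(u):
        return [v for (a, v) in graph if a == u]

    best = None
    for _ in range(n_iters):
        for _ in range(n_ants):
            path, node, visited = [start], start, {start}
            while node != goal:
                options = [v for v in neighbours(node) if v not in visited]
                if not options:          # dead end: abandon this ant
                    path = None
                    break
                # prefer edges with more pheromone and shorter length
                weights = [pheromone[(node, v)] / graph[(node, v)]
                           for v in options]
                node = rng.choices(options, weights=weights)[0]
                visited.add(node)
                path.append(node)
            if path is None:
                continue
            length = sum(graph[(path[i], path[i + 1])]
                         for i in range(len(path) - 1))
            if best is None or length < best[1]:
                best = (path, length)
            for i in range(len(path) - 1):   # shorter paths get more pheromone
                pheromone[(path[i], path[i + 1])] += deposit / length
        for e in pheromone:                  # trails evaporate over time
            pheromone[e] = max(pheromone[e] * (1 - evaporation), 0.01)
    return best

graph = {("A", "B"): 1, ("B", "D"): 1,   # short route A-B-D (length 2)
         ("A", "C"): 2, ("C", "D"): 3}   # long route A-C-D (length 5)
path, length = aco_shortest_path(graph, "A", "D")
print(path, length)  # -> ['A', 'B', 'D'] 2
```

Evaporation is what lets the swarm 'forget' stale routes: without it, an early mediocre path would keep attracting ants forever.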

In nature, many species travel in large numbers. That's because, as members of a big group, whether it's a flock, school, or herd, individuals increase their chances of detecting predators, finding food, locating a mate, or following a migration route. For these animals, coordinating their movements with one another can be a matter of life or death. It's much harder for a predator to avoid being spotted by a thousand fish than it is to avoid being spotted by one; news that a predator is approaching spreads quickly through a school because fish sense from their neighbors that something's going on. When a predator strikes a school of fish, the group is capable of scattering in patterns that make it almost impossible to track any individual. It might explode in a flash, create a kind of moving bubble around the predator, or fracture into multiple blobs, before coming back together and swimming away.
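Such schooling behaviour can be reproduced with three local rules per individual - steer towards nearby flockmates, match their average heading, and avoid crowding them - as in Reynolds' classic 'boids' model. The sketch below is a bare-bones 2D illustration; all weights and radii are arbitrary.

```python
def step(boids, sep=0.05, ali=0.05, coh=0.01, radius=5.0, sep_radius=1.0):
    """boids: list of (x, y, vx, vy). One simultaneous update of the flock."""
    new = []
    for i, (x, y, vx, vy) in enumerate(boids):
        nbrs = [b for j, b in enumerate(boids) if j != i
                and (b[0] - x) ** 2 + (b[1] - y) ** 2 < radius ** 2]
        close = [b for b in nbrs
                 if (b[0] - x) ** 2 + (b[1] - y) ** 2 < sep_radius ** 2]
        if nbrs:
            n = len(nbrs)
            cx, cy = sum(b[0] for b in nbrs) / n, sum(b[1] for b in nbrs) / n
            avx, avy = sum(b[2] for b in nbrs) / n, sum(b[3] for b in nbrs) / n
            vx += coh * (cx - x) + ali * (avx - vx)  # cohesion + alignment
            vy += coh * (cy - y) + ali * (avy - vy)
        for b in close:                              # short-range separation
            vx += sep * (x - b[0])
            vy += sep * (y - b[1])
        new.append((x + vx, y + vy, vx, vy))
    return new

# Three 'fish' heading roughly east with mismatched headings; after a few
# hundred updates they travel as a compact, aligned group.
flock = [(0.0, 0.0, 1.0, 0.2), (0.0, 2.0, 1.0, -0.2), (0.0, 4.0, 1.0, 0.1)]
for _ in range(300):
    flock = step(flock)
```

No individual intends the group's shape; each only reacts to the neighbours it can 'see', which is exactly why the school can scatter and re-form so fluidly around a predator.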

Google uses swarm intelligence in its search engine. When you type in a search query, Google surveys billions of web pages on its index servers to identify the most relevant ones. It then ranks them by the number of pages that link to them, counting links as votes (the most popular sites get weighted votes, since they're more likely to be reliable). The pages that receive the most votes are listed first in the search results. In this way, Google uses the collective intelligence of the web to determine a page's importance.
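The 'links as weighted votes' idea is essentially the PageRank algorithm, which can be computed by repeatedly passing each page's score along its outgoing links until the scores settle. Here is a toy sketch on an invented four-page web (the pages, links and damping constant are illustrative, not Google's actual system):

```python
def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]}. Returns page -> rank score."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}  # 'random surfer' share
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)   # a page splits its vote
                for q in outs:                # among the pages it links to
                    new[q] += damping * share
            else:                             # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # -> C (every other page votes for it)
```

Note that a vote from a highly ranked page counts for more, because that page has more rank to pass on - which is precisely the 'weighted votes' effect described above.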

Wikipedia has also proved to be a big success, with millions of articles in more than 200 languages about everything under the sun, each of which can be contributed by anyone or edited by anyone. It's now possible for huge numbers of people to think together in ways we never imagined a few decades ago. No single person knows everything that's needed to deal with problems we face as a society, such as health care or climate change, but our collective knowledge is quite extensive.

For a group to be intelligent, whether it's made up of ants or attorneys, it needs each and every one of its members to do their individual jobs. Think about a honeybee as she walks around inside the hive. If a cold wind hits the hive, she'll shiver to generate heat and, in the process, help to warm the nearby brood. She has no idea that hundreds of workers in other parts of the hive are doing the same thing at the same time to the benefit of the next generation. None of us knows what society as a whole needs, but we can look around and say, oh, they need someone to volunteer at school, or mow a communal lawn, or help in a political campaign. If you're looking for a role model in a world of complexity, you could do worse than to imitate a bee (Miller, 2007).
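The bee's shivering rule makes a nice illustration of how a global property - a stable hive temperature - can emerge with no thermostat in charge. In this invented toy model (every threshold and rate constant is made up), each bee shivers whenever the temperature drops below its own, slightly individual, threshold:

```python
import random

def simulate_hive(n_bees=100, ambient=10.0, setpoint=32.0, noise=1.0,
                  heat_per_bee=0.05, loss_rate=0.1, steps=500, seed=0):
    rng = random.Random(seed)
    # each bee has a slightly different shivering threshold
    thresholds = [setpoint + rng.uniform(-noise, noise) for _ in range(n_bees)]
    temp = ambient
    for _ in range(steps):
        shivering = sum(1 for t in thresholds if temp < t)  # each bee's rule
        temp += heat_per_bee * shivering      # heat added by shivering bees
        temp -= loss_rate * (temp - ambient)  # heat lost to the outside air
    return temp

print(simulate_hive())
```

Although no bee knows the colony's target temperature, the hive settles into a band close to the 32-degree setpoint even with the outside air at 10 degrees: as the hive cools, more individual thresholds are crossed and more bees shiver.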

Social and political groups are swarms. During mass protests in Seattle, anti-globalization activists used mobile communications devices to spread news quickly about police movements, turning an otherwise unruly crowd into a "smart mob" that was able to disperse and re-form like a school of fish.

The collective power of swarms is not news; religious and political elites have known it for millennia and devote most of their time and effort to propaganda to program the swarm. Once programmed, swarm members rally to the cause, even to the extent of martyring themselves for the sake of what they have been conditioned to believe is a moral obligation greater than their personal survival.

The collective intelligence of a web of neurons emerges from the interactions of its individuals. The time-lapse series of brain images in Figure, taken from 14ms to 60ms after the single touch of a single whisker, shows the spreading flame of thought in a mouse's brain (Ferezou et al, 2007).

But what are thoughts? One thing is for sure, they are not simple reflexes like those of the....


The Big Picture

If this book ever manages to evolve into a coherent document, this chapter will have moved to the beginning, rather than the end. As it is still here, that state of perfection has not yet been reached!

No scientist worth his denarius would claim to know everything - and no scientist worthy of the name would claim to know anything for sure; there is just so much to know about even the tiniest of details (does 1 plus 1 always equal 2? how about one quark plus one quark? one graviton plus one graviton?), it surely needs more than one head to contain all that knowledge. The preceding chapters are an attempt to reliably report on the findings of science so far; all their statements are corroborated by cited experimental evidence. This chapter is different - it addresses open questions; questions that have not yet been resolved (and in some cases, not even raised). Its hypotheses are intended as food for thought for the next generation of scientists to explore and shed more light on the miracles of life and existence.

Because of the nature of written communication and the limitation of having only two spatial dimensions (plus hyperlinks) to work with, this chapter is organised into sections - but these are not conceptually separate topics; they are all intricately interrelated. Some of the phenomena and their interrelationships will be mentioned; the rest - the majority, it must be admitted - are for you to find out.

Who's in charge?

In feudal societies, the King or Lord is the Chief Executive Officer of the population. In ant colonies, the Queen is a humble servant of the brood.

Most human-engineered systems are hierarchical; the principle of "divide and conquer" was successfully deployed by Caesar for building his empire, by Polya (ref) for solving mathematical problems, and by von Neumann (ref) for designing the architecture of the artificial computer, an architecture that remains the blueprint of today's technology.

But who's in charge of the human? Is it the mind or the body?

more to come

The mathematics of beauty

From the earliest days to the present, mathematicians have been obsessed by the beauty of simplicity: the perfection of the straight line, the circle, the dot, the equation. We delight in Einstein's famous equation relating mass and energy - it's so elegantly simple, it must be right (as far as we know....!).

But why, you might wonder, do they have this obsession? Is there some underlying cause? Many people would argue that it is "self-evident" - but self-evidence is no substitute for a rational argument; it really means "I don't know, go ask your mother!"

more to come

Why do children ask "Why?"?

Entrepreneurs, engineers and people struggling to make ends meet are largely unconcerned about "why?" as they are too preoccupied trying to figure out "how?". But for children (and adults beset by a tragedy, and the odd philosopher (most philosophers are a bit odd)), the question "why?" always raises itself in their minds. Why is the sky blue? Why do girls like pink things? Why do I have to go to bed? Why can't I do (whatever it is that is being denied them)? Why do clapping hands make noise? Does a tree falling in the forest make a sound if no-one's there to hear it? Why bring that up?....

"Why?" is the ultimate academic question. But that doesn't mean it is of no practical value - on the contrary, it is essential for survival. Knowledge is not power (only knowledge of the tricks and techniques of persuasion is power), but it is necessary for being able to build a causal model of how the world works, one that is sufficiently accurate as to be able to survive in it.

Bacteria manage to get along just fine without causal models... as a population, but not as individuals. Their machinery of life is highly adaptive, as indeed is the machinery of viruses, which can evolve faster than antibioticians can catch up with them; like Achilles, the antibiotician only manages to reduce the difference between himself and where the tortoise-paced virus was the last time he saw it - but by the time he gets there, the virus has moved on. Yet a given individual bacterium is as helpless as a new-born babe - and a new-born babe is (almost) as helpless as a bacterium. It has the equipment for its cells to respire, but beyond being programmed to seek what it "thinks" is its mother's nipple, it doesn't know how to get the food its cells need to respire.

more to come

References

Aizenberg, J., Tkachenko, A., Weiner, S., Addadi, L., & Hendler, G. (2001) Calcitic microlenses as part of the photoreceptor system in brittlestars. Nature 412, 819-822.

Alexander GM, Kurukulasuriya NC, Mu J, Godwin DW (2006) Cortical feedback to the thalamus is selectively enhanced by nitric oxide. Neuroscience. Jul 28.

Allwood, A.C., Walter, M.R., Kamber, B.S., Marshall, C.P. and Burch, I.W. (2006) Stromatolite reef from the Early Archaean era of Australia. Nature 441, 714-718.

Aristotle (350BC), Metaphysics.

Bancroft, J. (2005) The endocrinology of sexual arousal. Journal of Endocrinology 186, 411-427.

Bear MF, Connors BW, Paradiso MA. Neuroscience - Exploring the Brain. Lippincott, Williams and Wilkins.

Beauregard, M. (2007) Mind does really matter: Evidence from neuroimaging studies of emotional self-regulation, psychotherapy, and placebo effect. Progress in Neurobiology 81, 218-236.

Buck, L. and Axel, R. (1991) Cell, 65, 175-187.

Buzsáki, G. (1989) Two-stage model of memory trace formation: a role for "noisy" brain states. Neuroscience 31: 551–70.

Cuadros-Orellana, S., Martin-Cuadrado, A., Legault, B., D'Auria, G., Zhaxybayeva, O., Papke, R.T. and Rodriguez-Valera, F. (2007) Genomic plasticity in prokaryotes: the case of the square haloarchaeon. The ISME Journal 1, 235–245.

Dorigo, M. and Stützle, T. (2004) Ant Colony Optimization. MIT Press.

Dubuc, B. (2002) The Brain from Top to Bottom.

Ferezou, I., F. Haiss, L. Gentet, R. Aronoff, B. Weber, and C. Petersen (2007) Spatiotemporal Dynamics of Cortical Sensorimotor Integration in Behaving Mice. Neuron 56:907-923.

Freud, S. (1923) The Ego and the Id. Norton.

Ghizzoni, L., Mastorakos, G. and Vottero, A. (2003) Adrenal Androgens. In: Chrousos, G. (Ed) Adrenal Physiology.

Gilbert, D. (2005) Our Mistaken Expectations.

Gillum, M.P., Zhang, D., Zhang, X-M., Erion, D.M., Jamison, R.A., Choi, C., Dong, J., Shanabrough, M., Duenas, H.R., Frederick, D.W., Hsiao, J.J., Horvath, T.L., Lo, C.M., Tso, P., Cline, G.W. and Shulman, G.I. (2008) N-acylphosphatidylethanolamine, a Gut-Derived Circulating Factor Induced by Fat Ingestion, Inhibits Food Intake. Cell 135, 5, 813-824.

Ginsburg, S. and Jablonka, E. (2008) Epigenetic learning in non-neural organisms. J. Biosci. 33(4).

Gödel, K. (1931) Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme, I. Monatshefte für Mathematik und Physik 38: 173-198.

Hawkins, G.S (1963) Stonehenge Decoded. Nature 200, 306 - 308.

Hawkins, R.D., Kandel, E.R. and Bailey, C. (2006) Molecular Mechanisms of Memory Storage in Aplysia. Biol. Bull. 210: 174-191.

Hebb, D.O. (1949) The organization of behavior. Wiley

Human Genome Project (1996) To Know Ourselves. US Dept of Energy.

Intel (2009)

Johanson, D. & Maitland, E. (1981) Lucy, the Beginnings of Humankind. Granada.

Jordan, D., Stockmanns, G., Kochs, EF., Pilge, S. and Schneider, G. (2008) Electroencephalographic Order Pattern Analysis for the Separation of Consciousness and Unconsciousness: An Analysis of Approximate Entropy, Permutation Entropy, Recurrence Rate, and Phase Coupling of Order Recurrence Plots. Anesthesiology 109, 6, 1014-1022.

Kennard, M.A. (1947) Autonomic Interrelations with the Somatic Nervous System.

Kolb, H. (2003) How the Retina Works. American Scientist 91, 28-35.

Krink, T. and Vollrath, F. (1998) Emergent properties in the behaviour of a virtual spider robot. Proc. R. Soc. Lond. B 265, 2051-2055.

Kumar, S., Filipski, A., Swarna, V., Walker, A. and Blair Hedges, S.B. (2005) Placing confidence limits on the molecular age of the human–chimpanzee divergence. PNAS 102, 52, 18842-18847.

Luckiesh, M. (1922) Visual Illusions: Their Causes, Characteristics and Applications. Van Nostrand.

Nuesslein-Volhard, C. and Wieschaus, E. (1980). Mutations Affecting Segment Number and Polarity in Drosophila. Nature 287, 795-801.

Markov, A.A. (1960) The Theory of Algorithms. American Mathematical Society Translations 2, 15, 1-14.

Marlovits T.C., Haase W., Herrmann C., Aller S.G., Unger V.M. (2002) The membrane protein FeoB contains an intramolecular G protein essential for Fe(II) uptake in bacteria. PNAS 99(25):16243-8.

Miller, P. and Wang, X-J (2006) Inhibitory control by an integral feedback signal in prefrontal cortex: A model of discrimination between sequential stimuli. PNAS 103,1, 201–206.

Mori IC, Murata Y, Yang Y, Munemasa S, Wang Y-F, et al. (2006) CDPKs CPK6 and CPK3 Function in ABA Regulation of Guard Cell S-Type Anion- and Ca2+- Permeable Channels and Stomatal Closure. PLoS Biol 4(10): e327.

Nobel (2004) The Nobel Prize in Physiology or Medicine 2004.

Ochsner, K.N. and Barrett, L.F. () A Multiprocess Perspective on the Neuroscience of Emotion. In T. Mayne & G. Bonnano (Eds.), Emotion: Current Issues and Future Directions. Guilford Press.

Panksepp, J. (2003) Feeling the Pain of Social Loss, Science 302, 5643, 237 – 239.

Paz-Filho, G., Wong, M-L., and Licinio, J. (2003) Circadian Rhythms of the HPA Axis and Stress. In: Chrousos, G. (Ed) Adrenal Physiology.

Peterlin, Z., Li, Y., Sun, G., Shah, R., Firestein, S., and Ryan, K. (2008) The Importance of Odorant Conformation to the Binding and Activation of a Representative Olfactory Receptor. Chemistry & Biology 15, 12, 1317-1327.

Purves et al., () Life: The Science of Biology. Sinauer Associates.

Ramachandran, V. S. and Blakeslee S. (1998). Phantoms in the Brain. Harper Perennial.

Schrödinger, E. (1944). What is Life - the Physical Aspect of the Living Cell. Cambridge University Press

Selfridge, O.G. (1959) Pandemonium: A paradigm for learning In: D. V. Blake and A. M. Uttley, eds, Proceedings of the Symposium on Mechanisation of Thought Processes, 511-529.

Shakespeare, W. () Henry V, Act 3, Scene 1.

Shannon, C.E. (1948) A Mathematical Theory of Communication, Bell System Technical Journal, 27, 379–423.

Skinner, B.F. (1984) An Operant Analysis of Problem Solving. Behavioral and Brain Sciences, 7, 583.

Stange, G. and Howard, J. (1979) An Ocellar Dorsal Light Response in a Dragonfly. J. Experimental Biology 83, 351-355.

Tinbergen, N. (1951) The Study of Instinct. Clarendon Press.

Turing, A.M. (1936) On Computable Numbers, with an Application to the Entscheidungsproblem. Proc. of the London Mathematical Society, Series 2, 42: 230-265, 1937.

Ubuka T, Morgan K, Pawson AJ, Osugi T, Chowdhury VS, Minakata H, Tsutsui K, Millar RP, and Bentley GE (2009) Identification of Human GnIH Homologs, RFRP-1 and RFRP-3, and the Cognate Receptor, GPR147 in the Human Hypothalamic Pituitary Axis. PLoS ONE 4(12): e8400. doi:10.1371/journal.pone.0008400

Veraksa,A., Del Campo,M. and McGinnis,W. (2000) Developmental Patterning Genes and Their Conserved Functions: From Model Organisms to Human. Molecular Genetics and Metabolism 69, 85–100.

Waters, C.M. and Bassler, B.L. (2005) Quorum Sensing: Cell-to-Cell Communication in Bacteria. Annual Review of Cell and Developmental Biology, 21, 1, 319-346.

Wilson M.A., and McNaughton, B.L. (1994) Reactivation of hippocampal ensemble memories during sleep. Science 265: 676–7.

Yamamura, S. and Hasegawa, K. (2001) Chemistry and biology of phototropism-regulating substances in higher plants. The Chemical Record, 1, 5, 362-372.