WHO'S IN YOUR GENES
A United States Presidential proclamation declared the period from 1990 to 2000 "the Decade of the Brain". This Presidential decree was a symbolic act, giving us a ten-year line that divides mankind's technological preoccupation with the brain from our psychological preoccupation with the mind. That isn't to say we have magically given up on our quest to understand what exactly "mind" is, but a great deal of intellectual guesswork took place before we had scanning and imaging technology to monitor blood flow to the head, giving us a topography of the dynamic, ever-changing landscape of the human brain. The Decade of the Brain was practically buttressed by a new decade ushered in a mere three years later: the Decade of the Genome. In 2003 the human genome was mapped in its entirety, allowing later researchers to elucidate the genetic origins of human behavior. In all honesty, we still have a soft understanding of the hard roles that genes play in human behavior, so you'll often hear non-committal phrasing such as a gene being "associated" with a behavior (rather like introducing the person with you as an "associate" and not a friend or lover). Nevertheless, the two decades from 1990 to 2010 seemed to bring closure to many of our age-old controversies of the human mind: the mind is a symptom of the brain's neurobiological activity.
However, a few ancient questions still seem to linger in the face of all our hard evidence about the brain. These questions stuck around not despite hard evidence, but because of it. Philosophers and psychologists still wonder whether consciousness and observed reality are one (monism) or two (dualism). Another way to put it: what exactly is matter before or after we observe it? This may seem like tired philosophy, but I remember a physicist once saying of the electron: "when we're looking at it, it behaves like a particle. When we're not looking at it, it seems to behave like a wave." We may never know what matter is when we leave it well enough alone, but we have a pretty good idea of what it is when we are touching, tasting and seeing it. The brain is our ambassador to matter, breaking it down into "electrical signals" to be processed by neurons and transmitted along neural pathways so that we can have some kind of idea about what we're dealing with.
The operative word here is "idea".
An idea is pretty much all we are allowed to have. If you were to hop in your car and take a tour across the country – or better yet, take a global tour – making stops along the way to ask people about the world they live in, you'd see just how many ideas about reality are floating around out there. Like snowflakes, no two are alike. It's a strange outcome considering that matter should provide the basis for some type of consensus. But it seems to do the exact opposite. The mind is a cesspool of subjectivity, and even in our world of objects every one of us struggles to be objective. Or, to put it another way, my "objectivity" clashes with yours. If we humans somehow lived in a cyberpunk fantasy where we were minds only, consensus would be extremely difficult. Matter may be nothing more than a stage that we all stand on. This material stage is about the only thing we have in common. The performances that take place on this stage are as varied as the perceptions of the many performers. But if this is truly axiomatic, then why, throughout human history, has there been so much effort by one group to subjugate the perceptions of another? Why do we seem always to demand conformity to our spin on reality? Exhibit A would be religion. A large portion of history's body count can be attributed to man's insistence on imposing his religious perceptions on another person or people. If they resist, things get bad. But religious imperialism isn't the only offender. In fact, religion isn't the offender at all. Religion is just a lens that shapes a person's view of reality. Science is another lens that shapes our view of reality. It should not surprise us that science, like religion, has been just as imposing and imperialistic, using its materialist view as a ruler for diagnosing the level of "primitivism" or "sophistication" of other people and cultures.
Of course, the materialistic view as a standard is not shared by all people, and so we have a conflict of perceptions very similar to the conflicts between Christian missionaries and their prospective converts. Michael Adas, in his book Machines as the Measure of Men, provides an excellent account of how "achievements in material culture, especially those relating to technology and science, shaped European perceptions of non-Western peoples even before the Industrial Revolution." Adas describes a seventeenth-century doctor, François Bernier, who "had strong reservations about earlier travelers' brief but generally positive assessments of Indian medical knowledge and practices." Bernier made his own trips to India and was forced to "concede that Hindu physicians had achieved some success with cures that stressed dietary restrictions", but in the end his observations of a medically advanced culture were overshadowed by his observations of its other cultural aspects. That the Hindus held the body to be sacred, resulting in a prohibition of such practices as dissection and "bleeding", is a perfect example of the clash between a culture that holds some things to be sacred and one that holds all things to be secular.
Adas also provides a compelling illustration of the arrival of the technologically ambitious Europeans in China during the Qing Dynasty. The Europeans, apparently, were prepared to impress their hosts with all of their gadgets. However, they were quite shocked to discover that the Chinese had many gadgets of their own. Qing China had a long relationship with invention, and it probably made the visiting Westerners feel better about themselves to say that the Chinese had failed to take their own inventions such as printing, the compass and gunpowder beyond "mere curiosities", and that it was the Europeans who perfected these technologies. Consequently, the visiting Europeans became increasingly frustrated by China's complacency with antiquated technology and its sluggish adoption of upgrades. When the Chinese summarily rejected Western science, technology – and religion – this was taken as an insult. Before long, English writers who had never even been to China began to describe the Chinese as a "miserable people." As it turns out, the feeling was mutual. The Chinese, at least according to the French writer and historian Voltaire, placed moral philosophy, ethics and conduct above the pursuit of technological gadgetry. So while the Europeans may have seen the Chinese as sluggish inventors, the Chinese saw the technologically reckless Europeans as "hairy barbarians."
Actually, the Chinese take on their European visitors makes for an interesting thought experiment. If ethics and conduct of living, instead of cool gadgets, were the ruler for measuring "primitiveness" and "sophistication", how would technologically advanced countries measure up? Would the negative symptoms of our technological progress – land and air pollution, obesity and "civilization diseases" – be evidence of our primitive or even pathological barbarism?
The Western world still clings very tightly to this rigid scientific and technological ruler. Countries with minimal technological abilities are called "developing" countries, a word that may not be as saturated in superiority as pejoratives like "primitive" or "savage", but which nevertheless implies the same perceived backwardness that was imposed on Qing China. Western science and technology continue to be noticeably impatient with cultural values that act as impediments. We express this impatience not just abroad, but even within our own borders. At a 1982 Economic Summit in Versailles, government heads of advanced industrial capitalist countries decided that the progress of the world economy would depend on the "exploitation of scientific and technological development". The heads of this Summit knew (perhaps from lessons in history) that "the fate of our scientific and technological innovations is largely a function of the willingness of the public to accept them. More attention to the problem of public acceptance of new technology is needed." (Public Acceptance of New Technologies: An International Review, Roger Williams & Steven Mills). Each of the participating developed countries conducted its own research to identify reasons for public resistance to new technologies; the US, specifically, found that "moral and religious values (played) a large role in shaping the American public's response to new technology." Consequently, PR campaigns were launched to bring the American public out of its perceptual dark ages.
The term "unscientific" became the new silver-bullet term for shooting down the antiquated ideas of modern people still lingering in the dark ages, still clinging to "primitive" and "superstitious" notions that their bodies and their environment were sacred and special entities deserving of respect. Of course, neither "sacred", "special" nor "respect" are scientific terms, and therefore they are scarcely admissible in any counterargument against scientific expansion.
This, at last, brings us back to the conundrum of what reality is to the brain. Is the only permissible perception of reality the scientific one? If the answer is no, then why are terms like "sacred", "special" and "respect" not permitted in those debates that will determine the limits of scientific expansionism? Words such as "sacred", "special" and "respect" all belong to the broad vocabulary of a human brain which seems to favor its own neurodiversity. It can be said that the rationalist's passive-aggressive preclusion of emotionally driven values, especially as it presents itself in cultural clashes, is a form of neurodiscrimination.
Science functions as the sole arbiter of knowledge and information in our society, its ideas accepted by most of us as fact. Ideas formed outside of science are suspected of being mere "beliefs" or "myths" lacking scientific verification. Science is granted this authority partly because of its track record of explaining how matter – the stage we all stand on – works. This authority, however, now extends beyond the stage to its very performers – explaining to us how we work, and why we perceive the way we do. "Scientifically minded" people will often use reason, logic and rational thinking to talk down the exotic whims of their irrational counterparts, explaining that, say, their superstitions are just excitations of some area of the brain (as if their own rational thinking were not also a product of neuronal excitations). The implication is always that the irrational person is a victim of these excitations and has little or no control over the erratic symptoms. The rational person, on the other hand, is supposedly in complete control, which, of course, is false. Neither the rational nor the irrational person is in control of their neurobehavioral symptoms; both are instead evidence of them. In other words, the rational view is just another perception of matter. The rationalist is just another performer on this stage we call reality.
Rita Carter, in her excellent book Exploring Consciousness, speaks to how the "objective observations" of science are doomed to the same perceptual pitfalls as the rest of us: "Beliefs operate at an unconscious as well as conscious level, so once a person believes something it is impossible for them, however hard they try, to refrain from seeing evidence in the light of that belief. Careful experimental design – double-blind controlled trials and so on – may prevent belief from influencing how results are arrived at, but at some point those results have to be interpreted. There is thus nearly always a crack through which unconscious prejudice might skew things." The inability to remove the "Perceptual Self" from the scientific process is not a flaw of science but rather a symptom of science – and of all other forms of knowledge and information acquisition. In fact, the flaw of science is the hallucination that the Self and all of its perceptual subjectiveness has somehow not made it into its "objective" conclusions.
Later in Carter's book we read that "brain imaging studies show that in schizophrenics, parts of the frontal cortex concerned with constructing the self are underactive during stages of the illness in which the feeling of free will is undermined. Conversely, there is some evidence to suggest that they are overactive in patients with grandiose ideas about their powers of control." In other words, the irrational person with supposedly no control over his or her reality and the rational person with complete control are equally hallucinatory about the level of control they think they have.
In the end, everything we humans think is symptomatic of the irreducible work of neurons. From animal-like instinct to human-like intuition; from abstracted creativity to concrete calculation; and, yes, from the certitudes of religious belief to the convictions of scientific theory, all of it can be pared down to the same neurobiological stuff.
There is no hierarchy that places one mode of thinking above another. If anything, our different modes of thinking are points distributed laterally (not vertically) on a neurological spectrum, much like a color spectrum. This neurological spectrum describes not possibilities of color, but possibilities of consciousness. If we limited our vocabulary to only those terms that are "scientific", we'd be a "monolingual" species restricting itself to one very small band of our neurological spectrum. The poet's prose, the priest's praise, the artist's aesthetics and the scientist's syllogisms are all phenotypic diversities belonging to a continuum of neuronal perspectives that stand on equal footing with one another.