SNIFFCODE.COM


Thinking Digitally

"you are what you think,
I think."
– S.C.

"Call all...

This is our last call before eternal silence."


Those were the last words transmitted by a French Navy operator before Morse code was retired as an official language of maritime communication.

The year was 1999.

When we think of dead languages, we usually don't think 1999 – more like 1999 B.C. Nevertheless, on the cusp of the 21st Century a modern language was laid to rest. Morse is officially a dead language. And yet where would we be without it? Wireless technology still works much the way Guglielmo Marconi planned his radio telegraph system: get rid of the wires and send information over radio waves. Instead of today's 0s and 1s, his system relied on the dots and dashes of Morse code. Standing inside the 170+ years that span the dots, dashes, 0s and 1s, it's difficult to miss the profundity of our communication evolution.
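
To make that comparison concrete, here is a minimal sketch in Python (my own illustration, not anything drawn from Marconi's or Morse's actual systems): the same short message rendered first as Morse dots and dashes, then as the 8-bit patterns of 0s and 1s a modern digital link would carry. The abbreviated Morse table and the function names are assumptions made for the demo.

# A minimal sketch: the same message as dots and dashes, then as 0s and 1s.
# The Morse table is abbreviated to just the letters this demo needs.
MORSE = {"S": "...", "O": "---"}

def to_morse(text: str) -> str:
    """Map each letter to its dot/dash symbol, separated by spaces."""
    return " ".join(MORSE[ch] for ch in text.upper())

def to_bits(text: str) -> str:
    """Map each letter to the 8-bit ASCII pattern a digital link would carry."""
    return " ".join(format(ord(ch), "08b") for ch in text)

if __name__ == "__main__":
    message = "SOS"
    print(to_morse(message))  # ... --- ...
    print(to_bits(message))   # 01010011 01001111 01010011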

The popular consensus is that communication most likely began with a binary code of grunts and growls, punctuated every so often by a swift whack over the head to get another's attention. Needless to say, when someone finally wrote something down, it was nothing shy of a major breakthrough. Back then, communication was mostly visual: point at something or, perhaps, draw something and then point at that. This was the starting point, and from there language bustled its way from the concrete characters of cuneiform to its incarnation of symbolic and phonetic value in Egyptian hieroglyphs. From Egypt's writing system came Proto-Sinaitic, then Proto-Phoenician, then Phoenician. From Phoenician we arrive at the Aramaic scripts, but also at the chain of scripts that brings us to today: Greek, Etruscan, Latin, the Paleo-European scripts and the modern European alphabets.

To Morse.

Transmitting Code/Mind

Morse can be seen, by analogy, as the cuneiform of digital communication. Where we stand today, some distance on from that digital cuneiform, is, say, at hieroglyphics. Some could argue that Morse Code was not a language but, as its name suggests, a code. The final form of all those dots and dashes was the real language – the alphabet. However, the alphabet can also be seen as a type of code. The letter "A" gives us a variety of sounds, but originally it was simply a picture of an ox. Flip the "A" on its head and you can see the last vestige of its pictographic source: the feet become the ox's horns. Just as dots and dashes are symbols for letters, letters are symbols for sounds. Pictographs broke through a communication chrysalis to become letters, and now it seems as if language is ready for its next evolutionary breakthrough, struggling beyond letters to upgrade the human mind to the next mode of communicating: perhaps, a digital language.

But what exactly is digital language?

We know what it means when someone says "I think visually"; it is much how our ancestors thought when they thought pictographically. Today, like the Egyptians, we can think both visually and phonetically. Kind of like how algebra can describe geometry without pictures, our phonetic vocabulary allows us to use a word like "poignant" to elicit visceral, not visual, stimuli.

But what exactly will be going through our minds when we think digitally? What is thinking digitally? That is difficult to answer because we aren't quite thinking digitally yet. Right now we are in an early digital communication civilization. We will have to pass through many proto-digital languages, burying the dead languages like Morse as we move into new incarnations of digital speaking and thinking. For now, thinking digitally probably means, like our Sumerian ancestors, thinking with pictures: with avatars, emoticons, acronyms and other keyboard characters.

WTF

LOL. What a crude symbol, when you think about it.

LMFAO. Even cruder.

WTF... you get the idea.

We can probably find something akin to LOL and LMFAO somewhere on a cave wall. Today we leave these crudities on a Facebook Wall. Paper was the interface for the written word. Today we have the Graphical User Interface. Graphical... in other words, pictographs. Or perhaps, hieroglyphs. Speaking of Hieroglyphs, here's another analogy:

The Greek philosopher Iamblichus, like many Greeks, took a special interest in the language and writing of the Egyptians. He wrote a fascinating manuscript on The Origin of Egyptian Symbols, in which he informs us that the Egyptian priests modeled their hieroglyphs on:


"the productive principles of the universe... and how Nature (the productive principle) makes a likeness of invisible principles through symbols in visible forms"

Egypt's Priests recognized their own language and civilization as being quite advanced, and saw that there were, as Iamblichus described them, "inferior tribes" that lacked this development. Consequently, they devised:


"a mode of initiation into the mysteries (of Nature) which is appropriately concealed in their symbols"

LOL

At the risk of aggrandizing the geeks of our society, I like to think of computer programmers as the "priests" of our digital society who initiate the rest of us into the act of thinking digitally. I make this comparison for two reasons: 1) Geek-speak is often as difficult to decipher as, doubtless, hieroglyphs were to the "inferior tribes". Their sharp, critical minds have a way of making mediocre thinkers scurry away like small animals. They can quote Richard Feynman and Obi-Wan extemporaneously. They can also become noticeably impatient with our obtuseness. 2) Despite this, there's a sense of egalitarianism with these folks. The roundtable for intellectuals used to be very small and exclusive. With the Geek-Priests, not only have the tables turned; they have also enlarged to make room for amateurs and even intellectual tourists. Like the Egyptian priests who, if we are to believe Iamblichus, intended for their symbols to be inherited by future generations, Geeks do much the same thing when they convert all those 0s and 1s into graphical symbols for the laity. Programmers have established for the "inferior tribes" (to which I belong) a repertoire of graphical interfaces (hieroglyphs) as an initiation into eventually thinking the way they do – thinking in code. Thinking digitally.

To support my claim, I'm going to be audacious and turn Iamblichus from Greek philosopher into Geek philosopher. What that means is that I am going to tweak his insights with the addition of a few words: digital, technology and icons. Here goes:

"The productive principles of the digital universe... and how technology makes a likeness of invisible principles through symbols and icons."

See what I mean?

Are you here?

I just pulled all of this out of my ass so don't get too anal about my analogy here. But when we use a historical context such as this one, we can perhaps see that where we stand right now is only the beginning of where we are headed.

Book 1 in our Communication Chronicles was the transmission of information via pictographs, ideographs and phonemes. With the advent of Morse Code we closed Book 1 in our chronicles and began Book 2, which is where we are now, bookmarked at some chapter on smartphones. BTW, we are, appropriately, most likely reading Book 2 on an iPad or Kindle. If we step back and review the changes from Book 1 to Book 2, we see that both the medium and the messaging have evolved. That is to say, both the symbols that we use to embody information and the methods that we use to carry that information have become more abstract with time. So much so that, unlike a papyrus scroll, the "document" that contains this very article you are reading now sort of, well, doesn't really exist. Geeks know that this is only partially true. It exists as information (whatever that is).

Contemplating Mind

They also know that the same thing can be said about me, my computer, the table I'm working at and, well, everything else. In his book Programming the Universe, Seth Lloyd speaks like a true Singularitarian when he writes that "Life, language, human beings, society, culture – all owe their existence to the intrinsic ability of matter and energy to process information." Right as Mr. Lloyd may be, and even though I am constantly reminded that all matter at the atomic level is mostly empty space, I seem to have the uncanny misfortune of always stubbing my exposed toe against that minority part of some living room object that is not empty space but definite matter. The proof is in the profanity: I've invented new expletives confirming that, contrary to what Morpheus said in The Matrix, that *%@#$*!!! table is definitely there. So, yeah, there's something about "me" as information that seems a bit more concrete than this document I'm writing.

But I get the idea.

Nature has created a graphical user interface called matter, and in many instances it is as real and self-ruminating as the stuff of that iconic Thinker statue. And it's there to help me navigate Nature's spaghetti code of information. Humans have done the same to help me navigate the spaghetti code of any Microsoft software. To finally get back on course with my original point (whatever that was), I imagine that Book 3 in our beloved Communication Chronicles will be, as Kurzweil anticipates, the point of convergence between Nature's information and Man's. This would ultimately close the gap between the two, because Nature's information is, effectively, us. We are a cluster of genetic information that has ambitiously created its own cloud of virtual information. The gap between the two is where we disconnect (and eventually we do) from our 1s and 0s... a "gap" that is narrowing with time.

Upload your mind!

In his book World Wide Mind: The Coming Integration of Humanity, Machines and the Internet, author Michael Chorost describes himself as "irreversibly computational" on account of the cochlear implants that compensate for his deafness. To replace the lost inner-ear filaments that transmit sound to the auditory nerves via vibration, Chorost's implants use electrodes to send similar signals to his auditory nerves. He writes: "an external device sitting on my ear picks up sound, digitizes it and radios a stream of 1s and 0s through my skin to a ceramic-encased microchip. The chip receives the radio signal with a tiny antenna and decides how to strobe the electrodes on and off. By choosing which electrodes to fire at any given moment, it makes my auditory nerves transmit sound information to my brain." Transhumanists and Ghost in the Shell fans are probably the first to appreciate the broader implications of Chorost's testimony (and should probably buy his book if they crave more). Such implants are the beginning stages of the aforementioned Point of Convergence. The rest of us, who interact with 1s and 0s through a Graphical User Interface, are a step removed from what Chorost describes. He is integrated with the 1s and 0s. So much so that he has had to relearn how to hear. He writes:
"(The Implants) stimulate the auditory nerves in a way that is quite different than in a normal ear. Because of that, I had to learn how to hear all over again. Voices sound like gibberish at first. It took me months to learn how to interpret the software's representation of vowels and consonants in English."

To be clear, the implanted microchip in Chorost's head was put there by a surgeon with a power drill. As Chorost offers up in a keeper of a one-liner: "Technology advances by integrating." He is not working outside the 1s and 0s, but working directly with them. He is, quite literally, Thinking Digitally, and consequently a step closer to being wholly "inside" the cloud of information.

But the gap still remains. Just as the dream of World Peace and human beings merging together as one is held back by divisive "Us and Them" thinking, the much anticipated Singularity remains postponed by the as-yet-unreconciled dualistic thinking of "Us and I.T." Of course, in the case of the latter, it is not a subjective barrier that impedes us: our Geek Priests literally can't figure out a way to seamlessly merge our thinking with that of our Information Technology. Can't, however, is a strong word and certainly doesn't mean they aren't trying. Oh, they're trying alright...

The Blue Brain Project

The Blue Brain Project (whose official website uses a green color palette) began in 2005 with project leader Henry Markram and entails a team of neuroscientists attempting to reverse engineer the mammalian brain into a working digital counterpart: a super neuroware that would run on a supercomputer. At the start of the 20th Century, man acquired a habit of comparing himself to his machines, with cameras, telephone switchboards and, eventually, computers being the usual metaphoric suspects. All of these metaphors were off.

Delete your Mind!

We now compare a computer's power to a single neuron. As Rita Carter points out in her essay "Brains Within Brains", "neurons used to be thought of as simple on/off switches, but now they appear to be little 'brains' in their own right. Each one may receive information from thousands of others, and instead of merely passing on this information, it sifts and prioritizes it, favoring input from 'reliable' sources, ignoring that from others, and adjusting its output accordingly." She concludes by writing, "The most powerful computer, according to one communications expert, does not have the information-processing skill of a single neuron." The expert who made this claim, however, did so in 1999, six years before the advent of the supercomputers that will ostensibly support all of the brain's countless little "brains". The Blue Brain Project is intended to shed light on the nature of consciousness, but hopefuls believe the project will result in something bigger. If we merge the "stream of 1s and 0s" passing through Chorost's head with a Blue Brain (or Green), we can imagine, with a slight stretch, digitizing a person wholesale and sending that entity into the info-sphere. I call this Project 2501 (with my apologies to Ghost in the Shell), but you can call it The Singularity like everyone else. Whatever the nomenclature, Thinking Digitally would mean to Be Digital. Virtual Technology is Visceral Technology.
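
To picture what that sifting and prioritizing might look like, here is a minimal sketch of Carter's "little brain" neuron in Python. It is my own toy model with invented thresholds, not anything the Blue Brain Project actually runs: each input arrives with a reliability score, unreliable sources are ignored, and the rest are weighted before the neuron decides whether to fire.

# A toy "little brain" neuron: inputs carry a reliability score, unreliable
# sources are ignored, and the rest are weighted before the firing decision.
RELIABILITY_CUTOFF = 0.5   # inputs below this reliability are ignored outright
FIRING_THRESHOLD = 1.0     # weighted input needed before the neuron fires

def neuron_output(inputs: list[tuple[float, float]]) -> float:
    """inputs: (signal, reliability) pairs from upstream neurons.
    Returns 1.0 if the neuron fires, otherwise 0.0."""
    weighted = sum(signal * reliability
                   for signal, reliability in inputs
                   if reliability >= RELIABILITY_CUTOFF)   # sift and prioritize
    return 1.0 if weighted >= FIRING_THRESHOLD else 0.0

if __name__ == "__main__":
    upstream = [(0.9, 0.95), (0.4, 0.8), (1.0, 0.1)]  # the last source is "unreliable"
    print(neuron_output(upstream))  # 1.0: the reliable inputs push it over threshold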

The motivations for a Singularity are as varied as the seemingly reckless anticipation for it. For some it is simply a new frontier to be explored. For others, however, it is the long-sought-after prospect of escaping the annoyance of aging and the permanence of death. But if this evolution of existence is really a part of the Communication Chronicles – where we go from using language to transmit information to becoming the language and information – then we may be no better protected from the dreaded eventuality of death and dying. Every language is subject to the cycles of life, death and, eventually, extinction. A language becomes extinct when its people do. With digital languages it may be the other way around: we become extinct when the language does. Think Morse. And now imagine the fate of any person short-sighted enough to have "merged" with it. Some might say that the solution is upgrades, but believe me when I say that I have a box full of three-inch "floppy" discs and even zip discs that will never see such a promised land. Sure, the information on them still lives (if you can call that living), but to me it seems like the digital equivalent of being condemned to a seniors' home. Even if I were to unpack these discs, excavating the data would still be troublesome, since the programs used to create them, and the computers compatible with said programs, are all now extinct.

Information programmed us all to die and, alas, even digital death may be unavoidable. So long as there is a delete key – or simply a delete function – there is always the looming possibility of, as our Morse-Marine put it so poetically, the "eternal silence."