Worldblindness

The All Too Human Disorder

It is like a pair of glasses on our nose through which we see whatever we look at. It never occurs to us to take them off.
– Ludwig Wittgenstein

Professor Simon Baron-Cohen has become these days the prominent face behind autism research and scholarship, and he has a very pleasant face. It is a face expressing warmth, friendliness, curiosity, intelligence, ambition and a deep willingness to be of service to others. It is a face expressing pride, and has profound reason to express pride: Dr. Baron-Cohen’s numerous books, articles and research reports have opened a veritable cornucopia of alternative theories and lines of inquiry into the workings of autistic minds. His skill as an experimental scientist was prominently on display from the very beginning, when, as a graduate student at the University of London, at a time when autism research was badly in need of a scientific antidote to the fuzzy psychological theorizing of the day, he, along with colleagues Uta Frith and Alan M. Leslie, helped construct, perform and popularize the now legendary Sally-Anne test. That test, a meticulously crafted architecture of parameters, controls and experimental technique, has helped demonstrate conclusively that autistics fundamentally process information in an entirely different manner from the rest of humankind. Autistics think differently than non-autistics—the Sally-Anne test has left little room for doubt about that.

In attempting to characterize the disorder they believed their experiment was crystallizing, Baron-Cohen and his colleagues employed the phrase theory of mind deficit, a depiction intended to highlight how autistics are disabled or delayed in their ability to ascribe thoughts, beliefs and deception—the workings of mind, in other words—to other humans and even to themselves. That phrase, however, proved over time awkward to use and was sometimes misinterpreted, so Baron-Cohen later coined the word mindblindness as an alternative designation for the underlying deficit. Mindblindness is in many ways an effective choice. It captures something fundamental about the condition and also about the challenge developing autistics will face as they begin realizing how other humans are both crucial to their biological well-being and also utterly different in how they perceive the world. As strange as this may sound, autistics effectively mature by attaching themselves to their own species, by learning how to perceive their environment in a manner similar to that of the other Homo sapiens they live, play and work with. That process is not easy for autistics; it involves assimilating a broad assortment of techniques associated with species awareness, species imitation, social ranking, aggression, cooperation, sex and so on, a set of behaviors and mutual cohesion forged in the unfathomably slow furnace of evolutionary time and implanted so firmly inside the instincts of neurotypical humans they do not give it a second thought—in fact, it is the basis of human thought, the foundation of what we call human mind—and yet all this remains remarkably foreign to autistics. The tremendous struggle autistics must wage to acquire through effort what comes naturally to the others is evidenced by the discomforting fact that a number of autistics make only limited progress. In effect, those we might describe as being the most severely autistic are those who remain the most detached from their species.

Nonetheless, and perhaps counter to the prevailing wisdom, severe detachment is more the exception than the rule. Our growing awareness of autism’s prevalence and its genetic underpinnings makes it clear that most autistics do learn to become part of and to fit into their human surroundings, both in perception and behavior, and this has been quietly taking place for a very long time. Autistic attachment was undoubtedly happening as biologically modern humans were spreading outwards from the African hills and savannahs into Europe, the Middle East, Asia and beyond. It was happening in early settlements along the Euphrates and Tigris rivers, near the banks of the Nile, and at the edges of the Indus and Yellow too. It was taking place across the plains of Sparta and around the bustle of the Athenian agora, and it was most certainly happening under the shadows of the Himalayas and along dusty roads connecting Nazareth and Jerusalem. It was happening within pockets of medieval enclaves and under the ceiling of the Sistine Chapel. It was happening famously in many great European cities—Florence, Vienna, London, Dublin, Copenhagen—and it was happening in the virgin woodlands of the New World, most notably near the shores of solitary ponds. Autistic attachment has been taking place and continues to take place all around us, and its impact has not been benign. Although we remain uncertain exactly how and when autism first made its appearance within the species man, its thin yet stubborn thread has been stitching a far-reaching pattern into the many corners of our known history, helping shape a stunning transformation we seem only now to be slowly awakening to.

Autistic attachment has been happening also at the great universities: Göttingen, Paris, Edinburgh, Princeton, Cambridge—frequently and significantly at Cambridge—but perhaps not so much nowadays. The great universities were once the enabling centers of mankind’s explosive advancement in spatial and temporal knowledge, a widening gaze into the richness of our universe that found expression in such myriad forms—literature, theology, science, mathematics, art, music, philosophy. And like their monastic precursors, the great universities were also the occasional provider of refuge or the lone receptive audience to those few strange souls utterly lost in their dealings with the everyday world, but ironically enough having something of importance to say about that world. Alas, it appears to be no longer so. Mostly gone are those days of the disheveled, half-mad hall wanderer, enigma to his colleagues but sudden light to the world. Today’s universities are mostly a business enterprise, the transitory home to paying thousands, and to maintain respectability each institution prides itself on, and touts itself as, a foremost center of scholarship and research—legitimate sounding enough, to be sure, but at heart little more than a packaging of safe attitudes and acceptable procedures designed to keep the enterprise from wandering too far off course. Today’s most highly regarded academicians are also the great popularizers, the widely appealing personalities such as Dawkins, Pinker, Diamond and Dennett, those who can mix and match pieces and remainders of the most ancient hypotheses and the latest buzz, sprinkle all with enough spoonfuls of linguistic sugar to help the hard thoughts go down, and deliver finally a plausible-sounding stew of ideas and theories cooked up from seemingly every officially stamped and over-researched ingredient ever stocked by scholarly man, and failing in just that one spice only a few in the know would ever miss—vision. As centers of personal advancement, today’s universities now excel. But as the enablers of mankind’s advancement—well, that has mostly moved on.

Professor Simon Baron-Cohen is indeed today the most prominent face behind autism research and scholarship, but his is not the face of autism—far from it. In perhaps one of the greater ironies of modern academic science, Dr. Baron-Cohen more accurately reflects and represents autism’s inverse image. With his research, writings and theorizing drawing deeply upon the foundations of scientific and other world-altering revolutions, and with his prominence and appointments placing him in close contact with nearly every known experimental fact and genetic detail regarding the subject of autism, Baron-Cohen stands within the swirling center of an array of forces coalescing into a major key for comprehending mankind’s amazing transformational history—in fact, he practically holds that key—and yet he does not see it. Perhaps distracted by the politics and prestige that now attach to modern scholarship and academic office, and certainly overwhelmed inside his flood of charts, graphs and endless statistics that serve as today’s only acceptable currency for backing theories on the nature of autistic thought, Baron-Cohen fails to apprehend that his truly ingenious Sally-Anne test speaks volumes about non-autistic cognition as well. Non-autistics experience a condition I would label as worldblindness, an innate difficulty in stepping far enough outside of personal, species-driven mind to see clearly the forces creating, shaping and enveloping that mind. Much like mindblindness, worldblindness too is a condition that can be compensated for and overcome, and has been overcome in large degree by mankind as a whole. But here too the tremendous struggle involved is evidenced by the fact many individuals make only limited progress. In effect, those we might describe as being the most severely non-autistic are those who remain the most detached from their surrounding world.

Of course, I realize to many this must all sound like the ravings of a half-mad hall wanderer. While it is true that all generations have found the broader, more daring visions easier to explore in hindsight than in the current moment, in this era and in this culture we seem to have developed some particularly crippling ways to remain trapped inside our immediacy. The academic institutions have now bogged down from the same dogmatic and pecuniary tendencies as befell their ancestor, the medieval church. The less sheltered world of the everyday man, recipient of late of so many material benefits and life-increasing forces, seems to show little interest in employing the dividends from those benefits to explore the nature of their source, and thus has settled into the dubious comfort of an uninspired obesity. We celebrate our glory as man without recalling that a mere moment ago on the cosmological scale there was no glory at all, and we honor the supposedly unique qualities of our human brain as though wisdom were to be plumbed from a mass of neurons. As has happened so many times in our past, we remain reluctant to go outside ourselves, hesitant to look beyond what we think we already know, and thus it is we remain so agreeably content inside our tenures and our comfortably appointed offices, and offer up as fresh knowledge the burgeoning awareness of a modular, experimentally probeable, statistically averaged Homo sapiens mind—an offer which indeed is human, all too human.

* * *

In taking a careful look around us, we see first that the rewards of science, technology and mathematics now thoroughly dominate our human landscape. The cascade of knowledge we have gained regarding our physical world has allowed us to blast a path far beyond the conquest of such immediate needs as food, shelter, warmth and health, and has opened to us banquets of materiality and realms of understanding so vast as to be dizzying. Thus we might be forgiven for so easily forgetting how vastly different our image of the physical world was just a few hundred—not to mention a few thousand—years ago. How today can we accurately recall, how are we to truly see, that our not-so-distant forebears once woke to a sun circling a course around them, slept under stars churning on nearby, intricate spheres, and touched with their hands the fallout constituents of earth, water, air and fire? As quaint as those notions must sound to us now, they were brilliant perceptions compared to the viewpoints of more ancient times. It would be nearly impossible to depict the natural world as it must have appeared through the eyes of Cro-Magnon humans, for instance, although if we could form a guess as to the nature of the gaze of apes such as chimpanzees and bonobos, we would have a fairly close estimate.

Human history—extremely recent human history by the cosmological scale—has been the kaleidoscopic drama of man continuously augmenting and re-arranging the internal image of his external world, with the pace of this transformation accelerating of late to the point of breathlessness. Those scientists prone to marvel over the supposedly specialized skills of the human brain, intimating that the core power of our cognitive abilities has been within us almost all along, are surely betraying a particularly poor sense of the flow and reach of time. Nearly all the mathematical, scientific and logical gains we humans are apt to attribute to our pragmatic, modular minds have been the by-products of just the last several generations. By contrast, that filled-with-potential Homo sapiens brain of just a few thousand years ago, despite being physically constructed exactly like our own, had nary a clue. Thus it is I would like to suggest science and mathematics are best described not as the story of mankind using its prodigious mental skills to discover and explain the nature of the external world. Far better is to reserve the names of those disciplines for the evidence that the external world has been thrusting its structure and form back upon us, with each delivery building ever more rapidly upon foundations previously sent. The only marvel here is the unusual nature of the delivery mechanism.

Consider what we know of the lives of those individuals who have most opened our eyes to the compelling visions of science and mathematics—individuals such as Archimedes, Da Vinci, Newton, Gauss, Darwin, Edison, Einstein and Turing. This is not an ordinary collection of men, and I do not mean that in the sense they are extraordinary by virtue of their achievements or discoveries. This collection is remarkable in that it is composed almost universally of individuals who have held an uneasy attachment to their fellow humanity. Here we find dreamers, late talkers, loners, grumps, misfits, obsessives, social bumblers and the occasional pariah to country and neighbor. What we do not find is the colleague who will give you the hearty hail down the hallway and buttonhole you to ask in undertones what you thought of the dean’s behavior at the previous night’s cocktail party. No, almost to a man they are nothing like that. These men thought differently—literally thought differently—than the majority of their contemporaries, and it is this strangeness of perception that accounts for both their ongoing awkwardness with their fellow man and their powerful receptiveness to the previously unseen form and structure of the natural world.

I realize we have not customarily regarded scientific genius in quite this fashion. Even those able to recognize the quirkiness and the social fallibilities that often accompany our mathematical and scientific giants are apt to attribute such characteristics to the pressures and conditions of genius itself, never pausing to consider whether this might be a confusion of cause and effect. And as for the broader community, the confusion there tends to run much deeper, for in that shifting arena genius is mostly a matter of acclamation, the need to crown the man more than the inspiration. Consider, for instance, that undisputed scientific icon of the twentieth century, Albert Einstein, mostly unknown and sometimes scoffed at during the years of his best work, but today universally proclaimed the foremost scientific mind of the modern era because—well, because everyone says so, it is acknowledged by all. Acknowledged by all. Such a revealing phrase, considering that the all knows next to nothing about relativity or the photoelectric effect or Brownian motion, with even many of today’s most capable physicists—relying more upon mathematical models, textbook analogies and chalkboard explanations—able to glean little from Einstein’s original, childlike visions into time, space, matter and energy.

Even on those rare occasions the masses do place more emphasis on the discovery than on the discoverer, this would seem to be motivated by the discovery’s ability to at least temporarily flatter us, perhaps never more so than with that other great twentieth century icon, the computerized machine. So alluring has this concept been as an explanation for human intelligence that today’s cognitive and neuroscientists have nearly trampled themselves in the mad rush to apply its wired mechanics to the workings of our own supposedly prodigious biochemical minds, and in their haste have somehow overlooked the fact that in the workings of Turing’s superbly simple model, the intelligence is on the tape.

Of course, to bring up the names and lives of individuals such as Einstein, Turing and all the rest seems as quaint today as talk of celestial spheres and the four classical elements. Science and its related disciplines have become in this culture so legitimized, so prestigious and so dominant they are no longer the realms for paradoxical inspiration within the solitary individual—they are instead the employment for millions. Ever-larger teams of researchers now generate the vast majority of our new discoveries, with the immense volume of their published work compensated for only by its inexorable drive towards minutiae. It seems as though we are no longer making ourselves receptive to the nature of the external world so much as we are competing to announce its leftover nooks and crannies. Successful scientific and mathematical research these days is measured by its ability to generate fresh headlines and win additional grants. We find ourselves fighting for remnants of a scientific glory as though we do not recognize crowdedness has mostly destroyed that glory, and out of vague fears we might now be treading the byways of the irrelevant, we find ourselves insisting ever more stridently, ever more dogmatically, on the sole priority of scientific method. But this is just another form of blindness, a form of worldblindness. All endeavors, even those scientific and mathematical, will degenerate without the influx of broader, more revolutionary vision. In the flush of our current technological successes we have been ignoring this inevitability, far too willing these days to rest content upon the laurels of our numerous proofs and our well-funded experiments, evidence we take as conclusive for the progressive nature of a scientific, logical and mathematical mind—a theory which indeed is human, all too human.



The two disciplines of science and philosophy would appear these days to be so distinct from each other one might hardly suspect they must have originated from the same upwards gaze. But picture those humans who were the first to receive the vague impressions of pattern and repetition from the lights of the surrounding sky, and ask yourself what might have compelled them to break their solitude and attempt to convey to the others the impact of those impressions. From what history has left to us from the earliest known philosophers—who of course came along much, much later—we find them still as struck and awed by the nature of stars, number and substance as by the nature of themselves. This concentration that runs outwards through the expanses of time and space, here is where we can find the beginnings of astronomy, mathematics, chemistry and all the other sciences. And it is from the reversed direction of that gaze, the intense gathering of such abundance into the singularity of a particular point in time and space—the singularity of the individual—here is where we can find the origins of philosophy.

Or can we? I am aware I must be attempting a kind of literary trompe l’oeil with such a description, for a survey of philosophy as we might learn it today in our schools would reveal nothing of the subject I have attempted to picture here. Blessed with the benefit of perfect hindsight, we know full well today the value that would arise out of those first skyward stares, so it cannot really surprise us to realize that when man would finally forge the languaged means to diffuse communally the strange new concepts, he would not long afterwards cast thought and knowledge themselves as subjects worthy of mastery, control and pay. From the sophists of ancient Greece all the way through today’s government-funded think tanks and well-endowed professorial chairs, we see the abundant evidence that this alternative, more businesslike approach to the study of human thought has been continuously upstaging and usurping the original. Today it seems we can no longer afford to remain content with mere experience and impression alone. Today we must consult the system, follow all the processes, navigate the mirrored intricacies of synthesis and analysis, and above all else we must repeat as often as possible the ever-multiplying layers of classification upon classification—the innumerable slices, dices and reconstitutions marketable to almost any purpose, the very reason itself behind mankind’s arbitrary division of its endeavors into such a motley assortment of seemingly separate disciplines. So cataloged, so multivariate has our knowledge base become today it is now the larcenous target of a sleek new breed of philosopher, the kind who can step forth onto our world-historical stage and proclaim all words and concepts so overripe with meaning they might just as well be assigned a random meaning, and then proceed to make a prolific and profitable career out of doing precisely that.

Am I sounding a little too peevish? By no means do I intend to ridicule what has been truly valuable from this long, much-honored and very worldly branch of philosophical tradition. It has served tremendous and constructive purpose inside cultures that might hardly have been expected to know much better, and it has prompted breathtakingly brilliant assemblies from many of our history’s most talented intellectual giants. Where might we be after all without Aristotle’s massive reorganizations, Descartes’ cool rationalities, or Kant’s thorough critiques? No, I would not for the life of me speak out against what has proven best about this fine tradition, but sometimes much oppressed under the weight of what has been not so fine about it, I wish to ask but one simple question. Whatever happened to those first individuals, the ones with their eyes transfixed on what we now call the heavens?

Consider that line of thinkers we might best represent with the names Socrates, Thoreau, Kierkegaard, Nietzsche and Wittgenstein. Here again we find a surprisingly consistent and barely species-attached collection of men, for in addition to the designations applied to the prior list of scientists and mathematicians, we can now add those of madman, idler, blasphemer and corrupter of youth. These are men who often felt compelled to live apart—even made experiments out of living apart—and their combined histories of failed engagement proposals and one bitter marriage might serve as testimony enough to their inherent social awkwardness. By the customary standards that hold sway within human society—wealth, friendship, community, lineage—these individuals could in no way be described as successful, even their now famous words having produced little of impact during the course of their lifetimes, unless of course we wish to count that one case where such words earned a sentence of death. And it must strike us as somewhat puzzling now that such words have indeed become famous, for the utterances from this breed of philosophers are nothing like those of the respectable academician, or even the worldly wise old man. Instead we find here an odd assortment of loosely organized aphorisms, fanciful flights, cryptic remarks, bombastic pronouncements and highly defensive barbs—more like the stutterings of an adolescent in the grip of some kind of language impairment, or perhaps more like the breathless stammerings of a child who has just run in from the far distant field and cannot pull himself together quickly enough to describe what he has just seen. From the quietly desperate perspective of our day-to-day—too busy now to get to know ourselves well, and too exhausted to attempt the leap toward the Übermensch—it would seem there is scarcely time enough or reason enough to heed the words of those least capable of seeing and toiling as the rest of us do. So why is it that we can still find ourselves so transfixed, transfixed now on the stutterings and stammerings that strike us, as the stars once did, as both bewildering and beguiling?

Of course, my literary deception must surely be going too far to suggest the persisting impact of rogue thinkers the likes of Socrates and Wittgenstein. Philosophy today has settled into being far too worthy and far too dignified a profession to tolerate still more of that kind of thing. The current tradition is safely guarded, tucked away now under the mindful care of those most thoroughly schooled in its most practicable arts. Do you not believe me? Ask any graduate student, the ones chasing their degree and eager for the most sought-after positions in the academic world: of utmost importance these days is to align yourself with the prevailing camp, the most recently published book and the latest reigning ism, and God help you should you find yourself on the outside looking in, just one poor soul left to fend for yourself, no suitable letters of recommendation firmly in hand. Perhaps not quite as touched these days by the mass popularity science and technology currently enjoy, philosophy compensates by building for itself an exclusive and posh neighborhood, topped off with ivory tower and fenced in by a seemingly endless array of dense sentences and jargon-strewn paragraphs—bewildering yes, but beguiling to absolutely no one. All this is just another form of blindness, justification for worldblindness. In truth, we are deluded to regard the finest in human thought as something hidden or overly complex or accessible only to the greater intellect, the highest cognitive bidder. What is finest in human thought exists all around us, lives and grows within an environment we have all helped to build, that we all share. Those who would most loudly proclaim the superiority of the most superior thinking animal are only desperate to claim that top billing for themselves, and thus it is that despite our rapidly expanding cognitive awareness we find ourselves still overlooking what exists right before our very eyes, lost inside the reveries that exalt the virtues of an ethically noble, linguistically complex and oh-so-thoughtful mind—an idea which indeed is human, all too human.



All hail the artist, for the artist widens a path for us all. The earliest stargazers—their halting gestures too strange and foreign, their motives too vague and suspect—would not by themselves have been persuasive enough to urge the huddled bands of kinsmen into a broader acceptance of previously unseen worlds. What was needed here were the magical and enticing tones of a Pied Piper’s flute, the inducements towards the first steps of a long, snaking and rhythmic dance that would pull each generation ever more teasingly, ever more inexorably away from an animal existence until then so deeply and forever known. By no mere coincidence, as humans would begin forging the languaged means to diffuse communally the strange new concepts, they would begin also strewing their expanding paths with beads, cave paintings, grave markers, drums and chant. This incense-sweet smoldering accompanying the transformation of the world’s eternal structure into forms immediately pleasurable to see, hear and hold would inevitably grow one day into a fire too intense to be contained any longer. We should not forget that the Greeks were set ablaze first and foremost by the sung words of a wandering poet, his listeners’ ears made at last too full with the vividness of flaming towers, seething jealousies and rosy-fingered dawns. Not the sciences and philosophy alone were seen now more clearly in that glowing light—music too, architecture, history, drama, sculpture, and of course, poetry itself. Those classical flames, feeding upon themselves, would eventually burn out, but on their ash heaps we blaze today yet once again, the current conflagration now five hundred years into its torrid running, with nearly every corner of our massive planet now seared and charred. Every direction we turn: stories, instruments, gardens, games, costumes, symphonies, skyscrapers, tapestries, dance, jewelry, bell towers, cuisine, fountains and song. And these are not the playthings of just the select few, they are the beckoning signposts suggesting a direction forward to us all. That for nearly every human alive today the memories of animal existence appear to be so remote—despite being so remarkably near in actual time—we owe to the charms of the artist as much as to anything else.

Spreading flames, however, carry inherent danger. Although the origins of art have always been inspired and revolutionary, the vast majority of art by necessity must remain imitative, fueled by our instinctive gregariousness and behavioral copying. There is nothing to be criticized in this replicating arrangement—its regenerative value is beneficial to everyone, and it provides a means for bridging the long gap between the unique visions of solitary individuals and the common understandings so necessary in a society of many. But art’s widespread accessibility and reproducibility also make it vulnerable to the ravages of envy, pilferage, sown conflicts and mass confusion, for once the true artist has been hailed by the many, it is only a matter of time before the many will want to be hailed as the true artist.

Today’s universities, workshops and other art institutions churn out each year literally thousands upon thousands of new novelists, painters, architects, composers, sculptors, playwrights, critics, designers, and of course, more poets, the finest of these slated to do their greatest work in—the universities, workshops and other art institutions. If today’s academic factories were able to be honest with themselves and acknowledge that these mass productions align almost without exception with the imitative aspect of art’s long history, then perhaps there would be little harm in these assemblies of assembly lines. But today’s academic factories, too hungry for the wherewithal to keep the assembly lines running, are not quite able to be so honest with themselves, and thus continue to market a veneered fiction that their classroom products go forth not only degreed, but also inspired and revolutionary. They do not go forth so inspired, and trust me, having been one of those classroom products myself, I am uncomfortably well positioned to say so. These self-lauding programs designed to perpetuate a self-recognized glory result in exactly the kind of art one might expect from a tinsel-bedecked stampede—excessively self-conscious, focused entirely on style and technique, and ending finally in a veritable stink fest of art for art’s sake. For all the countless creative births our academic institutions now serve as midwife to, the barrenness of the results has never been more evident.

For some much needed contrast and relief, let us consider instead a set of non-degreed artists: Michelangelo, Blake, Beethoven, Van Gogh, Tolstoy, Dostoyevsky. We need hardly mention we have gathered here yet once again another bad-tempered, standoffish and slightly deranged collection of men—that theme by now is quite recognizable, its dissonant strains more than a little familiar. Artistic temperament has not always been something easily feigned and staged by our modern flashes in the pan; a glance through the biographies of men like these reveals them as genuinely frustrated, sullen and vexed—and not just in their artistic endeavors but indeed from straight out of the cradle. Their rudeness is a talent almost too natural, their lack of regard a little too easily gained, and in the brazen opening chords of an heroic Third, across arson-savaged nighttime skies, and throughout that most insolent ceremony of them all, an unholy union of heaven and hell, we discover that the truly inspired artist snubs all form of manner, curses all degree of convention, and introduces the power and beauty from the next world into the home of this one by first crashing down the door. As already stated, the vast majority of a culture’s art need not be quite so creative, and therefore not quite so blatantly destructive—art’s imitative aspect allows room enough for a broad range of practitioners, providing perhaps mankind’s most effective forum for mixing together its diversely productive traits. But at that first setting of the blaze, at the initial spark against the flint, one does not find a cadre of workshop wonders standing about. Artistic fires originate from a source we do not easily recognize and are channeled through individuals we do not at first accurately perceive, the ones who are mistrustful, petulant, sensory overloaded, the ones who have been building a lifetime’s capacity to struggle and to struggle well—artists such as Michelangelo, Blake, Beethoven, Van Gogh, Tolstoy, Dostoyevsky.

Of course, now that our palettes are so richly vibrant with fellowships, editorships, associate professorships, now that our trained artists are so well versed in stipends, grants, prizes and offices of the laureate, surely we can write the final chapter on this theme called struggle, and declare all insolent revolution to be a technique gone completely out of style. No, wait—that last sentence does not quite belong here, does it? The modern artist, I hear, is exceedingly capable of waxing eloquent and poetic when a most sorrowful song is needed: that is him there at the bursar’s window, crooning over his meager share of the endowment funding, his crowded place at the public trough, and the barely adequate living conditions at the local college. Is it any wonder a larger community at last turns disgustedly away, content to settle for what might be called some mindless entertainment? Well, that too is a spreading danger, but at least there might still be felt something of the warmth from once-spectacular fires, at least there might still be found strewn about glowing embers waiting to catch and blaze yet once again. Artists who prattle endlessly to one another do not see what they are called to see—they are blind, worldblind. The genuine artist can conjure up images of that same expansive universe as does the philosopher or the scientist, and paint its truth and beauty in shades making it plain for all to see that the direction forward is indeed more desirable than the direction back. The current fashion for self-indulgence has left us wandering aimlessly and tunelessly about, and thus it is that despite art’s ubiquitous and burning reach, we find ourselves still too lost beneath the darkening spell of those who would take their craft too lightly and themselves far too seriously, the fame-seeking templates of a talent-enabled, patron-worthy and self-reflecting mind—a creation which indeed is human, all too human.



There is no world but the world, and all can be its prophet. From out of a dreamlike biological existence we can no longer quite recall, humans have found themselves suddenly awakened to an expanding awareness of space, time, substance and form, and naturally enough have been experiencing fright and confusion as much as they have awe. As useful as disciplines such as science, art and philosophy have been in helping mankind come to terms with this fast-unfolding drama, there have always been those who have preferred the more direct approach to grappling with the unknown. As best we can tell today, man early on approached these strange new phenomena by animating them first with spirit and shape already familiar enough—ancestor and animal, friend and foe—and society began crystallizing its strange new knowledge by relying hard upon custom and ritual, an instinctive choice for a species already bound to the safety of family and neighbor. In our religions today we still carry something of the essence from these earliest attempts to grasp and know, in our inclination to cast our gods in man’s image—the father, king and watchmaker—and in our preference to worship by congregation and sanctuary, in the comfort still to be found in the greater numbers. Nonetheless, to survey the tattered landscape of our religious history, it would seem such huddled relief has never been able to hold for very long. Forces have always called us back out of ourselves and made us curse what we had once sworn we knew, and always someone has been working remarkably nearby, attempting to pry the human eye open yet a little wider. Rituals unravel, customs finally crumble. And as our deities have become ever more sophisticated, and as our congregations have grown ever more crowded, the world has just kept on unveiling itself anew, its human-beckoning dance growing into an ever-faster, ever-widening gyre, and indeed it must seem as though the center cannot hold.

Spirituality has become today like two sides of one coin, attached but irreconcilable, and we should make no bones about the fact we as a species face in opposite directions. Established religion, conventional thought—they serve a purpose and have their place: they are the counterweight to strained, advancing edges, they provide grounding to those who might otherwise take too recklessly to flight, and in gathering up the mass of men into communities of teeming will, they power a means whereby human progress can be made all the more inclusive. But when from out of willful ignorance convention digs its heels in too deeply, it trips the course of a delicate, intertwining step. And when from out of excessive fear religion glances forever backwards, it drags the species towards a time and place in which there were no gods at all. In branding as heretics, atheists and blasphemers the ones who would dare to look anew and yet a little further on, we fail to see that in a forwards-stepping human dance, it is the role of teeming crowds to be the partner and to follow, and the task of strange, unique and lonesome souls to be the ones who lead.

Let us contemplate yet one final list of men, those we most commonly revere as having brought to us our truer visions of God—Luther, Confucius, the Hebrew prophets, Buddha, Muhammad, Christ. But tell me first, just exactly whom have we come to these deserts to see? Are we seeking here the brightly clad, the fortune blessed, the most powerful leaders who once walked—and owned—our lands, and were we brought here by way of an assuring word that these indeed were the kind most acknowledged and acclaimed by all? I have heard it said we must seek such attributes someplace else—in gleaming palaces on top of fortressed hills, or perhaps in some high-tabled dining halls—for in this list of men being contemplated here I see only the attributes we have already well discussed: social isolation, verbal incomprehension, strange obsessions, and at last, a bitter clashing with the community of men. What worthy religion has been founded yet that did not begin inside the lonesomeness of a cave, or the isolation of the desert wilderness, or the solitude beneath the peepul tree? Has a sermon ever been heaven blessed that did not at first befuddle—the roiling recitations, analects so cryptic, and these theses nailed to the door? And what humanity-healing balm has yet been crafted that did not arise from behaviors hard to fathom, from asceticism on a grain of rice to self-coverings of dust and dung? These men were something other than human, I think their compatriots would have been the first to say, not meaning that in any supernatural sense, only that these men were not quite normal, did not think and act, oh my sapient brother, did not think and act like you and me.

I realize we have not customarily regarded the sacred names in quite this fashion. Perhaps it is because we are forever viewing these men with such perfect hindsight that we continue to see them so poorly. When millions can overflow St. Peter’s Square, when the Hajj tents stretch on for mile after mile, when the venerable temples teem from wall to altar wall, what earthly reason might we have to doubt the appeal of messiahs, saviors and prophets, and how can we question the popularity of men so widely hailed and hallowed? What might strike us, however, as we survey this modernly pious planet, is how incredibly little change has taken place from a formerly pious time. For if God were to send representatives again, if the universe beckoned directly, I am certain there would still be many—very many, a majority in fact—who if given this chance would answer that call themselves by bearing witness to a familiar duty. Drawn by the promise of ancient hypotheses and the latest buzz, so many would come to see and hear once more, would follow again from place to rocky place, and if hesitant for just a moment, as though not catching the words quite right, one glance at the size of the burgeoning crowd, just one nod from a trusted authority, would be enough to guide them to their appointed place and there help them stand their ground. Self-assuredly shoulder-to-shoulder, their sanctuary now complete, all could join that enlightened chorus and resume its ancient cry, “Barabbas! Barabbas! Barabbas!”

Of course, I must be committing a blasphemy to claim we have not today learned better. Our colleges are now flush with honored theologians, their learned words bulge our crowded shelves. The pulpits ring with polished instruction, and all is denominationally pure. Assuring family and knowing neighbor fill out the spaces all around us, and everyone can speak the tongue, it seems, of a confident, assembled glory. Still, I wonder what comfort we will take in the days ahead from having recalled and worshipped the resurrected prophets, when it is the stone we are casting at the present saint that is making a martyr of our future. God’s representatives do dance all around us, the world currently beckons at the tip of our finger. So much certainty attached to over-gilded faiths, such reliance upon the overflowing coffers, too much prayer directed to declaring war as holy—these are making us forget to build our heaven right here, and thus we are getting stuck in a living hell. That beam is still within our eye, my friend—our worldblind eye. If the sacred texts are to have taught us anything, it is that God does not speak through the gathering crowd, but only through those souls who have learned to bear an awakening both paradoxical and isolating. On this planet, the world is attempting to come to consciousness through the means of our own human biology, and each individual—each potential prophet—has the freedom to assume that duty. In the all-too-common clamoring to receive blessings not yet fairly earned, we have too often ignored these struggling, striving souls, or worse, trampled them underfoot, and thus it is we fail to hear their uplifting words, we lose their saving grace, too numbed by our mumbled, buzzing, repeating words rattling about inside a limited skull, “Oh, Lord, do not forget us at that last dread hour, for we have so many times made answer to you with our pre-anointed, destiny-favored and already godlike mind”—a prayer which indeed is human, all too human.

* * *

For more than sixty years—since Leo Kanner and Hans Asperger first described a set of developmental characteristics each would label after the Greek word for “self”—autism’s unveiling and increasingly public visage has not been viewed as a pretty sight. The condition has invoked a profound sense of fear in many, for in addition to its three universally recognized diagnostic symptoms—difficulties with socialization, language delays and peculiarities, and atypically patterned behaviors and interests—autism has also been linked to a long list of unsettling characteristics suggesting inevitable tragedy behind its perceived human impact: muteness, self-injury, mental retardation, severe emotional detachment, required institutionalization. Along with the related disorders of schizophrenia and manic depression, autism has been bringing into prominence the severe burdens anticipated with an abnormal psychology, and has cast into sharper societal focus the immense challenges raised by a mental process ranging far outside the human standard. Furthermore, autism’s specter has been made all the more haunting by the fact it is applied first and foremost to our society’s most cherished members—its children—and by the realization that autism’s discovery has come seemingly just in time for its prevalence to explode suddenly out of control. To read the many alarming accounts and horrifying descriptions being put forth by politicians and researchers alike, one would have to assume autism is indeed a medical monster set loose upon our land, damaging the brains of infants, destroying the joys of toddlers, stunting the intellects of adolescents, and leaving behind a trail of adult-like shells housing the person who might have been. This looming, growing, grotesque face of autism must surely be the most frightening nightmare now walking the human landscape—the condition’s devastating impact, unquestioned and asserted by so many, seems to have left no room for doubt about that.

In attempting to characterize autism’s underlying cause and pave the way to finding an eradicating cure, scientists have embarked upon a feverish quest for that holy grail most dearly treasured—the elusive, explanatory detail lying at the heart of autism’s puzzling pathology. Embarrassed by a self-serving history of easily debunked notions such as refrigerator mothers and vaccine-induced poisonings, autism researchers these days direct their chase more cranially inwards, into the hunting grounds deemed safer and more acceptable—the organic deficits just waiting to be discovered within the flawed mechanics of the autistic mind. Brain tissue damage. Synaptic breakdowns. Neuronal glitches. Genetic defects. Just one glance through the reams of modern medical literature is more than adequate to reveal a veritable cornucopia of likely candidates behind autism’s dreaded scourge: excessive prenatal testosterone, faulty mirror neurons, too much white matter, too little white matter, insufficient blood in the frontal lobe, low amygdala density, mutant PTEN genes, mutant neuroligin proteins, anomalies on chromosome 2, anomalies on chromosome 4, anomalies on chromosomes 7, 11, 15, 17 and perhaps a dozen others, unregulated glutamate neurotransmissions, some missing Purkinje cells, axons both under- and over-pruned, a way-too-dainty fusiform gyrus, and doubtlessly some further discoveries to be written up next week. It would seem the psychiatric technique of tracing a mental aberration back to a specific biological cause has found its apotheosis in the fertile field of autism, for here we have stumbled upon a hundredfold flowering of the technique’s explanatory effect, autism’s complexity now matched perfectly by the many-tentacled nature of its presumed genesis. In effect, modern cognitive science has been overwhelmingly verifying the suspicions we have already had—that autism is indeed the most hellacious disease ever to have cursed our human species—for what other malady could possibly lay claim to having been spawned by so many devilish details?

Nonetheless—and clearly counter to the prevailing wisdom—autism’s mystery will not be solved within the confines of a medical laboratory, its etiology cannot be explained by its most narrowly defined detail. It is not our understanding, or even our lack of understanding, of autism’s neurobiology that compels this contrarian view; rather, it is just one glance through the history of human progress that prompts the second look and urges a stepping back from this hurly-burly of modern medical research. From our not-so-long-ago struggles as foragers on the African plains to our more recent adventures as Big Bang photographers and genetic code breakers, we as Homo sapiens have been transforming ourselves and transforming our planet not by digging ever more deeply into the details of immediate biological need—a clinging to what we as a species have already known. On the contrary, we have built up our intelligence, awakened our consciousness, recast our entire world by opening the human gaze ever wider—farther into space, deeper into time, always beyond the immediacy of our former selves. It has been so in our religions, where hunger for the unknown has driven a progression from spirited animation of rocks and trees to an awareness of the cosmic forces shaping our lives and suffusing our world. It has been so in the practice of our arts, where joy with surrounding form has inspired a re-creating brilliance, from cave wall hunts to the music of noble tragedies and divine comedies. It has been so in the maturing of our philosophies, where awe with the abundance passing through each individual has forged an increasingly honest attempt to know ourselves and describe our place. And it has been so in our sciences, where reaches throughout the expanding realms of space, time, matter and energy have brought us to the very edge of paradoxical self-knowledge. We have been busy these last fifty thousand years, busily engaged in broadening our horizons, expanding our context, for it has always been the widening of our understanding—not a narrowing in—that has been the hallmark feature of a stunning transformation we seem only now to be slowly awakening to.

It has been so also in our study of human psychology, where a desire to grasp the essence of mindful experience has inspired cognitive modeling both elegant and sublime—but perhaps not so much nowadays. As one of the newer sciences in name but certainly one of the oldest in practice, psychology stands at the intriguing crossroads of human experience, at the place where the enormity of the world’s substance and structure meets the immediacy of the living creature. This confluence of all and now, object and subject, world and self has piqued the imaginative, constructive efforts of many blazing pioneers—Freud, James, Rogers, Jung, and again that devil Wittgenstein—men who themselves may have struggled with their own individual, social and celestial experience, but ironically enough had something of importance to say about that experience. Alas, it appears to be no longer so. Today the fields of psychology and psychiatry are mostly a business enterprise, their human impact measured out in terms of sessions, hourly rates and prescribed pills, and to keep these splintering disciplines from sounding too market driven, all are backed increasingly by ever more extravagant scholarship and research, useful sounding enough, to be sure, but at heart little more than a subtle propaganda that only the tools and techniques of modern medical science—MRIs, microarrays, Ritalin—can cure the ills of the human condition. Today’s most highly regarded psychiatric researchers are also the great gatherers, the ones who can mix and match ever larger teams of junior assistants, ply them with the latest theory and the most impressive pieces of equipment, sprinkle all with enough grant funding to help the long hours go down, and deliver finally a plausible-sounding catalog of factors and treatments stitched together from seemingly every officially stamped and over-centrifuged idea ever concocted by medical man, and failing in just that one approach only those with the condition might one day miss—a broader vision. As a means for gainful employment and an opportunity to play with the most expensive toys, the specialized fields of human psychology now greatly excel. But as a source for understanding just who we are and what we might one day be—well, that has mostly moved on.

The current face of autism is indeed a gruesome sight, but only because of the way we have chosen to see it. In one of the greater ironies of human judgment, we have been measuring the souls and capacities of autistic individuals against yardsticks of human normalcy, without recognizing that upon this planet we as a species have turned so astoundingly abnormal ourselves. We are now aloof strangers to our former animal selves, we speak abstract, expanding languages incomprehensible to any other beast, and we persevere in many unusual, world-altering behaviors—some subtle, some grand—all running counter to the expectations of our own biology and to the course of millions of years of evolutionary prelude. The traits of autism echo in the spaces all around us—in our books, our machines, our behaviors, our lives. The traits of autism echo throughout our known history, in the work and biographies of so many catalyzing individuals, the ones we have occasionally accepted, revered and understood in retrospect, but have nearly always shunned, reviled and misunderstood in their blazing moment. That shunning, reviling and misunderstanding—it is now being applied to the autistic population as a whole, for in our psychiatric will to blindness we are being badly misinformed. The majority of autistic individuals are not mute, they do not self-harm, they are not mentally deficient, they are not emotionally detached, and they most certainly are not in need of institutionalization, the prevailing wisdom clamor all it will. The majority of autistic individuals live quietly and productively among us, and they always have. Struggling through the difficulties of difference, and attaching themselves—some more, some less—to a species they do not easily recognize, autistics have changed the course of human existence by opening to it a strange new vision, one extending far beyond our biological selves and transcending our limited past. Only blindness could make of autism nothing but disorder and disease. On this fast-transforming planet, for this quickly turning species, autism would be much better linked to something more like the voice of God.

Of course, I realize I must be asking far too much from this already enlightened age. With our religions bound up inside selfish prayer, our artists enraptured by slavish imitation, our philosophies content with mere sophistry, and our sciences stuck in a dogma of brain-sufficient mind, we cannot see much beyond ourselves, we cannot look outside what we think we already know, and thus it is we suffer the illness of entrapment inside immediacy. To ignore the autistic influence on mankind’s now diversely melded traits is to extinguish the spark of a sudden and very real transformation. To suppress all forms of autistic expression—be they in individuals or in society as a whole—is to deprive this culture of its most forwards-looking glance. And to eradicate the condition itself would be to doom this species to a species’ ultimate fate. Humanity’s current circumstance, wondrous and tenuous all at once, is not the work of normalcy but the result of difference supporting difference. Human mind and consciousness, ancient and modern all together, are not the birth of limited brain but the product of world built around us. Here are the beginnings of the next transforming context, here might be the end of what I have labeled as worldblindness. If not from the despair of disabled society, then perhaps from the hope of individuals willing to look anew and yet a little further on, we might develop once more our taste for destiny, we might learn again to spurn complacency, we might join the many forces of our diverse humanity, and we might conjure again the power of struggling, striving will. This is how it has always been, you know. This is how we have constructed the newer knowledge. This is how we have expanded our hungering reach. And this is how we will dare to embrace a burgeoning awareness of a much broader, more universal and all-suffusing mind—an act of courage which would still be human … oh, so splendidly human.




Copyright © 2007 by Alan Griswold
All rights reserved.