CULTURAL FACTORS

Several biocultural factors helped make us civilized, sentient human beings. Here we examine briefly and more or less chronologically a half-dozen such factors in relatively recent human evolution: the dawning of rudimentary technology, the discovery of useful fire, the development of symbolic language, the emergence of observational science, the search for a worldview, and the recognition of self-consciousness. Undoubtedly many other factors also helped make us thoughtful, cultured humans, but practical advances and critical thinking such as these are representative of the more important ones.

Technology Anthropologists concur that our ancestors of the past few million years must have survived mainly by hunting, gathering, and scavenging food. The habits of pursuing and eating meat and other high-protein fare were likely carried from the forests to the savanna, where they were reinforced by the relative scarcity of fruit on the open plains. Though most inhabitants of modern civilization no longer regard themselves as hunter-gatherers, this was indeed the job description of all our ancestors from ~3 million years ago until the rise of agriculture ~10,000 years ago. (Then again, who is to say that we don’t do the same today when shopping at the supermarket: hunting up and down the aisles, gathering food into our baskets, and scavenging for the best parts of animals killed by someone else.)

How do we know that early humans, even the advanced australopithecines, hunted? The evidence is twofold, both parts found in the fossil record. First, scattered bones of a variety of large animals are often found near those of our ancestors at many dwelling sites along the East African Rift Valley. The former’s bones comprise not intact skeletons, but rather strewn debris suggestive more of a picnic than a natural death. Second and more convincing, tools made from stones are often found alongside the remains of 2-million-year-old australopithecines, as well as those of all the more recent human species. Those stone artifacts have endured for millions of years, and it seems reasonable to suppose that even earlier ancestors used wooden tools that didn’t endure.

Judging by the shapes of the stone tools unearthed at Olduvai Gorge (see Figure 7.17), many of these coin-sized implements were used to chop, cut, and prepare food for easy consumption. Many others, though, evoke weapons, especially the larger, rounded stones probably used to maim or kill when thrown, much as modern chimps occasionally do. Still other stones resemble club heads, and they were probably used for exactly that—hunting by killing with a club of some sort. These were possibly the type of “tools” wielded by the advanced A. africanus (or H. habilis) to exterminate its relative, A. boisei, ~1 million years ago. Whatever their use, these primitive implements evince the beginnings of the technological society that we all now share.

FIGURE 7.17 — These are some of the primitive stone tools unearthed at the 2-million-year-old dwelling site at Olduvai Gorge. Whether they were used by advanced australopithecines (africanus) or early Homo (habilis) remains to be determined. (Smithsonian)

Stones were used not only for tools and weapons. They also provided foundations for early homes. Another 2-million-year-old site in Olduvai Gorge, for example, contains a circular stone structure conjectured to have been the base of a hut of some sort. This kind of primal stonework predates the so-called Stone Age—a time when our ancestors not only rearranged rocks but also broke them apart into implements far more useful than mere rockpiles.

Depending upon the place excavated, the Stone Age spans a period from roughly 1 million to hardly 10,000 years ago, after which the Bronze and Iron Ages continued until Renaissance times only 500 years ago. Distinguished by increasingly intricate stoneware, including various types of handaxes, cleavers, spatulas, and scrapers, the Stone Age displays a steady transition from rather crude tools to more finished ones alongside the advancing fossil record of biological species. Hence the early Stone Age is historically associated with the onset of Homo, and toolmaking itself must have accelerated the evolution of the first true human beings.

Much of the long-ago construction of stone implements preceded the enlargement of the brain. The earliest stone-wielding creatures had brains just a little larger than those of modern chimps, not much bigger than 500 cm3. Undoubtedly, tool use and bipedal posture—the latter freeing the hands for manual dexterity—were powerful evolutionary advances: fundamental changes that triggered whole new opportunities for living. Unbeknownst to our ancestors of the time, their tool-like chips of rock were the start of a manufacturing society, a technological culture. The difference between stony spoons and jumbo jets is only a matter of degree.

To satisfy our modern need to categorize, anthropologists give labels to the times when different technologies were introduced into human life. Table 7-2 lists some well-known ages that document the steady rise of technology. The intervals are approximate, as many of the ages overlap. For example, natural metals were undoubtedly discovered in the course of working with some stone ores; crude metals were thus probably used well before the formal start of the Metal Age noted in the table. Likewise, humans knew much about the atom even prior to the development of thermonuclear weapons and the Atomic Age a half-century ago. Furthermore, some researchers prefer to subdivide the ages listed, for instance by splitting the Metal Age into the Copper, Bronze, and Iron Ages. Thus one age was not simply and suddenly replaced by another; rather, the transitions were gradual and intersecting, much as for any evolutionary process.

The threshold of technology, then, is hard to pinpoint exactly, but it was probably crossed >1 million years ago. The beginnings of cultural, as opposed to utilitarian, activities may be nearly as old, for brightly colored mineral pigments have also been found alongside skeletons of the earliest of the true humans. Even the advanced australopithecines may have had some use for ritual, as geometrically arranged pebbles are often found near their remains.

Only toward the end of the Stone Age do more industrious and sophisticated activities become evident, when a genuine outburst of cultural evolution brought forth a wealth of arts and crafts—a wave of abrupt cultural change so dramatic as to be sometimes termed “culture’s big bang.” Technological advances such as the standardized blade tools crafted in central Europe nearly 50,000 years ago were matched by cultural advances such as the oldest deliberate burials in certain European and Asian locales nearly 60,000 years ago, as well as the onset of stunning prehistoric art on the cave walls of western Europe and elaborately sculpted figurines and items of personal adornment ~35,000 years ago. Yet not until ~20,000 years ago, well before the dawn of agriculture, did hunter-gatherers of the Middle East give up their migratory lifestyle and settle into villages, where recent archaeological digs have uncovered the remains of stony walls and (carbon-datable) wooden roofs of Late Stone Age huts and a high density of human artifacts. These were uniquely human inventions, unarguable signs of behavioral modernity—cultural products of Homo sapiens, including some Neandertal and especially Cro-Magnon peoples.

Fire The discovery of fire’s usefulness is perhaps foremost among those factors that accelerated culture; its management was of crucial import in the hominid toolkit. Our ancestors have been using fire for light, heat, and probably protection from predators for nearly a million years. Archaeological excavations of caves in France have revealed blackened hearths at least that old. The general benefit of fire, then, was welcomed a very long time ago, at least insofar as it provided warmth in colder climates. But it seems that its functional practicality went unrecognized until more recently. The cooking of food, for example, arose only ~200,000 years ago. Techniques to fire-harden spears and anneal cutting stones are likely to have been still newer inventions. Significantly, the widespread use of fire, especially for utilitarian purposes, was one of the last great steps in the domestication of humans.

Beginning not quite 10,000 years ago, our ancestors learned to extract iron from ore and convert silica into glass (at ~1800 K), as well as to cast copper (~1400 K) and harden pottery (~1200 K). Many other industrial uses were realized for fire, including the fabrication of new tools and the construction of clay homes. However, archaeologists disagree about the temporal ordering of many of these basic advances. Just when and how one invention paved the way for another is so far unclear. Some researchers argue that, after heat and light, baking clay to make pots was the oldest organized use of fire, even predating the regular cooking of food. As depicted in Figure 7.18, pottery or ceramics is a technique whereby clay is changed back into stone by dehydrating and heating it to a high enough temperature to bond the fine mineral particles. Others maintain that cooking was established first, since the earliest uses for pottery must have been for cooking and storing food.

FIGURE 7.18 — Pottery is one of the oldest cultural uses of fire. This modern mud-brick village in Afghanistan features a pottery kiln in the foreground that is similar to those dating back ~10,000 years. (Smithsonian)

When cultural advances are intimately linked in this way, which was the cause and which the effect is seldom clear. Both the motivation for and the exact time of an invention are usually hard to establish. Some of the pivotal steps may never be pinned down. But of one thing we can be sure: Scores of new mechanical and chemical uses of fire were mastered during the past 10,000 years, and a few may have been discovered well before that.

Ancient clay, metal, and glass products can still be found in the bazaars and workshops of Afghanistan, China, Iraq, Thailand, Turkey, and other generally Middle Eastern and Asian countries. More modern results of these early technologies are evident in the cities of steel, concrete, and plastic surrounding many of us in the 21st century. Cultural changes of this sort are not without problems, however. Increasing energy expenditure, in the form of fire and its many derivatives, has often been accompanied by grim aftereffects, not least the environmental pollution and energy shortages only now becoming evident in our contemporary world.

Language Development of language—that marriage of speech, cognition, and perhaps a dose of emotion—was another central factor in making us cultured, indeed a unique feature in making us human. Some psychologists regard language as synonymous with intelligence, or at least the “jewel in the crown of cognition.” Others—echoing the 19th-century Linguistic Society of Paris, which famously banned discussion of the origin of language—regard the subject even today as too inconclusive for empirical study, if only because linguistic behavior doesn’t fossilize. At the least, human language would seem to be our main evolutionary legacy—perhaps the most dramatically new trait to emerge since the Cambrian explosion ~0.5 billion years ago. The flowering of grammar, syntax, and higher intellectual function coincides with the change, only ~50,000 years ago, of anatomically modern humans into those who were behaviorally modern. But the roots of language are likely older than that—and together with the even earlier emancipation of the hominids from the safety of the trees, the development of language probably had a greater effect than either tools or bipedalism on the rapid growth of the human brain.

The record is unclear as to which—tool use or linguistic skill—had the greater effect. Early communication probably related in complex ways to hunting and tooling; a jumbled feedback mechanism likely led to the enhancement of both. After all, symbolic communication, such as sign language and arm gestures (that is, “body language”), would have granted obvious advantages in coordinated, big-game hunting. Likewise, communication of some sort was likely needed to convey such basic skills as toolmaking, tool use, and weaponry. Equally important, language ensured that experience, stored in the brain as memory, could be passed down from one generation to another. But when did language change from grunts, groans, and hand-waving to the grammatically spoken word? As with much else in our cosmic-evolutionary story, it’s often a matter of degree—shades of gray again: Did language emerge gradually over millions of years or did it originate suddenly when needed much more recently?

Though anthropologists have little direct evidence, some type of primitive language could have been employed a million or more years ago. Linguists have tried to estimate the time needed to account for the roots of world-wide linguistic diversity among the ~5000 currently spoken languages, and they usually find a minimum of 100,000 years (even though the hominid breathing and swallowing plumbing was ready for nasalized speech much earlier). Fully articulate language surely came later than that, though again its origin and evolution are hard to pinpoint with any accuracy. The just-noted 50,000-year-old blossoming of intellectualism—archaeology’s “50K bloom”—is commonly supposed to have been caused by a combination of environmental change and genetic mutation—a classic biological evolutionary advance—that then appended modern language and artistic expression to the behavioral repertoire of a species that had already had a big brain for >100,000 years. Or, by contrast, did language and art emerge solely as a cultural invention, like agriculture or pottery, having much less to do with biology?

To reiterate the generally accepted scenario: Anatomically modern Homo sapiens, who looked like us and had brains virtually the same size as ours, resided in Africa ~150,000 years ago; thereafter, while remaining the same human species, they became behaviorally modern—dreamers, thinkers, and artists—by ~50,000 years ago, mostly in Europe. Whether this really was a sudden (punctuated) flowering or a more classical Darwinian (gradual) change is currently unknown. Work by archaeologists on this key transition period is underrated, underfunded, and under attack.

Artifacts made by early humans provide more clues to the cognitive and manual skills prerequisite to such advanced communication as speaking and writing. Excavations of caves, mostly in southern Europe, have revealed a wealth of small statues and bones having distinctive markings or etchings. Predating the most famous and beautiful of the ice-age art depicting buffalos and horses on cave walls—the expertly rendered lifelike paintings at Chauvet (~32,000 years old) and Lascaux (~17,000 years old) in France—the oldest of these carvings date back ~50,000 years, including a few stone and bone statuettes that almost surely display symbolic functions (Figure 7.19). Since Neandertal Man’s larynx couldn’t possibly have uttered the full range of sounds of modern, articulate speech, the etchings on these objects are thought by many archaeologists to represent a primitive mode of communication. This idea has been reinforced recently by close examination showing the same repertoire of distinct markings repeated on many of the statues and bones. Apparently, the markings are neither accidental nor merely decorative; they go beyond art-for-art’s-sake and are indeed symbolic. Though sketchy, doodling cave and bone art might date back thrice the age of Lascaux’s paintings, these engraved stone artifacts are among the earliest known attempts to better—and to record—the body signaling and mental reasoning used not only by modern chimps today but also long ago by our ancestral australopithecines.

FIGURE 7.19 — This ~3-cm (1-inch) carved and chipped reindeer rib, dated to be ~30,000 years old, is one of several found by archaeological expeditions that display systematic engravings. Likely more than just the doodlings of an early artist, this artifact probably had a symbolic function of some kind. (Smithsonian)

Credit for being the first to write texts usually goes to the Sumerians, those residents of a flat plain bordering the Persian Gulf in what was once called Mesopotamia (Figure 7.20). Nearly 6000 years ago, this ancient civilization created an intricate system of numerals, pictures, and abstract cuneiform symbols. Thousands of baked (hence durable) clay tablets have now been unearthed, and each shows that a stylus of wood or bone was used to inscribe a variety of characters. Estimates of the Sumerians’ basic vocabulary suggest no fewer than 400 separate signs, each denoting a word or syllable. Animals such as the fish and wolf, as well as equipment such as the chariot and the sledge, are clearly depicted, though some of the earliest Sumerian tablets remain undeciphered. Because they display more than just pictures, the messages on these tablets already represent a reasonably advanced stage in the evolution of writing. Earlier civilizations perhaps responsible for inventing some of the Sumer symbols probably wrote exclusively on papyrus or wood that decayed long ago. Suggestively, modern computer studies that compare shared linguistic roots of known languages and likely rates of divergence among them (much like biology’s tree of life) have recently revealed that the forerunner of the Indo-European languages—including English and all the Germanic, Slavic, and Romance languages, with Hittite among its earliest offshoots—might well have been common among the Neolithic farmers of present-day Turkey as long ago as 9000 years.
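
The arithmetic behind such divergence estimates can be sketched simply. Classic glottochronology (after Morris Swadesh) assumes that a language retains a roughly constant fraction of its core vocabulary, conventionally ~86% per millennium, so the share of cognates that two related languages still hold in common decays predictably with the time since their split. The Python sketch below illustrates only this textbook decay law; the cognate fractions are invented examples, and the modern studies cited above use far more elaborate Bayesian tree-building methods:

    import math

    # Classic glottochronology: two languages descending from a common
    # ancestor each retain a fraction r of a standard core-word list per
    # millennium, so after t millennia they share c = r**(2*t) cognates.
    # Inverting gives the time since the split.
    def divergence_millennia(shared_cognates, retention=0.86):
        """Millennia since two languages split, from their shared-cognate fraction."""
        return math.log(shared_cognates) / (2.0 * math.log(retention))

    # Illustrative values only -- not measurements from the studies above.
    for c in (0.70, 0.30, 0.07):
        print(f"{c:.0%} cognates shared -> split ~{divergence_millennia(c):.1f} millennia ago")

Note that a shared fraction of only ~7% already implies a split nearly nine millennia in the past, the same order as the 9000-year figure just quoted.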

FIGURE 7.20 — The cradle of modern civilization was probably located in and around the Tigris and Euphrates Rivers.

The prevailing view among anthropologists is that writing evolved from the concrete to the abstract, from that needed for survival to the more literary. Gradually, between ~50,000 and ~5000 years ago, the pictures and etchings on bones, statues, and cave walls became increasingly schematic, using a single symbol to represent an entire idea. This advance could have been deliberate, in order to speed the process of record keeping, or it could have been the result of shorthand or carelessness on the part of ancient scribes. Either way, it suggests that pictures preceded symbolic writing, which in turn led to the alphabetic prose of modern times, much as on the pages of this Web site.

Science Statue and bone markings might also reflect the origin of ancient science. Some of the etchings of ~40,000 years ago seem to track the periodic lunar cycle, depicting the phases of the Moon—perhaps among the first attempts to keep track of the seasons. Engravings of this sort, as well as large murals on the walls of caves, suggest that people of the late Stone Age were conscious of seasonal variations in plants and animals. While some archaeologists prefer to interpret these deliberate bone markings as simple arithmetical games, these artifacts may well have been among the very first calendars—in effect, systematic timekeeping instruments.
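
To see why a simple tally can serve as a timekeeping instrument, consider the arithmetic such etchings would embody: the Moon’s phases repeat every ~29.53 days (the synodic month), so a running count of nights, taken modulo that period, fixes the current phase. Here is a minimal Python sketch; the eightfold division of the cycle and the modern phase names are illustrative conventions, not claims about what the engravers intended:

    SYNODIC_MONTH = 29.53059  # days between successive new moons

    PHASES = ["new", "waxing crescent", "first quarter", "waxing gibbous",
              "full", "waning gibbous", "last quarter", "waning crescent"]

    def phase_from_tally(marks_since_new_moon):
        """Map a count of nightly tally marks onto the current lunar phase."""
        age = marks_since_new_moon % SYNODIC_MONTH   # days into the current cycle
        return PHASES[int(age / SYNODIC_MONTH * 8) % 8]

    print(phase_from_tally(15))   # ~15 nights after new moon -> "full"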

The earliest written record unambiguously describing the use of scientific instruments doesn’t appear until many thousands of years later. Hieroglyphics dating back ~3000 years partially document the Egyptians’ knowledge of the sundial—which is really a clock, telling the time of day by noting the angle of the Sun’s shadow—but the oldest sundial excavated to date is a Greco-Roman piece of stone built ~2300 years ago. The Greek and Roman sundials reveal key refinements over the earlier Egyptian models, marking both the hour of the day and the day of the year (though neither the duration of an “hour” nor the design of the calendar was the same as now). These early scientific tools are significant in that they could predict the orderliness of daily, seasonal, and yearly changes on Earth, thus mapping the basic rhythm of the farming cycle. Sundials were among the instruments used by the Greek geometer Eratosthenes to measure the size and shape of the Earth, to a stunning accuracy of ~1%, some 22 centuries ago.
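
Eratosthenes’ feat required nothing more than the geometry of shadows. At noon on the summer solstice the Sun shone straight down a well at Syene while casting a shadow angle of ~7.2 degrees (1/50 of a full circle) at Alexandria, so Earth’s circumference must be ~50 times the distance between the two cities. A quick sketch of that arithmetic follows; the ~800-km figure is one common modern conversion of his 5000 stadia, and because the exact length of the ancient stade is debated, the quoted ~1% accuracy depends on which conversion is adopted:

    # Eratosthenes' shadow geometry: the 7.2-degree noon shadow at
    # Alexandria is 1/50 of a circle, so the Alexandria-Syene distance
    # is 1/50 of Earth's circumference.
    shadow_angle_deg = 7.2    # noon solstice shadow angle at Alexandria
    distance_km = 800.0       # Alexandria to Syene (~5000 stadia, one modern conversion)

    circumference_km = (360.0 / shadow_angle_deg) * distance_km
    print(f"Estimated circumference: {circumference_km:,.0f} km")  # ~40,000 km
    # For comparison, the modern polar circumference is ~40,008 km.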

Megalithic monuments, the most famous of which include the pyramids of Egypt and Stonehenge in Britain, can be used to predict the first day of summer, and possibly solar eclipses as well, by the alignment of big stones with the rising and setting of certain celestial objects. Many other such stone structures, though not quite as grand, are scattered across Europe, Asia, and the Americas. In what is now Mexico and the Yucatan, ancient “temple” pyramids have miniature portholes through which celestial objects, particularly Venus, can be viewed at propitious times of the year. Astronomers of the Mayan civilization, contemporaries of Europe’s Middle Ages, were essentially priests, with the destinies of individuals, cities, and even whole nations apparently determined by the position and movement of celestial objects across the sky (Figure 7.21(a)).
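
The astronomy behind such alignments is easy to check. For a site at latitude phi, an object of declination delta rises at an azimuth A east of north given by cos A = sin(delta)/cos(phi), a simple form that neglects atmospheric refraction and the height of the local horizon, each of which shifts the rising point by a degree or so. A minimal Python sketch, applied to Stonehenge’s latitude with today’s solar declination at the June solstice:

    import math

    def solstice_sunrise_azimuth(latitude_deg, declination_deg=23.44):
        """Azimuth (degrees east of north) at which the Sun rises,
        ignoring refraction and horizon elevation."""
        phi = math.radians(latitude_deg)
        delta = math.radians(declination_deg)
        return math.degrees(math.acos(math.sin(delta) / math.cos(phi)))

    # Stonehenge sits at ~51.18 deg N: the solstice Sun rises ~51 deg east
    # of north, roughly along the monument's famous Heel Stone axis.  (When
    # Stonehenge was built, Earth's tilt was nearer 24 deg, shifting this slightly.)
    print(f"{solstice_sunrise_azimuth(51.18):.1f} deg east of north")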

FIGURE 7.21 — (a) Mayan ruins at Caracol in Yucatan, Mexico, suggest that indigenous peoples had a celestial calendar more sophisticated than that of the Spaniards who eventually conquered them. In those early “observatories” (or temples), dated ~750 A.D., bright stars and planets would have appeared through small window slits at special times of the year. (b) The Big Horn Medicine Wheel in Wyoming, built by the Plains Indians, has spokes and other features that roughly align with risings and settings of the Sun and other stars. (UNAM)

By ~1000 years ago, these Central American cultures had probably influenced many tribes of North American Indians roaming the plains of what is now the western United States and Canada. Recent studies of numerous “medicine-wheel” structures, made of boulders arranged in various patterns of rings and spokes, raise the clear possibility of such cross-cultural ties (Figure 7.21(b)). Though the actual use of these stone contours is unclear—most Indians had no written language and hence left no historical record—some archaeologists regard those structures as having been another kind of calendar that marked the rising of the Sun at certain times of the year. Other researchers disagree, interpreting the stone “wheels” as more symbolic than practical. At least one of those tribes was looking up at the time, for it recorded on a rock face in Arizona the 11th-century supernova event that created the Crab Nebula, as noted earlier in the STELLAR EPOCH.

Thus, although modern scientific research, including exquisite instruments like the immensely useful microscope and telescope, dates back hardly more than 400 years, we can be sure that pre-Renaissance doyens were adept in elementary astronomy, Euclidean geometry, mechanical engineering, and many other practical ventures. Indeed, the roots of technology go far back. Our ancestors of millennia ago seem to have been a good deal more technically sophisticated than many modern researchers, until recently, have cared to acknowledge.

Storytelling Mayan astronomer-priests of ancient Mexico bring to mind another factor that aided the cultural evolution of humans—a lofty factor that helps us explore who we are and whence we came. The epitome of culture is the search for truth, or at least a reasonable approximation of reality, most notably the need to know ourselves and the world around us. Both the desire and the ability to undertake this search identify us as humans, distinguishing us from all other known life forms. To be sure, rational understanding of Nature is the major goal of science, though, admittedly, science shares this goal with other disciplines. Religion, the arts, and philosophy, among other endeavors, all represent alternative efforts to appreciate our origin and being.

For thousands of years, humans have realized that the best way to dispel mystery is to understand it. Cave paintings in southern France, some 20,000 years old, may be the oldest traces of early magico-religious ceremonies held in the Earth’s dim recesses. These cave-wall images seem to depict rites in which elaborate myths perhaps linked the men who hunted and the animals they killed.

Reliance on the supernatural and the use of mystical activities are more clearly documented near the very beginning of civilized history. Sumerian inscriptions of ~5000 years ago record myths explaining how gods had created humans to be their slaves. This system guaranteed that food, clothing, and other necessities of life would be provided to priestly households or temples in order to please the gods, or at least appease them. Such a society divided managers from the managed, priests from the plebeians. Apparently, anyone professing knowledge of even the simplest celestial events was able to subjugate the masses. Those who foreknew the seasons, for instance, were likely deemed to have a special relationship with the gods and therefore deserved to be obeyed. No longer required to spend time producing their own food, the priest-masters of ancient Sumer were able to develop skills and knowledge far greater than humans had ever before attained.

Such mythmaking, perpetrated on the populace, forced further specialization, while assuring that legions of humans labored on social and technical tasks whose legacy we now recognize in vast irrigation projects and the monumental temple-like structures that even today rise high above the Mesopotamian plain. All of this is clearly documented by Sumerian poetry of ~4000 years ago, which records how the Sumerian religion incorporated a systematic theology of human, worldly, and cosmic phenomena. Aspects of Nature—Sun, Moon, storms, thunder, and so on—were personified, with humanesque beings playing the roles of gods in a divine political society, the whole of which was ruled by the resident god of the sky.

Such a set of beliefs proved powerful, for the system was complex and the populace uninquiring. For several thousand years, the priests of Mesopotamia bamboozled a largely illiterate public with increasingly intricate speculations. Even the surrounding barbarians, the ancestors of the ancient Greeks, Romans, Celts, Germans, and Slavs, were convinced that the gods of Sumer ruled the world. Apparently, myths become truths if upheld long enough.

In some ways, a stratified social system helped maintain cohesion and uniformity; priestly leadership offered a stabilizing influence, as do all religions in principle. But Sumerian inscriptions also tell of arguments, often prompted by water-rights disputes and fostered by rival religious factions and coalitions, that gradually became life-or-death struggles. By ~3000 years ago, dozens of Mesopotamian cities were armed to the hilt, with military organizations rivaling priestly governments. Presumably, the concept of kingship originated when Sumerian clerics decreed that the gods needed a representative among men—a strong-armed chief priest of sorts to adjudicate quarrels or to crush the opposition. Not long thereafter, clustered societies in the form of those cities, united under the influence of one king or one god, began demanding adherence to this, that, or another creed, the objective being to dictate to our ancestors who they were, where they came from, and how they fit into the bigger scheme of things.

Today’s plethora of ~10,000 differing religions and philosophies testifies to the fact that theological beliefs and resolute ideas are not subject to experimentation, not open to objective testing, and thus will never be universally accepted. Such dogma offers stabilization locally and temporarily, perhaps, but how can these ideologies fail to destabilize globally, especially given conflicting sects and so many variant viewpoints in our modern world? The upshot is that neither belief alone nor thought alone can ever make the unknown known.

Ultimate reliance on the authority of the experimental test is the one key feature that clearly distinguishes the scientific enterprise from all other ways of describing Nature. Initiated in ancient Greece and used only sparingly throughout the subsequent centuries—most notably by the likes of Aristotle and Archimedes—the scientific method rapidly blossomed during Renaissance times to become one of the primary criteria in our search to deeply understand the material Universe. Today, it is the primary criterion used to generate the millennial worldview of cosmic evolution—not a tradition of revealed beliefs demanded of people, but a chronology of discovered events offered to them. To be sure, cosmic evolution is still very much a story, yet one that is based on empirical tests and real data that bolster our thoughts and ideas; as such, it has the potential to become the greatest story ever told.

Consciousness If the epitome of culture is our ability to seek the truth about ourselves and our world, then an even more startling advance concerns humankind’s desire to know the truth—to manifest curiosity about our origin, being, and destiny in the wider Universe. We are unique among all known living things in pondering who we are, whence we came, and where we might be headed. Just what is it that allows us, even drives us, not only to ask the fundamental questions, but also to attempt proactively to find solutions to them? The answer seems embodied within that hard-to-define refinement of the mind called consciousness, that part of human nature that permits us to wonder, to introspect, to abstract, to explain—the ability to step back, perceive the big picture, and examine how our existence relates to the existence of all things.

How did consciousness originate? When did humans become aware of themselves? Is consciousness a natural, perhaps even inevitable, consequence of neurological evolution? Some cognitive scientists think so, but they cannot yet prove it. They maintain that consciousness equates with “mind” and that minds are just what brains do. That is, the mind, or consciousness, is simply the normal functioning of a structured brain—the whole brain, and not necessarily a seat of consciousness at any localized neural site. Nothing “extra” infuses the mind or consciousness, just as no vitalism or élan vital partakes of life. Consciousness is merely accumulated experiences and the way that the brain functionally utilizes them.

Other researchers demur, arguing that some specific, perhaps unlikely mechanism is needed for the development of consciousness. They claim that consciousness is likely to be more than the behavior of a huge collection of interacting neurons and that it may well be confined to isolated areas in the brain. The capacity for imagery and imagination does seem to entail something more than a gradual clustering of neurons—something akin to a holistic property resembling a global neural network that emerges when the whole indeed exceeds the sum of its lightning-quick parts.

Records of ancient history are incomplete and unreliable, making difficult any attempt to document the onset of self-awareness. Some psychologists contend that consciousness, as we know it, is not part of ancestral records until ~3000 years ago. This is about the time when some of the writings in ancient texts became abstract and reflective. People may well have wondered about themselves several millennia ago, but it’s unclear if that was the first time they began doing so. If consciousness did originate so late, we must be prepared to assume that cultures can become highly refined without evolving personal consciousness. Our early ancestors would have had to invent just about every cultural amenity except consciousness and to have lived until quite recently in a dreamlike, essentially unconscious state.

Yet others argue that human consciousness developed much longer ago, well more than merely thousands of years back in time. If modern chimpanzees, which display rudimentary self-awareness, do in fact mimic our australopithecine forebears, then embryonic consciousness could have emerged millions of years ago. To be sure, it may extend back tens or even hundreds of millions of years, since there’s no justification for the widespread assumption that basic awareness is a uniquely human trait. Animal behaviorists have acquired a great deal of evidence showing that simplified consciousness is common among many animals, as amply demonstrated daily by the domesticated dogs and cats, and perhaps even the insects, in our homes.

That full-fledged consciousness evolved at some intermediate time—~30,000 years ago, at about the time of the invention of the bow and arrow—is another popular assertion. Some neurobiologists have emphasized that the extension of the hand beyond the body—even merely the art of throwing, perhaps—might have had profound effects on key brain qualities, such as foresight, planning, and other actions-at-a-distance. Ballistic skills do seem to offer a particularly attractive biocultural route toward the evolution of advanced neural machinery. It would be ironic indeed if out-of-body extensions, in the form of long-distance weapons, were the giant step that finally granted humans the freedom to plan ahead innovatively, to begin to evolve culturally, to wonder about space and time.

A reconciliation of these seemingly divergent views holds that crude consciousness likely originated among the prehuman animal kingdom millions of years ago and that prehistoric humans evolved a better sense of consciousness <1 million or so years ago, but only much more recently did humanity per se become sophisticated enough to reveal that sense of wonder and introspection in their writings. Perhaps. In the absence of objective empirical data and controlled experimental tests, we are left wondering how we learned to wonder—how humankind became so curious about ourselves and our surroundings.

The precise path of human evolution during the past million years is tricky to follow in detail, for the causes of recent evolution include both biological and cultural factors, often the two intertwined as biocultural effects. What’s more, whatever truly made us human involved not only such tangible inventions as noted above, but also the creative products of emotion and imagination—qualities virtually impossible to define objectively. A jumbled feedback system came into play among the increase in brain volume, the discovery of technical skills, the advance of cultural amenities, and the development of verbal communication and social organization. Changes were slow at first, but have markedly accelerated within the past 100,000 years or so. Whatever the reasons, these many innovations have enabled Homo to enjoy unprecedented success as a life form on planet Earth, for we alone can ask fundamental questions—and attempt to answer them.

