
Stone tools and the evolution of modern human cognition

A recently announced discovery of sophisticated stone tools at the Pinnacle Point site in South Africa pushes the evidence for the appearance of modern human intellectual capacities back to at least 71,000 years ago. The discovery helps narrow what has been seen as a perplexing temporal gap of more than 100,000 years between the earliest fossil remains of anatomically modern humans and the first archaeological evidence, at the beginning of the Upper Paleolithic period, that these people possessed the capacity for fully abstract, symbolic thought, which is the basis of modern human technology, social organization, and culture.

The new report in the journal Nature (Brown et al., November 2012) was prepared by a research team including members from the US, Greece, Australia, and South Africa.

The tools described in the article, or rather their stone components, are called microliths (literally “small stones”). They were manufactured by a technique known as blade production, which is much more efficient and requires a significantly greater degree of cognitive skill than the methods used by pre-modern humans, such as Neanderthals. Blades may be used to assemble compound tools that are much more varied than earlier tool forms and can be designed for highly specialized functions, thus opening the possibility of creating multiple, task-specific tool kits.

The latter characteristic, in particular, likely gave modern humans the capability to easily create tools adapted to new food sources and environments, thereby facilitating their subsequent explosive emigration out of Africa. This new technology was substantially superior to that used by the pre-modern humans then living in Eurasia and gave the new arrivals a marked adaptive advantage.

Old views of human evolution, technology and the brain

The known fossil record indicates that anatomically modern humans – Homo sapiens, i.e., humans like us – evolved from an earlier species of the genus Homo approximately 200,000 years ago in Africa. However, for decades the archaeological record (i.e., artifacts as opposed to fossils) yielded no evidence of a fully modern mental capacity until the beginning of the Upper Paleolithic period, approximately 150,000 years later. Indeed, there are sites at which remains of anatomically modern humans appear to be associated with stone tool technologies equivalent to those of pre-modern humans such as Neanderthals. This temporal gap gave the appearance of a substantial lag between physical modernity and intellectual modernity.

These older specimens of Homo sapiens had brain sizes and general brain morphologies, as deduced from the size and shape of fossil skulls, essentially indistinguishable from those of currently living humans. Mental capabilities do not fossilize, however, and can only be inferred from the material products of human behavior – archaeological artifacts.

The apparent absence of archaeological indications of sophisticated tool production, artwork, and other cultural characteristics of modern humans led to speculation that something was missing in the mental capabilities of early modern humans, something that could not be detected in the fossils, perhaps in the internal architecture of the brain. One possibility is that full linguistic abilities had not yet been achieved. This missing element, whatever it was, was thought to have prevented the cognitive leap that, once it occurred, “opened the floodgates” to the cultural explosion that characterized the Upper Paleolithic, supposedly creating the basis for modern humans to leave Africa and populate the rest of the world.

However, an old dictum in science states that “absence of evidence is not necessarily evidence of absence.” The vagaries of preservation mean that both the archaeological and paleontological records are extremely fragmentary. The factors that preserve evidence in one field of study are not necessarily congruent, in either time or space, with those in another, leading to apparent inconsistencies in the available data. The lack of archaeological evidence for cultural sophistication in early modern humans could be due to failures of preservation, or simply to archaeologists not yet having looked in the right places. Africa is among the least archaeologically investigated regions in the world.

Revolutionizing our understanding of human cultural origins

The Pinnacle Point artifacts are only the latest of a number of finds that have, over the last couple of decades, pushed evidence of complex thought further back in time. Other sites scattered across southern and eastern Africa have yielded shell beads and incised objects in addition to microliths. The beads and incised objects are thought to represent symbolic behavior, something for which there is little or no evidence among earlier human species. The beads apparently carried some meaning with respect to personal identity, and the incised objects may represent abstract images and/or counting. So far, all such artifacts from well-dated contexts are less than 100,000 years old. One site in South Africa, the Klasies River Mouth, contains blade tools and fragments of modern human remains dating to nearly 90,000 years ago. Other African sites containing microliths may be older, but are not securely dated.

The importance of the discovery of well-dated early microliths is that their manufacture and use are clear evidence of advanced human cognitive abilities. All previous stone tool technology was fundamentally reductive. Beginning with a piece of raw material, the tool maker, using various forms of stone hammers and implements of bone, antler, and wood, removes bits, known as flakes, from the original piece, the core, until the final desired shape is achieved. Over the long prior span of stone tool manufacture, since the earliest Oldowan tools dating to 2.6 million years ago, technological advances had certainly been made, but the essential mental concept—reduction—remained the same.

Tool manufacture employing microliths is fundamentally different. The microliths are made from what are called blades. Blades themselves are still produced by reduction, though using a technique that requires very precise planning and control. They are long, thin flakes struck from a prepared core with a high degree of uniformity and in large numbers – arguably the first form of mass production. The technique is also much more efficient than earlier methods of stone tool manufacture, in that a substantially greater amount of “cutting edge” can be obtained from a given quantity of raw material.

The real advantage of blades is that they can be assembled with other materials (e.g., bone, antler, wood) in an additive process to create new tools with sizes and shapes impossible to achieve with stone alone, even when attached to a handle. Blades can be snapped into smaller segments – microliths – further modified, and then embedded into pre-shaped handles to create long cutting edges of various configurations.

One example is the stone sickle, in which many microliths are attached to a long, curved holder. A similarly shaped implement of stone alone would be difficult to manufacture and so fragile that it would be virtually useless. Furthermore, as the individual microliths break or become dull, they can be replaced and the tool as a whole thereby has a much longer useful life. The authors of the Pinnacle Point article suggest that one use of the microliths at that site may have been to manufacture compound arrow or dart points, implying use of the bow or spear thrower, a clear technological advance over hand-held spears.

The manufacturing process outlined by Brown et al. is complex, involving a number of steps that need not have occurred in immediate succession. The sequence is described as including: “(1) collection of silcrete [the lithic raw material] at patchily distributed sources; (2) collection and transport of appropriate wood fuel to heat treatment locations; (3) controlled temperature heat treatment of silcrete [to improve its flaking characteristics]; (4) preparation of microblade cores on silcrete; (5) controlled production of bladelets; (6) reshaping of bladelets into microliths; (7) production of mounts on wood or bone; and (8) adhesion of microliths to form compound tools.”

The degree of skill and knowledge needed to successfully carry out this manufacturing sequence, the authors contend, necessarily implies the use of language to instruct each succeeding generation. In a commentary accompanying the main article, Sally McBrearty, another researcher in the study of early modern human culture, states: “The ability to hold and manipulate operations and images of objects in memory, and to execute goal-directed procedures over space and time, is termed executive function and is an essential component of the modern mind.” Evidence at the Pinnacle Point site indicates that the microlithic industry was in use over a span of more than 10,000 years, meaning that it was a key component of the occupants’ cultural adaptation rather than a brief experiment.

The use of this new technology provides evidence that the people who employed it could create tools whose shapes and functions required a level of conceptualization fundamentally greater than that which previously existed. In a reductive technology, the ultimate product can be visualized as a shape existing within the raw material. It simply has to be “released” by a process of removal. An additive technology requires an understanding of the properties of many different raw materials and the ability to analyze how they can be combined in new ways to achieve novel ends. The additive technology did not replace the older one, but encompassed it, creating a new system that had a qualitatively greater potential, a truly dialectical transformation.

Research during the last two decades has substantially reduced the apparent gap between biological and cultural/behavioral modernity in Homo sapiens, suggesting that the capability for the latter appeared with the former. Nevertheless, much remains to be understood regarding how the process of achieving fully modern cognitive capabilities actually occurred.

Genetic evidence suggests that the evolution of modern humans occurred during a “genetic bottleneck,” when the size of the ancestral population was greatly reduced, perhaps due to severe climatic stress. If that is correct, this population likely came close to extinction. It may well be that a genetic change took place under those critical conditions that allowed our ancestors to conceptualize the world in a new way and gave them the ability to develop the new technologies that were key to their survival. This also means, however, that archaeological sites dating to the critical period are likely to be few and difficult to find.
