In the 1960s, anthropologists turned to the hunter-gatherer way of life as the key to human origins. Several research teams had been studying modern populations of technologically primitive people, particularly in Africa, most notable among whom were the !Kung San (incorrectly called Bushmen). There emerged an image of people in tune with nature, exploiting it in complex ways while respecting it. This vision of humanity coincided well with the environmentalism of the time, but anthropologists were in any case impressed by the complexity and economic security of the mixed economy of hunting and gathering. It was hunting, however, that was emphasized. In 1966, a major anthropological conference entitled “Man the Hunter” was held at the University of Chicago. The overriding tenor of the gathering was simple: hunting made humans human.
Hunting is generally a male responsibility in technologically primitive societies. It is therefore not surprising that the growing awareness of women’s issues in the 1970s threw into question this male-centered explanation of human origins. An alternative hypothesis, known as “Woman the Gatherer,” held that, as in all primate species, the core of society is the bond between female and offspring. On this view, the initiative of human females in inventing technology and in gathering food (principally plants) that could be shared by all was what led to the formation of a complex human society. Or so it was argued.
Although these hypotheses differed in what was claimed as the principal mover in human evolution, all had in common the notion that the Darwinian package of certain valued human characteristics was established right from the beginning: the first hominid species was still thought to possess some degree of bipedalism, technology, and increased brain size. Hominids were therefore cultural creatures—and thus distinct from the rest of nature—right from the start. In recent years, we have come to recognize that this is not the case.
In fact, concrete evidence of the inadequacy of the Darwinian hypothesis is to be found in the archeological record. If the Darwinian package were correct, then we would expect to see the simultaneous appearance in the archeological and fossil records of evidence for bipedality, technology, and increased brain size. We don’t. Just one aspect of the prehistoric record is sufficient to show that the hypothesis is wrong: the record of stone tools.
Unlike bones, which only rarely become fossilized, stone tools are virtually indestructible. Much of the prehistoric record is therefore made up of them, and they are the evidence from which the progress of technology, from its simplest beginnings, is reconstructed.
The earliest examples of such tools—crude flakes, scrapers, and choppers made from pebbles with a few flakes removed—appear in the record about 2.5 million years ago. If the molecular evidence is correct and the first human species appeared some 7 million years ago, then almost 5 million years passed between the time our ancestors became bipedal and the time when they started making stone tools. Whatever the evolutionary force that produced a bipedal ape, it was not linked with the ability to make and use tools. However, many anthropologists believe that the advent of technology 2.5 million years ago did coincide with the beginnings of brain expansion.
The realization that brain expansion and technology are divorced in time from human origins forced anthropologists to rethink their approach. As a result, the latest hypotheses have been framed in biological rather than cultural terms. I consider this a healthy development in the profession—not least because it allows ideas to be tested by comparing them with what we know of the ecology and behavior of other animals. In so doing, we don’t have to deny that Homo sapiens has many special attributes. Instead, we look for the emergence of those attributes from a strictly biological context.
With this