Working and Living

Sixty years ago, the popular story of social anthropology was that it revealed the history of continuous human progress.  Certainly, academic social anthropologists were keen to study ‘primitive’ societies, even though they were getting harder to find.  By the 1960s a major concern was that the groups we might want to research had already been infected by ‘modern civilisation’.   If you were contemplating postgraduate research, the options were already limited:  some might go to the New Guinea highlands, where small tribes still lived largely cut off from the rest of the world, or travel up the Amazon to similarly isolated groups in Colombia and Brazil.  Many anthropologists were still undertaking research in Africa, but there the traditional ways of life were rapidly disappearing.  Enthusiasm for central Australia had waned:  too harsh and too isolated.  Married with young children, I chickened out and decided I’d try studying a company as if it were a primitive society, but that’s another story.

A dominating narrative back then was the progressive story of development.  Primitive groups were evidence of the early, most demanding period of human life, when people lived in small bands, constantly on the move, hunters and gatherers.  It was a marginal life with little material culture.  Survival was the never-ending daily challenge to find enough food, and to fend off predators.  At some point, two changes took place:  the domestication of animals and the emergence of agriculture.  Now people could stay in one location, in settlements, even villages.  Life was still marginal, and tribes had to survive under the continuing threat of adverse weather, elephants stomping on the garden areas, storms, fires and floods, and more.  Improvements in agriculture, from initial slash and burn technologies through to crop rotation, slowly allowed the production of surpluses, and a division of labour between the farmers and all the others in the tribe who made clothes, built huts, and even started producing ornaments and art.  Trade began, specialisation developed, and here we are today.

There was an acceptance, never made too explicit, that life for most primitive peoples was exactly as Hobbes had suggested: “solitary, poor, nasty, brutish and short”.  Every day was dominated by survival, all the time taken up finding enough food, dealing with danger, and even coping with aggressive neighbours whose chosen path to survival included taking what they needed from you and the others around you.  Apart from decorative art, painting and sculpture, cultural activity centred around stories, many of them best described as myths:  explanations of how things had come to pass, and how to stave off disaster by propitiating rather unreliable gods.  Often those myths emphasised the importance of fate, risk, tricksters and loss.

I can’t help thinking that Bronislaw Malinowski, researching in the Pacific, helped cement that view.  Self-interned during the First World War, he set himself up in the Trobriand Islands.  Photographs from his time there must have confirmed what most people knew.  There was Malinowski in his tropical outfit, wearing his pith helmet, surrounded by almost naked savages.  He had a tent, his camera, recording equipment, medicines, all the necessities of life; the islanders lived in flimsy huts, with few possessions.  To add to the image, in his famous book, Argonauts of the Western Pacific, readers discovered that while the islanders did engage in trade, their most prized items were shell necklaces and arm bands, rather than gold ornaments.  For the many who didn’t read the book, his findings were used to confirm the perception that these were ‘textbook’ examples of primitive people, close to savages.

Fifty years later, the story looked very different.  Anthropologists going into the field had ceased observing the groups they visited as if they were specimens on the other side of a pane of glass.  Their task had moved from being simply descriptive to seeking to understand the lived experience of the people they met.  A group of social anthropologists I knew had gone to Colombia precisely because it wasn’t Africa, the part of the world that had dominated so much research.  Stephen and Christine Hugh-Jones, Bernard Arcand and Peter Silverwood-Cope undertook fieldwork among tribes in the upper reaches of the Amazon that had only infrequent contact with the outside world, and their task was ‘participant observation’, not distanced study.  Another of the Cambridge anthropologists from that time, Caroline Humphrey, whose fieldwork was in Siberia, later described the extent to which the Hugh-Joneses set about their task: “They not only learned to hunt, fish, grow vegetables, cook, sing, dance and play instruments like a Barasana, go around dressed only in a G-string, etc., but also that all of that was not a temporary “experience” but a real viable way of life for them. I saw this in Cambridge, after we all returned from the field. The Hugh-Jones’ house in the Gog Magog Hills was like a kind of maloca, with a shifting population of graduate students. Traps and blow-pipes were used for hunting rabbits and pheasants in the surrounding woods; inside, there were hammocks, woven baskets and bags; pet snakes were kept; food was often cooked on open fires, and various substances smoked.”  She wasn’t exaggerating.  My family was the first to rent that house in the late 1960s, with Bernard and Peter staying with us.  Yes, it was different, and occasionally I worried about the jar of curare Peter kept in his room!

Looking back, this was a time of revolution in social anthropology.  Researchers like Stephen and Christine helped abandon the study of other societies as if they could be chopped up into the separate spheres of kinship, ritual, politics, and so on.  Instead, they revealed how people everywhere make sense of their world in an integrated fashion, using logic, aesthetics, and processual understanding to connect all of the elements of their lived experiences.  Not so primitive, after all.  They might have lacked some of what we have gained as a result of the Enlightenment and the growth of science and technology, but they, too, had a complex and sophisticated understanding of the world.  Among the many reasons why this revolution in understanding took place (a function of the structuralist thinking of Lévi-Strauss and the combination of this with the detailed fieldwork developed by Edmund Leach), there was one other innovation.  Several social anthropologists undertook their participant studies with partners and even children coming along with them, combining working and living in a way that created both challenges and opportunities.  Famously, David Maybury-Lewis wrote about this in his classic ethnographic study, The Savage and the Innocent, in 1968.

In revealing detail, Christine Hugh-Jones’s contribution to Joan Cassell’s Children in the Field (1987) explains: “It is a curious experience for a woman anthropologist to write about taking children to the field. The subject threatens the dual system, born of sweat and tears, into which we force our adult lives. Attitudes, emotions, time and place, verbal styles, and self-images are all “family” or “professional.” Although we are unitary beings caught in a single stream of time, if we are no good at living out this conceptual divide, our careers disintegrate to the point where we cannot convince ourselves of their reality. We must struggle hard to keep off this slippery slope because our situation makes us experts in the nuances of self-doubt and might-have-beens. In writing about our own children and fieldwork, we not only have to integrate what we have so carefully differentiated, but those of us who have never written about our personal fieldwork experiences before have to undo the systematizing, analyzing, generalizing, and pruning that has transformed our remembered experience for professional purposes.”  Her comments in this very important, rule-revealing piece uncover what would previously have been considered ‘unprofessional’.

What is that rule?  For professionals, there’s a divide between working and living.  There is that nice question: ‘do you live to work, or work to live?’  To pose it as a question is to accept, or at least imply, that these two domains are separate.  Certainly, for much of my working life (see, the distinction is already in place), I would ‘go to work’, a place where family and other aspects of my life would be left outside.  Through the year there might be one or two occasions when the two worlds met:  at the Christmas party, a graduation, a work celebration, or similar event.  However, their infrequency made it clear these moments were exceptions, although I should admit I wasn’t good at following the rules, especially as I grew older, and often had my partner or a child along to enjoy a class, and even take part in a discussion.

However, this isn’t about my wayward behaviour.  Rather, I want to return to that popular view of the lives of others.  For thousands of years, the story went, humans were hunters and gatherers, their life hazardous, almost exclusively spent working.  With the appearance of agriculture and settlements, work slowly dropped from taking 95% of the time to 80%, and at the end of the working day, usually in the evening, family and friends might eat together, tell stories, dance.  But the balance was clear:  you lived to work.  Reading James Suzman’s recent book, Work: A Deep History, from the Stone Age to the Age of Robots, has made me wonder if we have had that wrong.  The common view has been that for most of human history we have lived as hunter-gatherers on fruits, vegetables, nuts, insects, fish, and game.  During that time, it has been taken for granted that work dominated: foraging, surviving “permanently on the edge of starvation, plagued by constant hunger.”  After all, that was what ethnographic studies had shown.  However, Suzman suggests that by the time anthropologists were in the field, agriculturalists, like colonial empire builders, had pushed most foragers out of their ancestral homelands and into a marginal life quite unlike the past.

This is the heart of the matter:  in looking at other societies, we see through the lens of our expectations about what we have, and what others don’t.  We are the beneficiaries of hundreds of thousands of years of development, with ourselves at the pinnacle of what has been achieved.  Suzman says we need to rethink.  Today anthropologists help us understand that the so-called primitive peoples they meet in field studies have complex, intelligent and aesthetically sophisticated cultures, with subtleties we are still exploring.  Indeed, by the time anthropologists were in the upper Amazon, the world outside had already had an impact, as greedy agriculturalists were squeezing their traditional lands.  The evidence suggests that in the past, many generations earlier, those hunters and foragers lived differently:  they had excellent and plentiful food, and time for leisure.  They worked to live, and lived well.

Current research on foragers has shown they acquired their food through only “a modest effort”, leaving them with more free time than most people in advanced industrial societies.  Chemical analysis of bones has demonstrated that early humans weren’t constantly teetering on the brink of starvation.  Rather, they had excellent diets.  How could this be?  Suzman suggests the turning point for early hominids came with their capacity to control fire, which gave them access to a “near-limitless supply of energy”.  Hunter-gatherers didn’t “suffer from systematic dietary deficiencies”, nor did they work themselves to the point of exhaustion without any security.  On the contrary, it was their farming descendants who lived like that.

In trying to get to grips with debates about living versus working, there are two issues that seem important.  One was addressed some 2,500 years ago by Plato, in one of his dialogues (‘The City of Pigs’ in Book II of The Republic).  In this section Socrates skilfully leads his suitably credulous students along as they explore the creation of cities.  Socrates describes an idyllic society, with needs met, and ample time for leisurely conversation.  However, he gently nudges the listeners into constantly asking for more.  This idyllic city ends up meeting an ever-increasing list of wants, to the point that each person’s wants can only be met at the expense of others, ending in war.  This is at the core of modern economics, the theory that human beings have infinite needs and wants but only a limited quantity of resources.  Economics studies how choices are made under the constraints of scarcity, to the point where some of us can satisfy a few more of our desires only by compromising others’ ability to meet their own needs.  We are left to conclude that people will always want more, and that our task as humans always has been, and always will be, to struggle out of penury and acquire more things.

This account of human nature, which underpins the standard economic perspective, is precisely what Suzman’s anthropological evidence allows him to reject.  He argues the scarcity postulate applies only to a limited period of humanity’s existence.  For most of our history, humans have considered their material needs to be limited.  Families divided up the work required to meet those needs, and when the work was done, they called it a day.  When there was an abundance of goods, they saw this as an excuse to throw gigantic parties.  In many cultures, giving away or even ritualistically destroying one’s possessions at festivals has been a common way to show one’s worth.  That people all over the world continue to spend their meagre incomes on elaborate marriage celebrations and funerals is something mainstream economists can understand only as anomalous and rather perverse.

That might make sense, but what about Socrates’ point, that we always want more?  Why wasn’t that the case for early foragers?  One answer is obvious:  if you were an itinerant forager, all you could have was all you could carry.  There simply wasn’t the capacity to obtain more and more.  Key is the distinction between needs and wants.  We do need things, but today much of what we have can more sensibly be described as wants (even if teenage children are often portrayed as ‘needing wheels’).  John Maynard Keynes suggested we could do more than merely stabilize Western economies:  we could advance beyond them to a post-scarcity society in which economic concerns largely faded from view.  To conceive of this alternative, Keynes asserted, economists would have to reconsider, recognising desires as coming in two types, which he called ‘absolute’ needs and ‘relative’ wants.  For a city dweller, for instance, absolute needs might include things like clean water, an apartment, leisure as well as working clothes, and an annual bus pass.  Relative wants, by contrast, refer to items like Gucci loafers and an Ivy League education.

Sadly mistaken, Keynes predicted that by his grandchildren’s time we would have such an immense quantity of buildings, machines, and skills as to overcome any real scarcity of resources with respect to meeting our needs (even including new ones like the 21st-century need for a smartphone).  Many wants might remain unfulfilled, but in Keynes’s view, wants are about status:  giving everyone Gucci loafers won’t help, since they’re worthless as status symbols once everybody has a pair.  Only reducing levels of inequality would relieve society-wide status anxieties, since everyone’s relative position would then matter much less. In his hopeful, idealised world, Keynes argued people would stop striving so hard, instead devoting their energy to a variety of “non-economic purposes.”  He thought in a future post-scarcity society, people would work just 15 hours a week, and then mostly for the pleasure of it.

Are these just academic debate topics for social anthropologists and economists?  Can we create Keynes’ post-scarcity world, or are we stuck in Plato’s, where getting more is evidence of success?  It’s hard not to be pessimistic.  As I look around, it seems we’re losing the ability to know when we have enough; even leisure time requires more and more stuff.  As I grow older, I’m changing, no longer living to work, to the point where I work (as little as possible) to live.  The wisdom of age?  Maybe.  As I see it, time for living a good life is far sweeter than working long hours to buy the latest version of a seldom-used kitchen gadget!
