Hunting for meat was a critical step in all animal and human evolution. A key brain-trophic element in meat is vitamin B3 / nicotinamide. The supply of meat and nicotinamide steadily increased from the Cambrian origin of animal predators onwards, ratcheting up ever-larger brains. This culminated in the 3-million-year evolution of Homo sapiens and our overall demographic success. We view human evolution, recent history, and agricultural and demographic transitions in the light of meat and nicotinamide intake. A biochemical and immunological switch is highlighted that affects fertility in the ‘de novo’ tryptophan-to-kynurenine-nicotinamide ‘immune tolerance’ pathway. Longevity relates to nicotinamide adenine dinucleotide consumer pathways. High meat intake correlates with moderate fertility, high intelligence, good health, and longevity, with consequent population stability, whereas low meat/high cereal intake (short of starvation) correlates with high fertility, disease, and population booms and busts. With too high a meat intake, fertility falls below replacement levels. Reducing variances in meat consumption might help stabilise population growth and improve human capital.
Malthusian demographic projections predict that the rate of population increase will inevitably (though postponed in man by improvements in agricultural and medical technology) lead to intense competition between individuals for food resources and reproductive success. This led Darwin to his theory of natural selection and the concept of the survival of the fittest. However, was this competition just about calories? We will argue that it was really about an optimal supply of nicotinamide/nicotinamide adenine dinucleotide (NAD), important not only for the energy supply but also for metabolic and genetic regulation and the growth of big brains. The demographic success of humans relative to other omnivorous primates, and of individuals or groups competing with each other, came down to success in obtaining this supply, largely from meat. NAD powered brains with cognitive workspaces and the physical wherewithal to obtain NAD precursors in an ‘NAD World’.
Demographic events that we discuss in the light of meat intake closely relate to Cohen’s ‘Four Evolutions in human population growth’: hunter-gatherers, whose populations doubled over hundreds of thousands of years; early agriculturists, who doubled over thousands of years; later agriculturists, who doubled over hundreds of years; and the modern day, when populations in developing countries can treble within a lifetime.
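Cohen’s four regimes can be translated into implied continuous annual growth rates via r = ln(fold)/years. A minimal sketch in Python; the specific periods used (200 000, 2000, and 200 years to double, and a trebling within a 35-year span) are illustrative assumptions consistent with the orders of magnitude above, not figures from the text:

```python
import math

def annual_growth_rate(fold: float, years: float) -> float:
    """Continuous annual growth rate implied by a fold-increase over a period."""
    return math.log(fold) / years

# Illustrative doubling/trebling periods for Cohen's four regimes
regimes = {
    "hunter-gatherers (double in ~200,000 y)": annual_growth_rate(2, 200_000),
    "early agriculturists (double in ~2,000 y)": annual_growth_rate(2, 2_000),
    "later agriculturists (double in ~200 y)": annual_growth_rate(2, 200),
    "modern developing countries (treble in ~35 y)": annual_growth_rate(3, 35),
}
for label, r in regimes.items():
    print(f"{label}: {100 * r:.4f}% per year")
```

Each regime’s implied rate is roughly an order of magnitude above the last, which is the point of Cohen’s framing.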
Taking an ecological view of human history was first proposed by Aldo Leopold in A Sand County Almanac and taken up by Alfred Crosby in The Columbian Exchange and by his followers, including Anthony McMichael in his recent Climate Change and the Health of Nations. These authors emphasise historical exchanges of food and disease, particularly infectious disease, and the effect of climate on crop yield: here, we expand in more specific detail on the effect of changing nutrition on intelligence, longevity, and reproductive behaviour. A very long perspective helps our argument, beginning with the original rise of the animal kingdom and costly brains.
Meat, NAD, and the Cambrian
Our argument begins with the Cambrian explosion. This era poses ‘Darwin’s dilemma’, as evolution progressed so fast – a veritable gallop of morphological complexity and genomic variation. The Cambrian was, in essence, an explosion of (vertebrate) brains and mineralised skeletons allowing calculated movement and burrowing for food. Consciousness, sentience, primal emotions, qualia, high arousal, and mental maps – whether of sight, touch, sound, pain, taste, or lingering smells – leading to memory maps, necessarily evolved for predation, as did raptorial appendages, in an escalatory arms race. (This evolutionary route was not taken by all species, as many survived on genetically programmed reflex actions that avoid the high costs of complex nervous systems.) Worms that eat bacteria use NAD as a ‘food signal’ to open their mouths, but if NAD is unavailable they stop reproducing and enter a developmental and reproductive arrest phase, mediated by serotonin, to survive. The next evolutionary step was macropredation – worms hunting worms and arthropods hunting fish – and included widespread cannibalism, often at defined developmental phases. Carnivory became pervasive, and our own evolution can be traced from these times. Even the clever invertebrates, such as octopuses, hunted; they had been preceded in the long Precambrian by the earliest animals with recognisable heads, and ancestral neurones and ganglia, some of which developed prey capture as their foraging style.
Rising oxygen levels in the atmosphere would have allowed better adenosine triphosphate (ATP) production from NAD(H) + O2 reactions but preceded the Cambrian. One can take this ‘metabolic acceleration’ argument over the importance of hydrogen metabolism and NAD(H) availability back further to the origins of life, given that NAD is a redox cofactor used by all living organisms; this fits with ideas about the importance of the endosymbiotic acquisition of mitochondria and the arrival of visible multicellular eukaryotes.
Well after the Cambrian, dinosaurs flourished, but a series of volcanic eruptions, followed by a large asteroid that struck the Yucatán 65 million years ago, caused widespread climate change and extinctions; this is thought to have first affected herbivorous prey and consequently brought about the demise of clever, ‘red in tooth and claw’ carnivorous dinosaurs such as Tyrannosaurus rex. Amniotes, then mammals and primates, took the lead again, with improving nutrition for their young (more nicotinamide-riboside in milk and early weaning to animal products) building big brains in an increasingly complex and variable environment that required problem-solving skills aided by social learning and cooperation. Strategic and social hunters have to take account of the habits of their prey and of competitive groups, whether other carnivores or ‘Machiavellian’ members of their own species. Whether one is a proponent of the ecological, social, technological, or general intelligence hypothesis for the evolution of high-energy, high-neuronal-count brains, these ideas overlap if the original task was clever foraging for difficult-to-obtain micronutrients and high-quality energy in a variable environment with new opportunities and new dangers.
Meat, NAD, and Human Evolution
Archaeological and palaeontological evidence indicates that hominins increased meat consumption and developed the necessary fabricated stone tools, while their brains and bodies evolved for a novel foraging niche and hunting range, at least 3 million years ago. This ‘cradle of mankind’ was centred around the Rift Valley in East Africa, where the variable climate and savannah conditions, with reductions in forests and arboreal living for apes, may have required clever and novel foraging in an area where overall prey availability, but also predator danger, was high. Tools helped hunting and butchery and reduced time and effort spent chewing, as did cooking later. Another crucial step may have been the evolution of a cooperative social unit with divisions of labour – big enough to insure against the risks involved in hunting large game and the right size to succeed as an ambush hunter – with the requisite prosocial and altruistic skills to share the spoils across sexes and ages. The ambitious transition from prey to predator, hunting the then-extensive radiation of megaherbivores so big that they are normally considered immune to carnivores, needed advanced individual and social cognition, as humans do not have the usual physical attributes of a top predator. Adult human requirements to run such big brains are impressive enough, but during development they are extraordinarily high, with 80% to 90% of basal metabolic rate consumed in neonates – this is probably not possible after weaning without the use of animal-derived foods.
Such a climb up the food chain is unusual, as most species go for the easy calories, moving down the food chain, as they increase in body size. Sometimes this climb down the food chain can be extreme, with some species, for instance in the order Carnivora, becoming specialist herbivores. Close examples among primates that have taken this approach are gorillas and orangutans, and extinct robust hominids such as Paranthropus – none known for particularly big brains or sociality relative to chimpanzees. Chimpanzees went for a high-quality diet, as did early Homo. Homo sapiens, whose brains enlarged markedly, simultaneously managed high reproductive rates, large dependent children with long developmental periods, and high longevity. Energy constraints were lifted to achieve this suite of costly traits without always incurring large trade-offs, at least when the NAD supply allowed.
As a consequence, man may have been partly responsible for the wave of megafaunal herbivore extinctions starting in Australasia some 50 000 years ago, in a diaspora out of Africa probably driven by hunting parties in search of meat – even if climate and loss of high-protein clover-like foods caused some of the die-off of animals such as the woolly mammoths. Later came the technically more difficult over-fishing and near-extinctions of oceanic megafauna, and extinctions of fauna such as the dodo and giant tortoises less than 20 years after they were discovered by meat-hungry sailors. Present-day near-extinctions of other primates and meso-carnivores hunted for bush-meat leave ‘empty forests’. All this meat eating changed the geological and physical nature of the world long before the onset of agriculture and marks the true beginning of the Anthropocene, where now 90% of mammals are either humans or our domesticates.
Omnivory, including farming, first evolved in insects and later, by convergent evolution, in Eocene primates, for whom it is a defining trait. Our more remote ancestors were herbivores eating grasses, digested by gut symbionts, then frugivores and insectivores eating enough animal protein to supply vitamin B12 and some nicotinamide – either by chance on fruit or more actively, such as by fishing for termites. Hominids and early Homo increased meat intake but remained omnivores, not carnivores, despite being top predators. Finding new plant foods, and the tension between neophobia and neophilia (as laid out in the ‘omnivore’s dilemma’), involved dangers – notably toxicity from plant secondary compounds evolved to deter herbivores. We were helped by evolving a complex enzymic detoxification system (such as cytochrome P450, now also used to metabolise many drugs) that includes detoxification of excess nicotinamide by nicotinamide N-methyltransferase, and by culturally acquired cooking methods that avoid toxicity, for example from cyanogens, and improve the extraction of calories and some micronutrients (including nicotinamide). Cooking helped tenderise meat and made plants more edible and nutritionally available – such as the culturally learnt alkalinisation of maize that releases nicotinamide from its bound form niacytin, a process known as nixtamalization. Processes for storage and preservation, such as smoking, drying, and salting of meat, or fermentation with co-evolved yeasts and lactobacilli (of cereals to produce bread and beer, or of milk to produce cheeses), also improved the nicotinamide supply. Tolerance of milk drinking throughout adult life – a genetic change that followed the culturally led domestication of animals for their milk, with its potent nicotinamide-riboside in particular – may speak for strong selection pressures for nicotinamide.
Being omnivorous has several advantages, an obvious one being the ability to switch diet depending on availability, rather than starving. Diet (and the altered nutrimicrobiome) can affect individual behaviour and increase variability within a population – striking in isogenomic social insect castes but also generally true. Omnivores can use diet to self-regulate behaviour: Argentine ants change from protein-rich to carbohydrate-rich diets when they settle down and increase population size – just as we did.
Plant and Animal Cultivation
A key series of plant and animal pre-domestication gardening or pet-like relationships, followed by co-evolved domestication, happened – including of ourselves. This transition was slower than previously thought – 3000 years to fix the non-shattering spikelet of wheat – but still speaks for significant selection pressures that have not been convincingly explained. Only a tiny proportion of available plants was selected – some 3% of 5000 candidate species, with fewer than 20 plants now providing most of the world’s food supply. Some selection must have been unconscious, from metabolic needs, and some conscious, based on taste, toxicity, harvesting characteristics, or tameness.
Why there was a sudden interest in plants is unclear, even if the when, where, and how are becoming clearer. Farming, for instance, was facilitated by a benign climate at the end of the recent ice age and the warmer, wetter, higher-carbon-dioxide early Holocene 9000 to 12 000 years ago. This heralded the dawn of food production, rather than food gathering, and of urban civilisation – almost simultaneously, and independently, in 8 to 10 major centres. In the context of our 200 000-year history (‘an eerie synchronicity’), this suggests a single driving evolutionary advantage adapted to local circumstances.
We make the case that the change to an agricultural lifestyle co-evolved with diet: the change in diet changed our (tryptophan-nicotinamide) biochemistry and immunology, favouring tolerance of the foetus, increased fertility, and higher, denser populations. Population size and density are important to a social species. More people, particularly young people, enable more division of labour while avoiding labour shortages – even if the price is hard work, more disease, inter-group stress, and social stratification with rich elites. Perhaps an earlier evolutionary pressure for bigger brains relaxed under domestication (brains did atrophy) in favour of interactive social brains and specialisation.
Change in diet changes serotonin and other neurotransmitters, directly and indirectly, through altering the gut microbiome (an overall trend from the Bacteroides to the Prevotella enterotype on more plant-based diets), influencing cognition and mood and potentially increasing social cohesion, collectivism, and sedentism. Cross-infection with symbionts is an advantage of living closer together with less than perfect hygiene when they add to the supply of nutrients, but it risks dysbioses. Increased transmission of pathogens, and species-jumping from domesticates and their vectors, led to the then-emerging ‘crowd’ diseases, such as measles and smallpox, and slash-and-burn agriculture encourages mosquitoes and malaria – explaining why many of these diseases became common at this time. Pathogens and dysbioses have a ‘Malthusian’ role in reducing population size, particularly as they may target NAD-deficient individuals on a poor diet. This could, seen from one perspective, be beneficial – better than starving and damaging the ecological and farming niche, or eating seed corn in desperation and putting eventual recovery at risk. Deliberate and political ecological damage by empires as a major factor in their population collapses, although once a popular alternative hypothesis, has proven difficult to confirm.
The key features of demographic transitions comprise usually (but not always) improving economics, followed by decreased mortality, first in adults and then in infants. Then, nearly always, this is followed by a delayed decline in fertility, from up to 8 births per woman to replacement levels of around 2 (this lag causes the disequilibrium and the population explosion). Population then stabilises, but can implode as fertility drops below replacement levels (if not bolstered by immigration or infertility treatment). This pattern has been, and still is being, replicated in all parts of the world – even if circumstances and timescales differ in detail. The pattern is almost a demographic law, even though the mechanism remains hotly disputed. During the transition period, the population increases, often over a century – although some recent transitions have been faster. A historical minimum is a 2-fold increase, with an average of an 8-fold increase. Fourfold increases are predicted for China and India, but record-breaking 10-fold to 18-fold increases or more (up to 30-fold) are projected in several countries, mainly in sub-Saharan Africa, where meat intake remains very low.
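The arithmetic linking lag length to explosion size can be sketched: compound growth at annual rate r for t years multiplies the population by (1 + r)^t. In the sketch below, the 3% growth rate and 70-year lag are hypothetical illustrative values, chosen only because they are consistent with the century timescale and 8-fold average increase cited above:

```python
def fold_increase(annual_rate: float, years: float) -> float:
    """Population fold-increase from compound growth at `annual_rate` for `years`."""
    return (1 + annual_rate) ** years

# Hypothetical: ~3% annual growth over a ~70-year lag between the mortality
# and fertility declines gives roughly the 8-fold average increase cited
# for historical demographic transitions.
print(round(fold_increase(0.03, 70), 1))  # → 7.9
```

Halving the lag at the same growth rate yields under a 3-fold increase, which is why the speed of the fertility decline dominates the final population size.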
Thomas Doubleday, in his ‘Great General Law’, formulated while contemporaneously observing the United Kingdom’s 19th-century transition, stated that the first decline in fertility happened in the wealthy, whose food habits at that time included far more meat eating; this led him to favour a biological cause for the relative sterility of the rich. We take his argument a step further and argue that more meat in the diet provides the biochemical basis for increased longevity; the usual (but not inevitable) further increase in dose then leads to a decrease in fertility as well as improved cognition and education – which then play their part in further decreases in fertility. The length of the lag between increased longevity and decreased fertility is critical to the size of the population explosion and depends, we think, on the speed at which the meat intake (and nicotinamide dose) increases.
Meat – The Key Ingredient is Nicotinamide
Is meat special? Some cultures recognise a ‘meat hunger’, and meat is a prized food in all cultures, with cattle capital often used as a sign of wealth and for dowries. Great effort and cost go into hunting and into obtaining hunting grounds or pastureland, through raids or warfare when necessary. Domesticated animals even include omnivores, at the cost of their eating fodder, including animal products, let alone crops. We have argued, as have others, that there have always been easier and safer ways of obtaining calories, such as foraging for tubers. Vitamin B12 can only be obtained from animal foods, but an insectivorous diet, even one obtained inadvertently from eating fruit, would have sufficed. Similarly, iron, zinc, and vitamin A are best sourced from meat, but not much is needed to satisfy requirements. Certainly, one does not need to hunt big game. An argument for protein has been found wanting, as most essential amino acid needs can be met from plants. However, a case can be made for needing animal protein for tryptophan, which is otherwise hard to obtain – tryptophan is a source of nicotinamide, and dietary deficiency also has effects on serotonin and therefore social behaviour. We have made the case for vitamin B3 – nicotinamide – as this is the key vitamin that is missing in cases of over-reliance on a single plant, usually maize, with little or no meat. Nicotinamide, as an essential component of NAD/NADH (reduced NAD), drives the electron transport chain, converting the free energy of the electromotive force into a proton gradient across the mitochondrial inner membrane, driving ATP production and controlling pH and other voltage-coupled processes. Simultaneously, many NAD-coupled redox reactions, as well as more recently discovered NAD-consuming reactions, are known to be important for cell development, repair, and ageing: NAD is a master controller of much of the metabolism necessary for running large bodies and brains and restoring stem cells.
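The respiratory-chain step described above can be summarised by the standard textbook half-reaction; the free-energy and ATP-yield figures below are consensus biochemistry values, not taken from this article:

```latex
% Net oxidation of NADH by the electron transport chain:
\mathrm{NADH} + \mathrm{H}^{+} + \tfrac{1}{2}\,\mathrm{O}_{2}
  \;\longrightarrow\; \mathrm{NAD}^{+} + \mathrm{H}_{2}\mathrm{O},
\qquad \Delta G^{\circ\prime} \approx -220\ \mathrm{kJ\,mol^{-1}}
% The released free energy pumps protons across the inner mitochondrial
% membrane, yielding roughly 2.5 ATP per NADH via ATP synthase.
```

Each NADH oxidised thus links the dietary nicotinamide supply directly to the proton gradient and ATP output that the text describes.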
High energy has long been recognised as crucial for expanding ape brain size but has often been considered to be paid for by reducing energy spent on large guts, inefficient locomotion, and chewing. Storing more fat – another human feature – provides a reserve store of calories and nicotinamide without having to resort to auto-carnivory of muscle, liver, or other organs, where the longer-term price is high. Our metabolism runs faster, with higher energy expenditure measured as calories burnt (27% more than chimpanzees, and a greater margin over gorillas): this can only happen by boosting mitochondrial function. Increasing the supply of NAD would be a way of accomplishing such a feat.
Lessons From Pellagra – The Forgotten Meat Deficiency State
The past is never dead. It’s not even past. -William Faulkner
Pellagra causes brain atrophy with consequent low IQ / dementia, poor social behaviour, and gut dysbiosis. It has often been considered an atavistic model of human evolution – in other words, evolution running in reverse as a truly degenerative phenomenon: Homo without the sapiens. Pellagra was known as the ‘lazy’ disease, with emotional, cognitive, and physical stunting alongside many mimics of degenerative diseases (as we define them now). Comments about pellagrins as degenerates were made in the early European literature and in the more recent American literature describing the pellagra-prone southern states around a century ago. Clinically undiagnosed cases had impaired IQ, with average Confederate army recruits scoring in the ‘moron’ group (unlike Union recruits); this was felt to be partly responsible for the ‘white trash’ phenomenon, with such differences being an additional factor both in causing the Civil War and in determining its outcome. Such views fuelled the eugenic movement, as genetics was thought to be causal and it was believed that such people had too many children. The high fecundity observed among sub-clinical pellagrins that caused much of this friction may be relevant to our argument.
Goldberger proved that pellagra was dietary in origin, due to a lack of meat and milk, and neither genetic nor infectious; this eventually influenced thinking that southerners were not genetically degenerate and deserved help. Finally, the biochemical basis and treatment with replacement nicotinic acid were discovered in the 1940s.