1 Introduction and Background

Nutrition is a young science. For thousands of years, foods and herbs were a major component of the armamentarium of the physician and his predecessors. Although built upon much earlier work in the sciences as understood in centuries past (Carpenter 2003), nutrition science came into its own only a century or so ago with the discovery of certain micronutrients that are essential in nature (i.e., they must be provided in the diet) and were called vitamins. These discoveries were largely made in the context of seeking a cure for, or the prevention of, serious diseases such as scurvy, beriberi, pellagra, and rickets. Following the recognition that these diseases were the result of dietary deficiencies, there also came an understanding of the major physiological functions of these nutrients; their other epigenetic and microbiological roles are still being elucidated.

In the mid-twentieth century, as technology advanced and research methodology became more sophisticated, it was discovered that insufficient amounts of some of the “trace” minerals, so called because of the small amounts required and their relative ubiquity in the diet, could also cause human deficiency disease. Zinc is a good illustration: its deficiency was “discovered” only in 1963 and, paradoxically, in the Middle East, where dietary zinc was adequate if considered only in terms of chemical analysis. However, the diet was also high in phytates, owing to the consumption of unleavened bread, a staple of the Middle Eastern diet. The phytates, by binding to zinc in the gut, decreased its bioavailability, creating over time a deficiency syndrome characterized by short stature and delayed puberty, among other signs. Thus began the understanding that other compounds in food can affect nutrient absorption and/or metabolism. The discovery of the essentiality of other trace minerals (copper, selenium, nickel, etc.) soon followed, and in fact the roles of new essential trace nutrients continue to be elucidated as laboratory techniques become ever more sensitive. Another important development is the ongoing discovery of other bioactive compounds in food and the emerging field of the microbiome and its relationship to diet, nutrition, and behavior.

The twenty-first century, following the mapping of the human genome and the rapid development of the relatively new field of epigenetics (in which many of the epigenetic markers are nutrients or their products), promises to elucidate “new” biochemical and physiological functions for many nutrients whose non-genomic roles were understood long ago. Perhaps food, or even selected nutrients, will again be part of the medical armamentarium, targeted with specificity to promote optimal health on both an individual and a population basis. These concepts are presented in Fig. 1.

Fig. 1 Evolution of the Importance of Food to Health

Today, nutrition is defined as a biological science with physiological, genomic, medical, social, and environmental aspects (Beauman et al. 2005). Investigators have accumulated rigorous evidence about the importance of dietary intake, from preconception through old age, in shaping health trajectories across the lifespan and even into subsequent generations (Herman et al. 2014), through an interactive process involving genes, environments, and behaviors. As a field that crosses domains, nutrition epitomizes the need for an interdisciplinary approach to research and, certainly, to any intervention. Thus, when looking at the life course, nutrition must be seen through an integrative lens.

The purpose of this chapter is to use life course health development principles (as elaborated by Halfon and Forrest 2017) to organize a review of the current knowledge base regarding the crucial role nutrition plays in health development over the lifespan. There are time-specific health development pathways that result from particular nutritional exposures occurring during sensitive or critical periods, when developing biological systems are most alterable. Nutrition is an important determinant of health potential, such as reaching an individual’s genetically endowed height. The effects of nutrition on human biology in childhood and adulthood are also constrained by evolutionarily determined adaptive responses to the perinatal nutritional milieu to which the developing fetus and infant are exposed. These generally stated links between nutrition and health development will be explored more specifically in each of the chapter’s sections. We conclude with a discussion of the gaps in our knowledge that merit future research and attention from policy-makers.

1.1 Folate and Health Development

Our growing understanding of the importance of individual nutrients to optimal health development can be illustrated by the “story of folate,” a nutrient whose role exemplifies a mechanism increasingly recognized as basic to translating environmental cues into changes in gene expression that impact health development. Folate is the generic name for a group of related water-soluble B vitamin compounds that have been known to be essential since being isolated in 1941. Folate acts as a cofactor in several reactions leading to the synthesis of deoxyribonucleic acid (DNA) and ribonucleic acid (RNA) and, along with vitamin B12, the amino acid methionine, and other nutrients such as choline, serves as a “methyl donor,” providing one-carbon units for the methylation of a number of substances, including DNA, proteins, and neurotransmitters, thereby influencing many functions at the cellular level. The exact mechanisms are still being worked out.

As early as the 1960s, epidemiological studies began focusing on geographic patterns that had been observed in the incidence of neural tube defects, most commonly spina bifida, in newborns. The United Kingdom (specifically Wales, northern England, and Northern Ireland) was one of the sites of these studies. The initial approach to analyses of the geographic clustering focused on a genetic cause. However, it soon became apparent that environmental factors and the timing of birth were important as well. The incidence of neural tube defects was lowest among babies conceived from April to June (Laurence et al. 1968). Social factors also emerged as potentially causal when a social class gradient in the risk of developing a neural tube defect was observed.

Since the neural tube develops very early in gestation, the suggestion was that women who conceived in late fall and winter, when there was very little access to fresh fruits and vegetables, might have been missing essential vitamins prior to, and during, the first month of their pregnancies, whereas those conceiving in late spring and summer would not have that dietary limitation. Women from higher social class groups may have had better access to fruits and vegetables during these months. The first intervention studies, in which women who had given birth to one child with a neural tube defect were given vitamin supplements (or not) prior to, or early in, their next pregnancies, provided further evidence that this was indeed likely to be the case; those who received the supplements had fewer subsequent children with a neural tube defect (Smithells et al. 1976).

Folate had been suspected early on (Laurence et al. 1968) as the most likely vitamin, both because of the lack of green vegetables in winter and because a higher incidence of abnormal folate metabolism had been shown in mothers of babies with neural tube defects than in mothers of unaffected children (Hibbard et al. 1965). Several intervention studies followed, using periconceptional supplementation (Smithells et al. 1980, 1981). However, it took many years for the scientific community to declare the role of folate to be “fact,” because the earlier studies had not been conclusive (Laurence et al. 1981) or were complicated by the inclusion of other vitamins in the supplement provided (Smithells et al. 1980, 1981). Finally, in the mid-1990s, after a large, well-designed clinical trial provided strong evidence in support of a protective role for folate (MRC 1991), the US government mandated that, in the interest of the public’s health, white flour be fortified with folic acid (the synthetic, more bioavailable form of folate). This was a controversial decision because of the population-wide reach of the supplementation, which targeted women of reproductive age in order to reduce risk to the developing fetus. Nonetheless, the program has proven to be successful (Williams et al. 2015).

Our current understanding of the next chapter in the “story” awaited the unraveling of the mystery of the human genome and the field of genomics. Nutritional genomics is the study of the interactions between our genes and the bioactive components—both nutrient and non-nutrient—of foods we consume and the resulting health outcomes. The ultimate goal is to understand the interactions well enough to be able to influence these outcomes by structuring diets to meet the genomic needs of certain populations, or even individuals, in order to prevent disease and promote optimal health.

1.2 Nutritional Genetics

Nutritional genetics, also known as nutrigenetics, is the study of changes in the primary sequence of DNA that alter nutrient requirements or metabolism, together with the identification and description of human genetic variation that changes nutrient metabolism and food tolerances. The effects of certain mutations have been recognized for decades. The classic example is the “inborn errors of metabolism,” such as phenylketonuria (PKU), in which a mutation in a single gene alters the structure and function of an enzyme important in the metabolism of the essential amino acid phenylalanine, the toxic buildup of which results in severe intellectual disability. Now, infants are screened at birth, and those born with PKU, and many other metabolic disorders, can be treated by dietary means to promote normal development. This is a clear example of a single gene-disease relationship that can be managed with a nutritional intervention which, if begun early enough, can have a dramatic, lifelong influence on the health of the individual.

As predicted by the life course health development principles, most disease causation is not as simple as one gene-one disease. The vast majority of diseases result from continuous interactions of individuals, inclusive of their genotypes, with their environments over time. This multicausal and longitudinal paradigm requires new research tools for understanding etiological mechanisms.

1.3 Nutritional Epigenetics

More recently, geneticists recognized that the inheritance of traits is possible through mechanisms independent of genotype. Phenotypic changes in which the sequence of DNA is not changed but the expression of a gene is altered (turned on or off, so to speak) are called epigenetic changes, and the study of these phenomena is called epigenomics. Nutritional epigenetics is documenting how nutrition, one of the primary environmental factors in continuous interaction with individuals, may influence epigenetic changes, with resultant consequences for health development (Jimenez-Chillaron et al. 2012).

Nutritional epigenetics examines the effects of nutrients on chromatin, a complex of DNA and histones; the latter are considered the superstructure or packaging (tertiary structure) of DNA, which allows it to be contained in the nucleus of a cell by reducing its volume through chemical bonding. Regulation of the expression of proteins that function in metabolic or signaling pathways occurs through the opening and reforming of these bonds to allow transcription of DNA (through messenger RNA) and ultimately the formation of the protein coded in the DNA. Epigenetic regulation is mediated by methylation and demethylation that alter DNA, the histones that envelop DNA, and small RNA molecules. This type of epigenetic regulation depends on, or is influenced by, nutrients including folate/choline, betaine, vitamin B12, vitamin B6, iron, selenium, and methionine, as well as other bioactive components of food, such as resveratrol (red wine) and sulforaphane (broccoli) (Dolinoy et al. 2006). Long-chain polyunsaturated fatty acids (PUFA) have also been implicated in regulating gene expression in the brain, as reviewed by Schuchardt et al. (2008). Thus, a growing body of evidence points to an important link between food and gene regulation, with manifold effects on health.

Returning to our folate story, it is currently believed that the epigenetic disruption caused by a lack of, or insufficient, folate in the diets of pre-pregnant women and those in the first month of pregnancy can be overcome by sufficient, or additional, folate. Individual folate requirements vary, and it is further believed that women susceptible to neural tube defects have a greater than average need for folate (beyond two standard deviations of the population mean) or a metabolic impairment, perhaps due to epigenetic changes, that is mitigated by the additional amount of folate. Maternal folic acid supplementation has been associated with higher rates of methylation of the IGF2 gene and increased intrauterine growth but lower birth weights, demonstrating the potential to lower the risk of chronic disease in adulthood (Steegers-Theunissen et al. 2009).

In the space of half a century, we have progressed from the first epidemiological clues that the vitamin folate is linked to neural tube defects to an understanding, though not yet complete, of the probable mechanisms for that linkage, both as an essential cofactor for cell division and as a methyl donor involved in epigenetic regulation. Intriguingly, there are now suggestions that the paternal, as well as the maternal, folate status may influence pregnancy outcomes. A recent animal study found that the folate status of paternal rats plays an important role in regulating placental folate metabolism and transport (Kim et al. 2011). The contribution of fathers’ nutritional status, prior to conception, to the life course health development of their offspring is another demonstration of the manifold ways in which environmental exposures can be transduced in ways that affect health development. Moreover, understanding these paternal effects is an area of research that merits much more attention.

Similar stories can be told, and arguments made, for the necessity of each of the known essential nutrients to the optimal development and functioning of an individual across the life course and beyond. However, we have chosen to highlight only a few, selected to illustrate their crucial importance during sensitive periods of life course health development, cases where improvement in nutrient status would make a positive difference beyond the early years, and areas where current research activity, or the need for it, can advance our understanding of nutrition’s contribution across the life trajectory and into future generations.

2 The Importance of Nutrition to Life Course Health Development: Selected Examples

2.1 Pre-conception and Pregnancy

The availability of an adequate supply of nutrients may be the most important environmental factor influencing pregnancy health outcomes. Although the woman’s body adapts physiologically to meet the nutrient demands of both mother and fetus, these adaptations may not be sufficient if the mother is not well nourished. The placenta is recognized as a fundamental influence on life course health development (Barker and Thornburg 2013), and until rather recently it was commonly thought that the placenta would provide for the nutrient needs of the fetus, even over those of the mother. Although some nutrients are actively transported across the placental barrier, others, including folate, pass by simple or mediated diffusion and are therefore much more dependent on the maternal diet and the mother’s available stores. Thus, the fetus should not be considered the “perfect parasite” (McNanley and Woods 2008).

Two groups at particular risk in this regard are adolescents within 2 years of menarche and women with inter-pregnancy intervals of <18 months; both groups may have low nutrient reserves (King 2003). This underscores the importance, to optimal pregnancy outcomes and to the prevention of low birth weight and of defects such as neural tube defects (NTDs), of optimal nutritional status of women from the preconception period throughout pregnancy, as detailed in a recent publication by the International Federation of Gynecology and Obstetrics (FIGO) (Hanson et al. 2015). Current research interest is also turning to the role of the father’s nutritional status and diet in influencing pregnancy outcomes, reflected in paternal epigenetic contributions identified in animal models (Siklenka et al. 2015; Donkin et al. 2016) as well as in humans (Govic et al. 2016).

2.1.1 Birth Defects

In spite of the progress made in the prevention of neural tube defects worldwide, approximately 300,000 infants are still born annually with neural tube defects (Christianson et al. 2006). Although folic acid fortification of the food supply has successfully lowered the incidence in the United States, babies are still being born with neural tube defects. The latest Centers for Disease Control and Prevention data show that the estimated prevalence decreased from 10.7 per 10,000 live births in 1995–1996 (pre-fortification) to 6.5 per 10,000 in 2009–2011 (post-fortification) and that about 1326 fewer infants are born annually with a neural tube defect (Williams et al. 2015). However, Hispanic babies are affected disproportionately (4.17/10,000 live births) compared with either non-Hispanic Blacks (2.64/10,000) or Whites (3.22/10,000) (CDC 2015; Parker et al. 2010), indicating either that this population is not being reached (e.g., only wheat flour, not corn, is fortified) or that other factors are in play, such as genetic predispositions or other environmental exposures. In the case of Mexican-American women, there is some evidence that fumonisins, mycotoxins that contaminate corn, may be involved in the etiology (Missmer et al. 2006).

Incomplete uptake of folic acid, whether from fortification or supplements (or of folate from food), is suggested by recently published data (Tinker et al. 2015) from the 2007–2012 National Health and Nutrition Examination Survey (NHANES), which showed that 23% of women of childbearing age have red blood cell (RBC) folate concentrations below the recently established World Health Organization (WHO) recommended cutoff (i.e., 400 ng/mL or 906 nmol/L) for the prevention of neural tube defects (WHO 2015). On the other hand, another study reported that one in ten women took folic acid supplements exceeding the tolerable upper intake level for that vitamin (Hoyo et al. 2011); more research into the possible effects of too much folate is also needed.
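
The two cutoff units quoted above are mutually consistent if, as we assume here, the conversion is based on the molar mass of folic acid (C19H19N7O6, approximately 441.4 g/mol); the check is one line of arithmetic:

\[
400\ \text{ng/mL} = 400\ \mu\text{g/L} = \frac{400 \times 10^{-6}\ \text{g/L}}{441.4\ \text{g/mol}} \approx 0.906\ \mu\text{mol/L} \approx 906\ \text{nmol/L}.
\]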

Even allowing for incomplete protection against dietary folate deficiency under the fortification program, not all cases of neural tube defects are preventable by folic acid alone (Heseker et al. 2009), indicating that other factors are involved. Using data (n = 954 cases with neural tube defects; n = 6268 controls) from the National Birth Defects Prevention Study (1997–2005), investigators found that other micronutrients related to methylation or oxidative pathways, beyond folate and betaine, were associated with a decrease in anencephaly among some groups of women when present at higher levels in the maternal diet (thiamin, iron, and vitamin A). In others, higher intakes of thiamin; riboflavin; vitamins B6, C, and E; niacin; and retinol were associated with a decreased risk of spina bifida (Chandler et al. 2012). Vitamin A deficiency during pregnancy has also been associated with congenital diaphragmatic hernia (Beurskens et al. 2013).

Other birth defects have also been prevented by supplementation: cardiac anomalies, limb defects, and cleft lip and/or palate by provision of folate alone (Berry et al. 1999) and, according to a meta-analysis, urinary tract defects and congenital hydrocephalus by folate in combination with other vitamins (Goh et al. 2006).

Further evidence for the importance of other micronutrients has come from data collected by the Hungarian periconceptional service between 1984 and 1994 (Czeizel et al. 2004). In a randomized controlled trial (RCT) testing the efficacy of a multivitamin supplement containing folate (0.8 mg) in the prevention of neural tube defects, researchers were surprised to find not only that neural tube defects were reduced by about 90% but also that there was a very significant reduction (21 per 1000 population versus 41 per 1000 population) in the incidence of other major anomalies, notably cardiovascular anomalies and urinary tract defects (Czeizel 2009). Even when the neural tube defect cases were removed, the results remained statistically significant. The Hungarian group has published several articles examining folate with and without multivitamin supplementation and, based on their results, has strongly recommended that folate be recognized as preventing cardiovascular malformations as well as neural tube defects (Czeizel 2011) and that multivitamins be included in periconceptional supplements (Czeizel and Bánhidy 2011).

The US Public Health Service currently recommends that women capable of becoming pregnant consume 400 μg of folic acid daily. Based on correlational data from Irish and Chinese studies (Cordero et al. 2015), the WHO recently recommended maternal RBC folate levels >400 ng/mL in order to reduce the risk of neural tube defects. To determine optimal levels of supplementation for preventing birth defects, one approach is to quantify the daily dietary amount necessary to achieve a target level in whatever biological sample is typically used to measure the status of the nutrient; the next step in this case would be to examine the dietary intake necessary to produce those protective RBC folate levels.

In recognition of the importance of folate and zinc to the processes involved in cell division and DNA synthesis, and of the fact that alcohol interferes with the metabolism of each of these nutrients while providing calories devoid of any nutrients, it is not surprising that a recent publication (Young et al. 2014) has explored the nutritional etiology of fetal alcohol syndrome. Young and his colleagues review the current state of research on the effects of alcohol on folate and zinc metabolism, along with some of the other nutrients known to be essential to successful reproduction (vitamin A, docosahexaenoic acid (DHA), choline, vitamin E, and selenium) that have been studied in animal models in relation to fetal alcohol syndrome. The question these authors pose is an interesting one: What is the potential for preventing, or at least mitigating, the severity of the disorder by providing nutrient supplementation to pregnant women who drink?

These findings provide support for the importance of optimal preconception nutritional status to long-term health development. According to the life course health development principles, we would predict that further research will implicate yet other nutrients and their interactions with the intrauterine environment in the development of a variety of birth defects. Even for conditions such as fetal alcohol syndrome, no single risk factor (such as alcohol) is likely to fully explain the development and severity of birth defects.

The complexity of intrauterine interactions between nutrition and alcohol is further illustrated by a Danish National Birth Cohort study that found no effect of mild to moderate alcohol intake (one to eight drinks/week; one drink = 12 g pure alcohol) during early to mid-pregnancy on children’s intelligence, executive function, or attention at age 5 (Kesmodel et al. 2012). It is possible that adequate nutritional status was a protective factor in these women. Although diet was considered in the study as a possible “confounding factor” (Olsen et al. 2001), the focus was on fish (omega-3s), iron, and breastfeeding; the study did not evaluate how diet as a whole might have been protective, and the women sampled were largely middle class and so may be expected to have had adequate diets. Further research is needed into diet as a protective factor among women who are moderate drinkers, as well as into the implications of higher nutrient supplementation as an attempt to mitigate the damage to the fetuses of those whose alcohol intake is heavy, particularly to confirm positive results, to determine the optimal amounts and types of nutrients that should be provided, and to investigate the collective effects of multiple-nutrient supplementation.

2.1.2 Low Birth Weight

Low birth weight has long been recognized as a risk factor for mortality and neurodevelopmental problems (Hack et al. 1995). When an infant is small for gestational age, that is, born either before or at term and weighing less than expected for his or her fetal “age,” it suggests that the fetus was malnourished in utero. Full-term low birth weight has an incidence of 5% of live births in the United States and is considered a proxy for intrauterine growth retardation (CDC 2011). Among women who are underweight preconceptionally, the prevalence of low weight gain during pregnancy is 23%, of preterm birth 15%, and of low birth weight 10%.

In 2009, the Institute of Medicine (IOM) issued updated guidelines for weight gain during pregnancy based on World Health Organization body mass index categories (IOM 2009). (There were not enough data to recommend guidelines for adolescents, an area that still needs to be examined.) A recommended action was to continue services, where needed, into the postpartum period to prepare women for the next pregnancy, assuring good nutrition and adequate weight gain. A recent meta-analysis suggested that maternal overweight or obesity has a protective effect against delivering a low birth weight infant, in both developed and developing countries but especially the latter. However, there was also an increased risk of having an infant of very low birth weight (<1500 g) or extremely low birth weight (<1000 g); the heavier the woman, the higher the risk (McDonald et al. 2010). This seeming contradiction may be explained by a high risk of micronutrient malnutrition in obese women, as reported by Bodnar and Parrott (2012) and discussed further below.
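
For readers who want the IOM guidance in operational form, the following is a minimal sketch in Python, assuming the commonly cited total-gain ranges per prepregnancy WHO body mass index category; the function name and example values are ours, and the ranges should be verified against the report itself (IOM 2009).

```python
def recommended_gain_kg(weight_kg: float, height_m: float) -> tuple[float, float]:
    """Return the (low, high) recommended total pregnancy weight gain, in kg.

    Ranges are the commonly cited IOM (2009) totals by prepregnancy
    WHO BMI category; verify against the report before any real use.
    """
    bmi = weight_kg / height_m ** 2          # WHO body mass index
    if bmi < 18.5:                           # underweight
        return (12.5, 18.0)
    elif bmi < 25.0:                         # normal weight
        return (11.5, 16.0)
    elif bmi < 30.0:                         # overweight
        return (7.0, 11.5)
    else:                                    # obese
        return (5.0, 9.0)

# Hypothetical example: a woman weighing 52 kg at 1.68 m (BMI ~18.4, underweight).
low, high = recommended_gain_kg(52.0, 1.68)
print(f"Recommended total gain: {low}-{high} kg")
```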

Two recent reviews of the potential effects of maternal multiple-micronutrient supplementation, as opposed to supplementation with folate or iron alone, have clearly shown a significant benefit in decreasing the incidence of low birth weight and of babies born small for gestational age (Haider and Bhutta 2012; Zerfu and Ayele 2013). Both reports recommend amassing more evidence through future, larger trials before moving to a universal policy change, as well as further studies to determine the optimal dosage and mix of the micronutrients to promote the best pregnancy outcomes.

The scope of this chapter does not extend to examining the literature on the many individual nutrients, or other environmental and social factors, that have been associated with the incidence of low birth weight. However, given the current burgeoning interest in the effects of vitamin D beyond bone health (also related to undernutrition and deficiency in circulating concentrations of maternal vitamin D) (Javaid et al. 2005, 2006), it is worth mentioning that a recent study found an association between maternal vitamin D levels at 26 or fewer weeks of gestation and growth measures of newborns; serum vitamin D levels were positively related to birth weight and head circumference and negatively associated with the risk of an infant being born small for gestational age (Gernand et al. 2013).

2.1.3 Developmental Origins of Adult Disease

We now know that the embryo is particularly susceptible to “nutrient-induced adaptations in gene expression” (Waterland and Garza 1999). These adaptations, or developmental plasticity, have been defined as the ability of a single genotype to give rise to several different phenotypes, which, teleologically speaking, allows organisms to adapt to their surrounding environmental conditions more rapidly than is possible through evolutionary change (Duncan et al. 2014). Some have suggested that the term “plasticity” was used to describe epigenetic phenomena before we had an understanding of epigenetics (Jablonka 2012). Whatever these phenomena are called, and some have suggested the term “evo-devo” (Jablonka 2012), there can be negative implications for life course health development when the prenatal and postnatal environmental conditions are dissimilar.

Beginning with the epidemiological observations of David Barker and his colleagues in England linking low birth weight to earlier adult mortality (Barker et al. 1989, 1993) and the subsequent fetal origins of adult disease hypothesis (Barker 1995), the consensus now is that undernutrition during the early stages of pregnancy results in epigenetic adaptations that prepare the fetus for survival in an extrauterine life lacking sufficient nutrients to support optimal development. If the infant, born with a low birth weight, is then exposed to an environment of plenty, there is a mismatch between the metabolic adaptations that took place prenatally and the demands of the extrauterine environment, which can result in later chronic disease (Gluckman et al. 2008). For example, if the kidney develops fewer nephrons in response to prenatal deprivation, the well-nourished adult may later be predisposed to hypertension (Stein et al. 2006). Christian and Stewart (2010) have published a conceptual framework (Fig. 2) that illustrates the pathways by which various organ systems may be affected by maternal micronutrient deficiency.

Fig. 2 Conceptual framework for the metabolic effects of maternal micronutrient deficiency (Christian and Stewart 2010)

Barker’s early hypothesis regarding the negative, lifelong consequences of poor nutrition during pregnancy, in spite of a presumably normal postnatal diet, has been supported by retrospective observations of the life course health development of children prenatally exposed to starvation during the brief (6 months) but severe Dutch famine of the winter of 1944–1945, when the Nazis laid siege to western Holland. Not only were these children more likely to experience coronary heart disease, decreased renal function, and decreased glucose tolerance in adulthood (Roseboom et al. 2006), but their offspring were shorter and heavier, demonstrating the transgenerational effects of early nutritional insults on future health outcomes (Painter et al. 2008). This is possibly due to epigenetic changes occurring in fetuses exposed to the famine, changes that are then transmitted across generations (Tobi et al. 2009).

Similar findings of long-term health effects resulting from exposure to starvation during the developmental period have come from investigations following the Chinese famine of 1959–1961, during the “Great Leap Forward” (Huang et al. 2010). Over 35,000 young women (average age 32 years) exposed during this period were studied. In the 1958 and 1959 birth cohorts, postnatal exposure (at 1.5–3 years of life) was associated with reduced height and increased body mass index, along with a threefold increase in the odds of hypertension for the 1958 cohort. Body mass index also increased with the same postnatal exposure in the 1957 cohort but decreased in the 1960–1961 cohorts, which were exposed during pregnancy and infancy. The authors note the young age of the subjects and hypothesize that more significant effects might have emerged later in life. Interestingly, they also document negative effects on economic productivity as measured by labor and earnings outcomes.

In a study of adults exposed as infants to undernutrition before, during, and after the Biafran famine of 1967, both women and men experienced a higher prevalence of overweight, hypertension, and impaired glucose tolerance at 40 years of age (Hult et al. 2010). This reinforces the findings above, suggesting that fetal and infant undernutrition is closely associated with the development of chronic disease in adulthood and that nutritional challenges in early life can result in changes to the epigenetic regulation of genes that are detectable up to 60 years later (Lillycrop 2011).

2.2 Growth and Obesity

2.2.1 Growth

Although adequate, if not optimal, nutrition is recognized universally as important to growth and life course health development, nutritional evaluations are rarely included in population-based studies because of the difficulties (i.e., time intensity, expense) of collecting the dietary and/or biochemical data that would reflect the inadequacies in dietary intake, and/or the decreases in nutrient stores, that precede growth faltering as the body adapts to decreased nutrient intake. This continuum in nutritional status, from optimal health to overt disease, is depicted in Fig. 3, which also shows the typical measures used in assessing nutritional status and their levels of sensitivity. Dietary assessment is the most sensitive for predicting, and therefore preventing, nutritional problems, followed by biochemical assessment, a measure of nutrient stores. Because these assessments are rarely included, many studies of children’s health development use growth as a proxy for nutritional status. Although growth is a clinical/functional measure of nutritional status, it is not ideal, because growth faltering follows the usually chronic insufficient or poor nutrient intake that first depletes nutrient stores. Nevertheless, growth remains the most widely used and reported indicator of nutritional status and, if measurements are carefully taken, can be useful for monitoring growth rates in children and for assessing nutritional health in populations.

Fig. 3 Continuum of nutritional status and sensitivity of assessment methods to detect risk/signs of malnutrition

Reduced stature caused by a combination of poor diet and disease burden in childhood, called “net nutrition” by some (Steckel 1995; Silventoinen 2003), is modifiable. Data from the Organization for Economic Cooperation and Development (OECD) have shown an increase in stature, generally attributed to the improvement in net nutrition, for most of the 34 member countries (OECD 2009), although there do seem to be upper limits to growth potential, as evidenced by the apparent plateauing of the increases seen in the Dutch population (Schönbeck et al. 2012). Interestingly, the United States has not shown the same gains as the other countries (Komlos 2008), a finding not likely explained by the influx of immigrants of short stature (OECD 2009).

The Institute of Nutrition of Central America and Panama in Guatemala conducted a study from 1969 to 1977 in which women and children in entire villages were provided supplemental nutrition; findings showed significant improvements both in children’s stature and in pregnancy outcomes in intervention villages compared with control villages (Martorell 1992; Martorell et al. 1995). Follow-up studies (1988–2007) showed that the nutritional intervention for girls also increased the body size of their offspring, again demonstrating the intergenerational effects of nutritional status (Behrman et al. 2009).

The idea that there is a universal potential for growth, regardless of the environmentally sensitive phenotypic expression of that potential, led to a WHO-sponsored study (1997–2003) of the growth of infants and children from six countries on different continents who were healthy at birth and breastfed for at least 6 months (the Multicentre Growth Reference Study). The results strongly suggested that all human infants, regardless of racial or cultural background, have the genetic potential to achieve similar stature under optimal conditions (de Onis et al. 2004). These findings were considered robust enough for the WHO to adopt them as universal reference data for growth (WHO 2006); the international WHO Child Growth Standards, covering birth to 5 years of age, are now used in more than 140 countries worldwide (de Onis et al. 2015). In the United States, the Centers for Disease Control and Prevention has recommended their use for children aged 0–2 years, recognizing that the growth of breastfed infants should be the standard (Grummer-Strawn et al. 2010).
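
In practice, standards like these are distributed as age- and sex-specific LMS parameters (a Box-Cox power L, a median M, and a coefficient of variation S), and a child’s measurement is converted to a z-score for assessment. The following is a minimal sketch of that core calculation; the parameter and measurement values are placeholders, not actual WHO table entries.

```python
import math

def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    """Convert a measurement x to a z-score using the LMS method."""
    if L != 0:
        return ((x / M) ** L - 1.0) / (L * S)
    return math.log(x / M) / S  # limiting case as L -> 0

# Hypothetical example: a length-for-age check with placeholder LMS values.
z = lms_zscore(x=68.0, L=1.0, M=66.5, S=0.035)
print(round(z, 2))  # z-scores between -2 and +2 are conventionally unremarkable
```

Screening flags (e.g., stunting below a z-score of -2) are then simple threshold checks on these values.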

Data from the INTERGROWTH-21st Project (the International Fetal and Newborn Growth Consortium for the 21st Century), which assessed fetal growth and newborn size among healthy pregnant women in eight urban populations (in Brazil, China, India, Italy, Kenya, Oman, the United Kingdom, and the United States), have provided similar evidence of international convergence in growth potential. From 2009 to 2013, assessments of over 4600 women with problem-free pregnancies (out of 60,000) were done using ultrasound measurements at 5-week intervals from week 14 to delivery. Again, similar growth across ethnicities and countries suggested that most of the variation in the average size of babies born in different places around the world is due to nutritional and other socioeconomic and healthcare differences (Villar et al. 2014a). Thus, these data, representing optimal fetal growth, have been used to construct fetal growth charts intended for universal use, in conjunction with the WHO charts, as a clinical tool to assess maternal, fetal, infant, and early childhood health and nutritional status at both the individual and the population level (Chatfield et al. 2013; Papageorghiou et al. 2014). For the newborn standards, meticulous measurements of weight, length, and head circumference for more than 20,000 babies born between 33 and 42 weeks’ gestation during the same study were used to generate the charts, representing the first international standards for newborn growth under optimal conditions during pregnancy (Villar et al. 2014b).

2.2.2 Obesity

As the world goes through a “nutrition transition” from inadequate food to relative overabundance, adaptation to poor diets of a different sort, in an age of “overnutrition,” has no evolutionary precedent. Physiological constructs have been programmed since Paleolithic times, when we were hunters and gatherers, to conserve energy as fat in times of plenty to serve as an energy store for periods of relative food scarcity. Then, and up until the last century, most humans were very active physically and, because of their high energy needs, consumed enough food to meet micronutrient needs as well. Today, a sedentary lifestyle means that we must be selective about our food choices to meet micronutrient needs at a lower energy intake. In fact, obese people may be malnourished, even while exceeding their energy needs, if their diets are nutrient poor.

This double burden of malnutrition presents interesting challenges, with regard to which nutrients to supplement for optimal birth outcomes, when overweight or obese mothers coexist with underweight children in the same household. Until recently, the double burden was considered a phenomenon more common in low- and middle-income countries. However, data from the Danish National Birth Cohort studies reveal that obese women have a higher risk of micronutrient deficiencies (Bodnar and Parrott 2012). These data demonstrate that micronutrient deficiencies associated with pregnancy in overweight/obese women are becoming an issue in affluent as well as low- and middle-income countries and may further negatively impact birth outcomes, particularly the incidence of preeclampsia and low birth weight (Darnton-Hill and Mkparu 2015); they may also explain the apparent contradiction mentioned earlier, whereby obesity is at once a potential protective factor for birth weight and a risk factor (McDonald et al. 2010).

There are many causes underlying the current worldwide “obesity epidemic,” including many rooted in cultural, social, and political as well as biological factors. Genetics, epigenetics, and even causes related to the microbiome (discussed later in the chapter) are currently being studied intensively. However, the bottom line is that obesity is caused by an increase in dietary energy intake (much of it from nutrient-poor food) that is not compensated for by increased energy output.
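
To put rough numbers on that bottom line, consider the common, admittedly simplified, approximation that a pound of body fat stores about 3500 kcal. Under that assumption, a sustained surplus of only 100 kcal/day implies

\[
\frac{100\ \text{kcal/day} \times 365\ \text{days/year}}{3500\ \text{kcal/lb}} \approx 10\ \text{lb/year}.
\]

This static estimate overstates long-run gain, because energy expenditure adapts as body weight changes, but it illustrates how small, chronic imbalances accumulate.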

Paradoxically, the increasing concern about obesity and its role in the predisposition to metabolic disease has resulted in the growing interest in the developmental origins of health and disease, discussed earlier.

The Newborn Epigenetics Study, a federally funded research project based at Duke University, is studying how pre- and postnatal environmental exposures, including nutrition, affect the epigenome, with a special interest in obesity. A paper recently published by this group examined potential associations between DNA methylation patterns in newborns and parental preconception obesity and found that hypomethylation of the IGF2 gene in newborns is associated with paternal obesity (Soubry et al. 2013). The contribution of the father to pregnancy outcomes, mentioned earlier with regard to folate, is an area of research with potentially important implications for the public health community’s development of new and different population intervention strategies.

While an inverse relationship between total serum vitamin D and increased adiposity has been established in children, adolescents, and adults, the relationship between neonatal adiposity and vitamin D status has only recently been investigated. Researchers found that infants born to lean mothers had a third more vitamin D than infants born to obese mothers (Josefson et al. 2012). Obese women in this study transferred less vitamin D to their offspring than normal-weight women did, despite similar serum levels, possibly owing to sequestration of this fat-soluble vitamin in their adipose tissue. These findings underscore the evolving relationships among maternal obesity, vitamin D nutritional status, and adiposity in the neonatal period, which may influence subsequent childhood and adulthood vitamin D-dependent processes (Josefson et al. 2012).

2.3 Neurological Development

The brain continues to grow rapidly after birth for the first 2–3 years of life, a period that coincides with a high iron requirement; iron deficiency is the most prevalent nutritional problem worldwide, in part because, after 6 months of age, even breastfed infants need an additional source of iron (WHO 2011). Multiple studies dating back to the mid-twentieth century have investigated the effects of iron deficiency and iron deficiency anemia on the development of infants and young children, and they support the claim that there is a causal relationship between iron deficiency anemia and poor performance on measures of psychomotor and cognitive development, particularly when the anemia is severe (McCann and Ames 2007). In cases in which a child is iron deficient but not anemic, the findings are equivocal (Sachdev et al. 2005; Szajewska et al. 2010).

Other nutrient deficiencies in early childhood can also impair the ability to learn and affect school readiness, which in turn can alter lifelong achievement and increase inequalities in life course health development (WHO 2011). The importance of folate and iron to brain formation has already been mentioned. An animal study has shown that the offspring of pregnant rats fed a mildly zinc-deficient diet, compared with those of control and supplemented rats, had decreased learning and memory ability that was reflected in changes in the morphology of the hippocampus (Yu et al. 2013). Vitamin C has also been shown to be associated with fetal brain development. Unfortunately, the placental transport of vitamin C is not active, and it appears to be insufficient in the case of vitamin C deficiency (Norkus et al. 1979). In animal models, even marginal vitamin C deficiency in the mother stunted the development of the fetal hippocampus, the important memory center, permanently impeding optimal development of the brain even when guinea pig pups were given vitamin C after birth (Tveden-Nyborg et al. 2012). This study has implications for the 10–20% of all adults in the developed world with vitamin C deficiency, including the most vulnerable populations, who already suffer from health and socioeconomic disparities and who may also have poor dietary habits and perhaps smoke cigarettes, both of which increase the risk of vitamin C deficiency (Montez and Eschbach 2008).

Polyunsaturated fatty acids are essential nutrients for humans; omega-3 (synthesized from linolenic acid) and omega-6 (synthesized from linoleic acid) long-chain polyunsaturated fatty acids are involved in the development and maturation of neuronal structures, serve as structural and functional components of cell membranes, and are precursors of eicosanoids, which exert hormonal and immunological activity. The central role of the omega-3s DHA and eicosapentaenoic acid (EPA) in the development and functioning of the brain has attracted growing research interest with regard to neurological development in children, including visual acuity (Schuchardt et al. 2008). In early studies, after it was recognized that infant formulas lacked DHA and EPA, it was shown that retinal development (Birch et al. 1992) and visual evoked potentials (Faldella et al. 1996) in very low birth weight infants improved when the infants were supplemented with these omega-3 fatty acids. Effects of supplementation on visual acuity in term infants have also been shown (Birch et al. 2005), but not consistently. In addition, the effects of oral supplementation with omega-3 fatty acids during pregnancy on early childhood development, especially visual development, were inconclusive (Gould et al. 2013). In a meta-analysis of 12 studies (1949 infants up to 12 months of age), omega-3 supplementation of infant formulas appeared to have possible effects on visual acuity as measured by evoked potentials at 2 months of age and by behavioral methods at 12 months (Qawasmi et al. 2013). In another meta-analysis conducted by the same authors evaluating the effects of omega-3 supplementation of infant formula, no effects on general cognition were identified (Qawasmi et al. 2012).

Omega-6 fatty acids are also essential, but their metabolites are more inflammatory than those of the omega-3s, and some researchers believe that the ratio of omega-6 to omega-3 fatty acids consumed should fall within a range of 1:1–1:4 (Lands 2005). Others believe that a ratio as high as 4:1, which is thought to approximate the ratio obtained in the diets, rich in animal meat and seafood, of our earliest ancestors, is still healthy (Simopoulos 2002). Currently, typical Western diets provide dramatically higher ratios, between 10:1 and 30:1, which may be of concern (Hibbeln et al. 2006; Schuchardt et al. 2010). The significance of these findings relates to current dietary practices that increase the intake of omega-6 precursors (vegetable oils, processed and fast foods, meat) relative to those of omega-3 (fish, nuts, legumes).
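
Because these ratios are simple quotients of daily intakes, checking a diet against the ranges discussed above is straightforward arithmetic; a minimal sketch follows, with intake values that are hypothetical rather than drawn from the cited studies.

```python
def omega_ratio(omega6_g: float, omega3_g: float) -> float:
    """Ratio of omega-6 to omega-3 fatty acid intake (grams per day)."""
    return omega6_g / omega3_g

# Hypothetical daily intakes (grams), not data from any cited study.
western = omega_ratio(omega6_g=15.0, omega3_g=1.0)  # heavy in vegetable oils
target = omega_ratio(omega6_g=4.0, omega3_g=1.5)    # more fish, nuts, legumes

for label, ratio in (("Western", western), ("Target", target)):
    flag = "at or below 4:1" if ratio <= 4 else "above 4:1"
    print(f"{label}: {ratio:.1f}:1 ({flag})")
```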

A review of studies considering the significance of polyunsaturated fatty acids for the development of behaviors in older children found equivocal evidence in support of an association, although numerous observational studies have shown a link between omega-6 to omega-3 imbalances and some developmental disorders, including attention deficit hyperactivity disorder (ADHD) and autism (Schuchardt et al. 2010). In an RCT conducted by Vaisman et al. (2008) among 60 children, aged 8–13, with impaired visual sustained attention, an improvement in performance test scores was seen in those supplemented with either EPA or EPA plus DHA compared with those receiving the placebo. Frensham and coworkers (2012) suggest that among children and adolescents, the greatest benefits of omega-3s in the diet are seen in trials with durations of at least 3 months and in subgroups of children with low socioeconomic status, learning disabilities, and ADHD or ADHD-type symptoms, which might explain the reported discrepancies.

2.4 Immune Function and Oxidative Stress

With respect to the development of the immune system, animal models show that exposure to omega-3s during gestation and lactation results in a more permeable gut, allowing new substances to pass through the lining and into the bloodstream more easily (De Quelen et al. 2011). These new substances then trigger the fetal immune response and the production of antibodies, with the potential result of a more developed and mature immune system with better immune function, which could be important for the prevention of allergies as a child develops. Supporting this is a recent periconceptional multi-micronutrient supplementation intervention (vs. placebo) in malnourished Gambian women, which led to differential methylation of genes, some of them associated with immune function, in their offspring at birth and also at 9 months of age (Khulan et al. 2012).

Zinc has been shown to be required for the activation of at least 300 enzymes and for the gene expression of nearly 2000 transcription factors (Prasad 2012). It serves as an intracellular signaling molecule and plays an important role in cell-mediated immune function and oxidative stress (Prasad 2009). An estimated 17% of the global population is at risk of inadequate zinc intake, ranging from 8% in high-income regions to 30% in South Asia (Wessells and Brown 2012). Zinc supplementation has been shown to decrease oxidative stress markers and inflammatory cytokines (Prasad 2008a, b). This is an interesting example of the dependence, as described by Hambidge (2000), of many metabolic processes important to life course health development, including the immune system, on the presence of a trace mineral. Certainly, the role of zinc, and of other trace elements once thought to be insignificant, needs further investigation.

2.5 Healthy Aging

The aging process begins before birth, with epigenetic changes that affect gene regulation. In addition to fundamental changes in organ structure, it has been increasingly demonstrated that epigenetic mechanisms, which are susceptible to the presence or absence of certain nutrients during critical growth periods, establish long-lasting patterns of gene expression. Some of these have been discussed earlier in the chapter. The nutrient needs of individuals over 50 have only recently been explored; the oldest age category in the last edition of the Recommended Dietary Allowances (RDAs), published in 1989, was 50 and over (Food and Nutrition Board 1989). This was largely because the studies carried out to determine nutrient needs used young subjects and were focused on deficiencies, and extrapolating beyond the age of 50 was not deemed reasonable. With an aging population and increased interest in the relationship between nutrition and chronic disease, there were calls to determine RDAs for older adults (Russell 1997). In the newly defined series, the Dietary Reference Intakes, first published in 1997 and updated with regularity, the age categories were expanded to include 50–70 years and 70 and older (Food and Nutrition Board 1997), using data from the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University. The following is a brief overview of some of the nutrients that the latest research has shown to be potentially important for healthy aging.

The traditional function of vitamin D is understood to involve the release of 1,25 vitamin D into the circulation, after which its effects are targeted on end organs involved in calcium and phosphorus homeostasis, such as the kidney, intestines, parathyroid glands, and bone (Jones 2007). However, we now know that vitamin D plays a role in a vast array of other biological functions and that these functions actually represent the bulk of daily metabolic utilization of vitamin D (Jones 2007; Heaney 2008). The recognition of these new pathways has led to newly ascribed paracrine functions of vitamin D involving multiple organ systems, including the cardiovascular (Heaney 2008; Verstuyf et al. 2010), renal, and immune systems. These systems are believed to activate vitamin D locally, via vitamin D receptors, to regulate cell and tissue growth and differentiation (Falkenstein et al. 2000), as well as to serve as precursors of enzyme cofactors, all integral to the intact functioning of numerous metabolic processes (Heaney 2008; Rostand and Warnock 2008). The widespread presence of vitamin D receptors supports the extensive range of physiological functions of 1,25 vitamin D (Dusso and Brown 1998). As such, vitamin D has a number of important effects on both developmental and lifelong health trajectories.

As an example of the role vitamin D plays in the development of long-term health outcomes, substantial evidence has linked low circulating vitamin D levels to increased risk and incidence of cardiovascular disease (CVD) (Poole et al. 2006; Wang et al. 2008) and also suggests the possibility that vitamin D repletion may reverse or attenuate what remains the leading cause of mortality in the United States (Ford et al. 2011). In the cardiovascular system, the mechanisms responsible for vitamin D’s effects appear to be mediated through the interaction of activated vitamin D with the intracellular vitamin D receptors within vascular smooth muscle, endothelium, and heart muscle cells. These mechanisms modulate key processes involved in the pathogenesis of CVD, including vascular inflammation (Rigby et al. 1987), platelet aggregation (Aihara et al. 2004), vascular smooth muscle cell proliferation, vascular calcification, and more (Artaza et al. 2009, 2010).

Age-related effects of vitamin D include its protective effects against Alzheimer’s disease (Annweiler et al. 2012), improved cognitive health in older women (Annweiler et al. 2013), and improved mobility among older adults (Houston et al. 2013). Vitamin D also plays an important role in the development and maintenance of muscle mass, particularly in the institutionalized elderly, and is recommended for optimal musculoskeletal health (Mithal et al. 2012).

In addition to vitamin D, omega-3 fatty acid intake appears to influence the aging process. Adequate intake of omega-3 fatty acid supplements, taken to improve the balance of omega-3s to omega-6s, may slow a key biological process linked to aging. Among overweight middle-aged and older adults who took omega-3 supplements for 4 months, the ratio of fatty acids consumed was altered in a way that helped preserve white blood cell telomeres, which normally shorten during the aging process (Kiecolt-Glaser et al. 2013). The improved omega ratio also reduced oxidative stress, caused by free radicals in the blood, by about 15% compared with the placebo group. Other benefits of omega-3 intake for older adults include its putative role in the prevention of dementia and pre-dementia. In a meta-analysis including more than 2200 elderly subjects with cognitive deficits and matched controls, those with cognitive deficits had lower serum levels of EPA, DHA, and omega-3 fatty acids, while serum levels of EPA alone were significantly lower in those with pre-dementia. This indicates that EPA might be not only a disease-state marker but also an indicator of increased risk for cognitive impairment as individuals age (Lin et al. 2012). All of these studies demonstrate that omega-3 fatty acid intake across all phases and stages of the life course is important and has the potential to alter health development outcomes, although there is some controversy regarding whether the omega-6/omega-3 ratio itself is important (Willett 2007).

There is also some evidence that vitamin E, with its antioxidant properties, may protect against memory loss in older adults. In a prospective study carried out in Finland, a sample of 140 people over 65 years of age with no memory impairment at the onset of the study was followed for 8 years, during which time it was found that higher total serum levels of vitamin E (alpha-tocopherol), as well as of the other forms of the vitamin, seemed protective against memory disorders (Mangialasche et al. 2013). A recent study using a zebrafish model showed that a diet deficient in vitamin E, equivalent to a lifelong human deficiency, resulted in about 30% lower levels of DHA-phosphatidylcholine (DHA-PC), a component of the cellular membrane of every brain cell, or neuron, indicating that DHA-PC may be a good predictor of a higher risk for Alzheimer’s disease (Choi et al. 2015). A recent meta-analysis found that patients with Alzheimer’s disease, compared with cognitively intact elderly controls, had significantly lower plasma alpha-tocopherol concentrations (Lopes da Silva et al. 2013).

A recent large RCT involving over 500 patients with mild to moderate Alzheimer’s disease at 14 Veterans Affairs medical centers found that 2000 IU/day of alpha-tocopherol, compared with placebo, slowed functional decline (Dysken et al. 2014). Although supplements have shown benefit in slowing the progression of Alzheimer’s disease, they do not appear to prevent its occurrence (Traber 2014). Because 96% of adult women and 90% of men in the United States do not obtain adequate levels of vitamin E from their diets (Choi et al. 2015), additional studies are needed to explore further risk reduction related to the development of cognitive impairment.

Recent studies have provided further evidence of the preventive value of folate in reducing the risk of cardiovascular disease and stroke. A meta-analysis of eight randomized trials that assessed folic acid supplementation in the primary prevention of stroke showed beneficial effects, especially in trials lasting longer than 3 years (Wang et al. 2007). The China Stroke Primary Prevention Trial, a large (n = 20,702) randomized, double-blind clinical trial conducted in 32 communities over 5 years, also examined folic acid’s effects on cardiovascular disease (Huo et al. 2015). Participants were given enalapril (a drug used to treat hypertension) either alone or in combination with folic acid. The combined use of enalapril and folic acid, compared with enalapril alone, significantly reduced the risk of first stroke. These examples (folate and vitamin E) illustrate both how a lifetime of inadequate intake of a nutrient may contribute to the development of a serious disease and how improved nutrient intake can ameliorate chronic disease symptoms to some degree.

While adequate intake of specific nutrients has been shown to help preserve the structure and function of the body with aging, a number of studies in model organisms have also demonstrated the benefits of caloric restriction. A 20–40% reduction in calorie intake lowers levels of insulin-like growth factor I and other growth factors, a change that has been consistently associated with increased lifespan, and prevents the development of age-associated cardiovascular functional and structural changes (Fontana et al. 2012; Wei et al. 2008). In animal models, caloric restriction is associated with reduced cancer risk (Longo and Fontana 2010), likely through similar mechanisms, as well as through reductions in circulating levels of anabolic hormones, inflammatory cytokines, and oxidative stress markers (Hursting et al. 2003; Fontana and Klein 2007). Caloric restriction also reduces glucose uptake and lactate concentration while preserving vascular function; these effects, together with the increased presence of ketone bodies and improved cerebral blood flow, appear to be neuroprotective and to play an important role in preserving brain physiology in aging (Lin et al. 2015).
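
As a trivial worked example of the arithmetic behind such protocols (the baseline energy requirement below is a hypothetical placeholder, not a recommendation), a 20–40% restriction translates into daily calorie targets as follows:

# Sketch of the 20-40% caloric-restriction arithmetic described above.
# The 2400 kcal/day baseline is a hypothetical placeholder.

def restricted_intake(baseline_kcal: float, restriction: float) -> float:
    """Return a daily calorie target under a fractional caloric restriction."""
    if not 0.0 <= restriction < 1.0:
        raise ValueError("restriction must be a fraction in [0, 1)")
    return baseline_kcal * (1.0 - restriction)

for r in (0.20, 0.40):
    print(f"{int(r * 100)}% restriction -> {restricted_intake(2400.0, r):.0f} kcal/day")
# Prints: 20% restriction -> 1920 kcal/day; 40% restriction -> 1440 kcal/day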

3 The Importance of Food

3.1 Food Versus Nutritionism

We have highlighted the importance of individual nutrients to optimal health development. Nonetheless, we must emphasize that focusing on nutrients alone provides an incomplete picture. The tendency to isolate nutrients, to use them to fortify foods, and to manufacture supplements can lead to what Michael Pollan (2008) and others (Lang et al. 2009) have called “nutritionism,” an emphasis on individual nutrients rather than on food. The science of nutrition has progressed to the point where we have amassed a great deal of information about nutrients, their metabolic roles, their influence on gene regulation, and the ways that physical and social environments interact with nutritional intake. The complexity of these interactions, which are dynamic and continuous and operate on multiple levels, is typical of life course health development phenomena.

However, we cannot assume that we have no further discoveries to make regarding the thousands of bioactive compounds contained in food itself. Nutrition science is not yet able to “copy” nature; there remain elements, present in natural foods, that promote health development and that we are only beginning to understand. An example is the ongoing attempt to emulate breast milk in infant formulas as newly recognized components of breast milk are uncovered. In the 1970s, it was shown experimentally that adding more zinc to infant formula than is found in breast milk improved growth in male infants (Walravens and Hambidge 1976). Further investigation revealed that the zinc in breast milk is more biologically available than that in formula, owing to previously unrecognized factors that enhance its absorption (Sandstrom et al. 1983; Blakeborough et al. 1986). DHA, one of the omega-3 fatty acids, was not added to infant formula in the United States until the mid-1990s, although we now know that this compound, present in breast milk, is critical to optimal infant health development. Thus, simply replicating the nutrient composition of breast milk, or of any other naturally occurring food, cannot replicate all of its potentially bioactive compounds, as it is likely that all foods contain other, as yet unknown, factors that positively affect health development.

There has been an evolution in the definition of dietary nutrient sufficiency in the past two decades. The Recommended Dietary Allowances were largely based on empirical criteria that established minimum nutrient requirements by assuming that individual requirements were normally distributed and recommending amounts that would meet the needs of 97.5% of the population (two standard deviations above the mean requirement). The newer Dietary Reference Intakes (DRI), besides being expanded to include older age categories as mentioned above, are based on the idea that optimal nutrient intakes should be the standard. As part of establishing the DRI values, tolerable upper limits of intake are now also recommended, along with the Estimated Average Requirement (EAR), in recognition that the new approach could lead to intakes exceeding safe levels for some nutrients, particularly the fat-soluble vitamins and nutrients such as iron that are not readily excreted (see Fig. 4 and the formula sketched below).

Fig. 4
figure 4

Dietary Reference Intakes (Food and Nutrition Board 1997)
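
To make the statistical reasoning explicit, the classical RDA convention described above can be written as a simple formula. This is a sketch only; the 10% coefficient of variation shown is the default the Food and Nutrition Board applies when the true variance of requirements is unknown:

\[
\mathrm{RDA} = \mathrm{EAR} + 2\,\mathrm{SD}_{\mathrm{EAR}},
\qquad\text{and with } \mathrm{SD}_{\mathrm{EAR}} = 0.10 \times \mathrm{EAR},\qquad
\mathrm{RDA} = 1.2 \times \mathrm{EAR},
\]

an intake expected to meet the requirements of roughly 97.5% of healthy individuals in a given life-stage and sex group.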

Going back to the “folate story” for another example, one of the objections to universal fortification in the United States was that increased folate in the diet can mask the pernicious anemia caused by a lack of vitamin B12. Vitamin B12 deficiency affects the elderly, in whom dietary absorption of the vitamin becomes less efficient, and, if not recognized, can progress to severe neurological problems. Excess nutrient intake can also be an issue for young children because of the tendency of the food industry to “over-fortify” many of the foods intended for this age group. Most breakfast cereals, for example, carry high levels of added nutrients, as do “snack foods,” to make them more appealing to parents, who often provide “vitamin pill” supplements to their children as well. The influence of excessive levels of certain nutrients on health development is an important area for research.

3.2 Food and Other Bioactive Compounds

As we delve more deeply into the biochemistry and metabolism of the individual nutrients, we are simultaneously realizing that our nutritional health depends not only on essential vitamins and minerals, and on the optimal balance of the macronutrients that fuel our bodies, but also on the other bioactive compounds in our food, commonly referred to as phytochemicals or phytonutrients (Erdman et al. 2007; Beecher 1999). These include important subgroups such as carotenoids and phenolics, which are derived primarily from plant-based foods and are thought to convey important health benefits. While more than 600 carotenoids have been identified in nature, nutrition research has focused primarily on just a handful, alpha-carotene, beta-carotene, beta-cryptoxanthin, lutein, zeaxanthin, and lycopene, because of their prevalence in both food and the body (Holden et al. 1999; Linus Pauling Institute 2015). Polyphenols are a diverse group of phytochemicals that include many of the molecules that give fruits and vegetables their colors. More than 8000 polyphenols have been distinctly identified to date (D’Archivio et al. 2007; Pandey and Rizvi 2009), yet very little is known about the metabolic activity of the majority of them. Flavonoids, the most abundant polyphenols in the human diet, comprise many subclasses of compounds defined by chemical structure (Scalbert and Williamson 2000; Mulvihill and Huff 2010). Well-explored polyphenols include the anthocyanidins, found in red, blue, and purple fruits and vegetables (Erdman et al. 2007); quercetin (Manach et al. 2004); and ellagic acid, found primarily in berries (Beecher 2009; Clifford and Scalbert 2000). Resveratrol, a polyphenol belonging to the stilbenoids, is found in the skin of both white and red wine grapes; however, the longer maceration used in red winemaking releases resveratrol into red wine at levels roughly ten times those found in white wine.

Both carotenoids and polyphenols have potential health benefits. Carotenoids may reduce the risk for heart disease, particularly through intake of beta-carotene (Mente et al. 2009), while studies of lycopene show potential for the prevention of prostate cancer (Khan et al. 2010). Lutein and zeaxanthin may help reduce the oxidative injury that contributes to the development of age-related macular degeneration (Olson et al. 2011). Flavonoids have been studied for their anti-inflammatory characteristics (Garcia-Lafuente et al. 2009), while studies in animal models provide evidence that ellagic acid may reduce DNA damage (Aiyer et al. 2008). Stilbenoids such as resveratrol have been shown in animal models to prevent cancer, increase endurance, and lessen the consequences of obesity, including the loss of insulin sensitivity and increased mortality (Jang et al. 1997; Baur et al. 2006; Lagouge et al. 2006). However, it should be noted that the amounts needed to produce such effects far exceed what could be obtained from usual dietary intake (Walle et al. 2004; Wenzel and Somoza 2005; Vitaglione et al. 2005).

In addition, recent analyses of national food consumption surveys in the United States (Murphy et al. 2012), Korea (Lee et al. 2013), and globally (Murphy et al. 2014) indicate that, despite dietary recommendations underscoring the importance of increased fruit and vegetable consumption, intakes worldwide are lower than recommended. As a result, the diets of many individuals may lack the nutrients and phytonutrients typical of a diet rich in a variety of fruits and vegetables. Each of these studies demonstrates that individuals with the highest intakes of fruits and vegetables also have the highest intakes of phytonutrients, yet in each study those phytonutrients were derived from just a few types of fruits and vegetables. This means that many of the essential nutrients already mentioned (e.g., vitamins A, C, and E and folate) that play key roles in supporting optimal health are lacking in the diets of the majority of people. These data underscore one of the reasons why diets domestically and globally are suboptimal: without adequate access to fruits and vegetables, it is not possible to obtain essential nutrients, let alone phytonutrients.

Optimizing the intake of specific foods and/or their bioactive components is a reasonable and cost-effective strategy for disease prevention. However, defining “the” ideal food pattern is challenging for a number of reasons, including the difficulty of determining the quantity of a particular food or nutrient required to bring about a desired response, as well as the host of nutrient-nutrient and nutrient-gene interactions that can occur (Milner 2008; Mariman 2008; Ferguson 2009; Ahmed et al. 2009; Simopoulos 2010). Even though there is not yet sufficient information to formulate the “ideal diet,” there is sufficient knowledge to justify a call for future food-oriented health research (Milner 2008). In addition, it is becoming increasingly clear that individuals do not respond identically to the foods they consume. Therefore, as we come to better understand the critical roles that multiple food components play in regulating cellular events, and how these roles are influenced by genetic and epigenetic events, cultural and lifestyle differences, and our individual physical and social environments, our ability to develop a more individualized or personalized approach to diet and to optimize our nutritional health will grow (Kannan et al. 2008; da Costa et al. 2007; Kaput 2008). Because these compounds, which continue to grow in number, have yet to be fully understood, the importance of whole food itself, as opposed to simply the nutrients it provides, is again becoming paramount, as illustrated in Fig. 1.

3.3 Food and the Microbiome

An important area of emerging research focuses on the relationships between the microbiome and food and is helping to explain why individuals do not metabolize food in the same way. The human microbiome refers to the collective genetic material of the microbial communities found at several sites on and inside the human body, including the nasal passages, oral cavity, skin, urogenital tract, and gastrointestinal (GI) tract (Ursell et al. 2012). The human microbiota, in contrast, consists of the 10–100 trillion symbiotic microbial cells themselves, found primarily in the gut of every person (Turnbaugh et al. 2007). Evidence for a strong link between a person’s microbiota, digestion, and metabolism is increasing. For example, in animal models, dietary changes have led to significant alterations in bacterial metabolism, especially of short-chain fatty acids and amino acids, in as little as 1 week (Ley et al. 2008; Martin et al. 2010), and can lead to large changes after only 1 day (Turnbaugh et al. 2009). Perhaps most importantly, the genetic diversity within our gut microbiota allows us to digest compounds via metabolic pathways not explicitly coded for in the mammalian genome, greatly increasing our ability to extract energy from our diverse diets. These pathways are therefore important links to understanding individual differences in nutritional intake and in how nutrients are utilized in our bodies to affect our health or risk for disease.

In addition to supporting the basic functions of the human digestive system, the human microbiota have an important influence on the body’s physiological, nutritional, and immunological processes and are able to modulate the expression of host genes that regulate diverse and fundamental physiological functions. The ways in which the microbiome affects our health include its roles in energy harvest from the gastrointestinal system, vitamin production, development and maintenance of the gut itself, metabolism of drugs and xenobiotics, deconjugation and metabolism of bile acids, and modulation of the immune system (Cerf-Bensussan and Gaboriau-Routhiau 2010; Young 2012; Maynard et al. 2012). However, despite the essential functions provided by the gut microbiome, the composition of each individual’s microbiome is distinct (Costello et al. 2009). This individuality reflects a number of factors, including host genetics, age, diet, and health status, as well as exposures such as antibiotic use and mode of delivery at birth (Spor et al. 2011). To complicate things further, the temporal stability of one’s microbiome also appears to be personalized (Flores et al. 2014).

Nevertheless, a number of trends relating the composition of the microbiome to host health are beginning to emerge. In particular, changes in the dominant types of microorganisms living in the human gut have been associated with obesity in adults and model organisms (Backhed et al. 2004; Tremaroli and Backhed 2012; Ley et al. 2005, 2006; Turnbaugh et al. 2006). The relationship of the gut microbiome to obesity has become a focus of research. Although causality between the gut microbiome and obesity has yet to be established, findings from a number of studies suggest that the microbiome of obese individuals has an increased ability to extract energy from food. The gut microbiome has also been suggested to play a role in other inflammatory diseases (e.g., cardiovascular diseases, diabetes, and cancer) through the production of pro-inflammatory compounds that can cause chronic low-grade inflammation (Heilbronn and Campbell 2008; Tremaroli and Backhed 2012; van Olden et al. 2015; Hartstra et al. 2015).

Although several other factors play a role in the pathogenesis of obesity, the composition of the gut microbiome is now considered an important environmental factor and a potential therapeutic target for the treatment of obesity (Tremaroli and Backhed 2012; Hartstra et al. 2015). Moreover, recent research suggests that the metabolic products of carbohydrate digestion (e.g., short-chain fatty acids such as acetate, propionate, and butyrate) are not only important in the etiology of obesity but may also affect appetite regulation through signaling to the hypothalamic region of the brain (Corfe et al. 2015).

The microbiome affects not only our physical health but also our mental health across the life course. Patients with various mental health disorders appear to experience alterations in the stability, structure, and composition of the fecal microbiota (Jiang et al. 2015; Mayer et al. 2014a), which in turn affect the severity of their disease. Scientists speculate that alterations in the gut microbiome may play a pathophysiological role in human brain diseases, including autism spectrum disorder, anxiety, depression, and chronic pain, through bidirectional signaling between the brain and the gut microbiome involving multiple neurocrine and endocrine signaling mechanisms (Mayer et al. 2014b; Wang and Kasper 2014; Rosenfeld 2015). This research is beginning to tie together the effects of the microbiome on physical health with its effects on mental and emotional health.

4 Conclusions and Future Directions

In this chapter, we have mainly considered population-based issues. However, if diet can be used to alter phenotypic expression of our genes, then nutritionists, food scientists, and physicians may also be able to work together to design personalized diets to prevent disease and optimize health development outcomes.

To effect maximum preventive benefit, dietary changes should begin early in life, ideally with breastfeeding but, if that is not possible, with a formula more closely resembling breast milk than those available at present. Studies testing this hypothesis have already begun. The European Childhood Obesity Project, which enrolled 1000 infants from five countries, tested an infant formula lower in protein, to more closely match breast milk, against a typical higher-protein formula and a breastfed group; the growth pattern of infants fed the lower-protein formula more closely resembled both that of the breastfed infants and the trajectories of the WHO growth charts, which are referenced on breastfed infants (Koletzko et al. 2009). Given the importance of diet during this sensitive period of life course health development, and in response to growing demand, the Agricultural Act of 2014 (Farm Bill) officially called for the Dietary Guidelines for Americans to expand to include infants and toddlers (ages 0–2), as well as women who are pregnant, beginning with the 2020 edition (USDA CNPP 2015). The US Department of Agriculture (USDA), in collaboration with the Department of Health and Human Services (DHHS), is currently in the evidence-gathering phase (Jan 2015–Jan 2017) prior to developing a technical report to submit to the 2020 Dietary Guidelines Advisory Committee.

The fact that food is related to our health has always been obvious. However, as we have attempted to show, that relationship is very complex, and our understanding of its importance is growing with technological advancements, both in nutritional science and also in the numerous fields of study that have contributed to our knowledge base. That epigenetic mechanisms are so closely linked to the nutrients in our food (Jimenez-Chillaron et al. 2012) suggests that these mechanisms have allowed us to adjust rapidly to changes in our diet over the course of evolution, modifying the expression of our genes to adapt metabolically to a changing environment.

The effect of environmental stressors other than nutrition on heritable epigenetic changes has been recognized and, to date, explored perhaps more fully (Jablonka 2012), while the importance of nutrition has been underrated, in spite of the rich literature on the subject (Horton 2008; Hanson et al. 2015). Thus, we would like to echo Horton (2008), quoted at the beginning of the chapter: because of its key role in promoting optimal life course health development, especially for mothers, infants, and children, nutrition should no longer be “neglected.” It is time for the attention of the maternal and child health community to turn toward nutrition, which we would argue, along with FIGO (Hanson et al. 2015), is the most important environmental factor in the determination of life course health outcomes. As FIGO now recommends for gynecologists and obstetricians, we all need to “Think Nutrition First” (Hanson et al. 2015).

Areas for further study have been alluded to throughout this chapter. The long-term consequences of periconceptional nutrition for pregnancy outcomes and early development must be recognized; epigenetic changes during periods of rapid health development can have lifelong effects and intergenerational consequences. At the same time, we know that dietary changes at any point in the life course can ameliorate a potentially harmful nutritional status.

We have also emphasized that the focus of the “power” of nutrition for life course health development has moved from the centuries-old attention paid to food, to the elucidation, during the past century, of the roles of specific nutrients and other bioactive components in food, and now back to food itself as the source of as yet unknown, but potentially important, health-supporting elements (Fig. 1). The effects of food as translated through the microbiome are an emerging area of research likely to affect how we think about the composition and quantity of the food in our diet and its consequences for life course health development at all levels, from physical to mental and emotional health. It is therefore essential that access to enough food, and to foods of appropriate quality, be available to all populations to afford the opportunity for optimal health development.

The recognition of nutrition as a central environmental determinant that all investigators should include in their assessment of life course health development and promotion is now becoming more widespread. Therefore, we invite other disciplines and health professionals to join us in gaining a better understanding of the centrality of nutrition and in working together to optimize population and individual health.

Although a fuller treatment is beyond the space available in this chapter, it bears mention that the outcomes of research into the role of nutrition in life course health development are expected to be translated into priority-setting strategies for both practice and policy development. One current jumping-off point is the recent development of a healthcare system that promises to provide coverage for most Americans. This gives policy-makers, as well as healthcare professionals, an opportunity to shift from a focus on secondary prevention and treatment to a concentration on primary prevention. This should begin, we would argue, with optimizing diets, taking advantage of the current trend among consumers to recognize food as an avenue to health. Recent clinical interventions, such as physicians offering patients prescriptions to purchase healthy foods (Brody 2014), would be part of such a conversation, as would the American Academy of Pediatrics’ new recommendation (2015) that pediatricians screen families with children for evidence of food insecurity, given the impact of nutrition on health development. Other practice-oriented efforts, such as authorizing reimbursement for nutrition services in multiple settings, including those providing primary preventive care (e.g., interconception care) as well as secondary preventive care (e.g., diabetes), and placing more importance on educating healthcare providers in the basics of nutrition science, would go a long way toward improving service delivery and promoting health development across the life course.

With respect to the food industry, broadly speaking, it would be important for those professionals to work collaboratively with academics and governmental agencies to translate academic insights into innovative solutions and agreed-upon regulations for the benefit of the public’s health, with a greater focus on health development as well as food safety. Laws taxing foods containing excessive amounts of sugar and/or fat, as well as legislative collaboration on nutrition issues among governmental organizations such as the USDA, the National Institutes of Health (NIH), and the Maternal and Child Health Bureau/Health Resources and Services Administration (MCHB/HRSA), could support positive changes not only to the healthcare system as a whole but also to the greater public through crosscutting policy efforts.

5 Key Research Priorities

5.1 Basic Mechanisms

  • Identify the genes influenced by the nutrient environment, and translate these findings into improved health, with a focus on better understanding the contribution of paternal diet and lifestyle to epigenetic inheritance.

  • Understand the role of maternal nutrition in influencing children’s physiologic pathways, the mechanisms involved, and the long-term health consequences for children.

  • Establish the biological/biochemical role of key nutrients, such as folate, iron, and vitamin D, in the epigenomic process.

  • Elucidate the mechanisms by which breastfeeding reduces the risk of obesity, and understand the role of nutritional genetics and epigenetics in central and peripheral body weight regulatory mechanisms.

  • Identify (1) early biomarkers, more sensitive than growth, that predict later chronic disease and (2) the critical periods for the development of each organ system, and determine the possibility of reversing/attenuating epigenomic changes.

5.2 Clinical Research

  • Include data collection relating to nutritional status and diet in all longitudinal studies to better understand the effects of nutrition on life course health development.

  • Determine the optimal levels of fortification, supplement doses, and blood levels for women of childbearing age and how these data and concepts translate into population screening for prevention.

  • Explore the effects of subclinical nutrient deficiencies, as well as potential effects of supplements exceeding the tolerable upper limits for individual nutrients.

5.3 Population/Epidemiologic Research

  • Identify critical periods for nutrition prevention or intervention to prevent later chronic disease.

  • Examine the long-term effects of folic acid fortification.

  • Understand the genetic/epigenetic contribution to nutrient requirements of specific populations to inform public health policies.

  • Maximize the benefits of future research efforts through interdisciplinary birth cohort studies, which, by allowing outcomes to be observed across the life course and across generations, enable researchers to identify relationships that might otherwise be overlooked.

  • Examine the utility of focusing on foods, rather than on nutrients alone, for life course health development outcomes, including the effects of food on the microbiome and its relationship to physical, mental, and emotional health.

5.4 Data and Methods Development

  • Refine study design and methods to enhance interdisciplinary collaborations between basic scientists, clinicians, and social scientists to deliver coherent, evidence-based research plans.

  • Develop more efficient and feasible methods for recording and evaluating dietary intake to enhance the use of these methods in research studies.

  • Develop crosscutting comprehensive data sources that include nutrition indicators to allow for continual quality improvement and consistent performance measurements at the local, state, and national levels.