The grandmother hypothesis (GH) of Hawkes et al. (1998: Proc Natl Acad Sci USA 95:1336-1339) proposes that selection for lower adult mortality and greater longevity allowed for the evolution of prolonged growth in human beings. In contrast, other researchers propose that the evolution of the human childhood and adolescent stages of life history prolonged the growth period and allowed for greater biological resilience and longevity compared with apes. In this article, the GH model is reanalyzed using new values for some of its key variables. The original GH set the age of human feeding independence at 2.8 years (weaning) and used demographic data from living foragers to estimate average adult lifespan after first birth at 32.9 years. The reanalysis uses age 7.0 years (the end of the childhood stage) as the minimum age of human feeding independence and draws on data from healthier populations, rather than foragers, to derive an estimate of 48.9 years for average adult lifespan. Under these values, selection is found to have operated first to shorten the infancy stage (earlier weaning compared with apes), then to prolong the growth period, and finally to increase longevity. The reanalysis also provides a test of the reserve capacity hypothesis as part of a multilevel model of human life history evolution.