KIN 342: Nutrition and Aging

Heather Keller

Estimated study time: 1 hr 58 min

Sources and References

Primary textbooks — Touger-Decker, R., Mobley, C.C., & Epstein, J. (Eds.) (2021). Diet and Nutrition in Oral Health, 3rd Edition. Prentice Hall. Morley, J.E., & van Stavern, G.P. (2023). Geriatric Nutrition. CRC Press.

Supplementary texts — Bernstein, M., & Munoz, N. (2012). Nutrition for the Older Adult, 2nd Edition. Jones & Bartlett. Stanner, S., Thompson, R., & Buttriss, J. (Eds.) (2009). Healthy Ageing: The Role of Nutrition and Lifestyle. Wiley-Blackwell.

Online resources — Dietitians of Canada — Nutrition Guidelines for Older Adults (www.dietitians.ca); National Resource Center on Nutrition and Aging (nutritionandaging.org); Malnutrition Universal Screening Tool (MUST) and Mini Nutritional Assessment (MNA) from the British Association for Parenteral and Enteral Nutrition (www.bapen.org.uk); Health Canada Canadian Nutrient File (www.canada.ca); PubMed (pubmed.ncbi.nlm.nih.gov).


Chapter 1: Aging, Nutrition, and the Continuum of Care

How Nutrition Affects Aging

The relationship between nutrition and aging is bidirectional and dynamic. Poor nutritional status accelerates many processes of biological aging — oxidative stress, inflammation, immune senescence, sarcopenia, and cognitive decline — while aging itself produces profound physiological changes that alter how the body processes and responds to dietary inputs. Understanding this relationship is essential not only for optimizing the health span of individual older adults but also for informing the nutrition policies and care practices of a rapidly aging society.

Canada’s population is aging at an unprecedented rate. In 2023, individuals aged 65 years and older constituted approximately 18% of the Canadian population, a proportion projected to reach 25% by 2050. This demographic shift carries enormous implications for healthcare, long-term care, and social support systems. Nutrition-related conditions — malnutrition, sarcopenia, osteoporosis, dehydration, and the nutritional complications of chronic disease management — are among the most prevalent and costly health problems in this population. Yet nutrition is also one of the most modifiable determinants of health outcomes in aging, making nutritional assessment and intervention a high-priority area for kinesiology and health science professionals working with older adults.

Biological aging is characterized by several inter-related processes that collectively reduce physiological reserve and increase vulnerability to stressors. Inflammaging — the chronic low-grade sterile inflammation that develops with aging — is driven by the senescence-associated secretory phenotype (SASP) of aging cells, accumulation of damaged macromolecules, changes in the gut microbiome, and loss of immune regulatory capacity. Elevated circulating concentrations of pro-inflammatory mediators (the cytokines IL-6, TNF-alpha, and IL-1-beta, and the acute-phase protein C-reactive protein) in older adults are associated with accelerated muscle loss, insulin resistance, cognitive decline, depression, and increased mortality — making anti-inflammatory dietary strategies (particularly the Mediterranean dietary pattern) a target of growing interest in geriatric nutrition.

Oxidative stress — resulting from an imbalance between reactive oxygen species (ROS) production and antioxidant defense — accumulates with age and contributes to mitochondrial dysfunction, cellular senescence, and tissue damage in the brain, cardiovascular system, and skeletal muscle. Dietary antioxidants — vitamin C, vitamin E, carotenoids, polyphenols, selenium — may help buffer this age-related oxidative burden, though evidence from randomized trials of individual antioxidant supplements is largely disappointing, reinforcing the importance of whole-diet approaches.

The continuum of care for older adults — from independent living in the community through assisted living, to institutional long-term care (LTC) — presents distinct nutritional challenges at each stage. Community-dwelling older adults face barriers related to food access, preparation capacity, social isolation, and economic constraints. Those in assisted living have food provided but may have limited choice and reduced ability to advocate for their preferences. Those in LTC facilities are often at the greatest nutritional risk, with high rates of malnutrition (estimated 40–60% in some studies), multiple medication interactions with nutrients, frequent dysphagia, and progressive functional decline that impairs self-feeding. KIN 342 addresses nutrition across this entire continuum, with particular attention to practical assessment and intervention strategies at each care level.

Healthy Diet and Dietary Reference Intakes for Older Adults

The Dietary Reference Intakes (DRIs) for older adults reflect the physiological changes of aging that alter nutrient needs. Two age groups are defined within the older adult category: 51–70 years and 71+ years, reflecting the recognition that physiological aging is not uniform and that the oldest old have distinct needs from the young-old. In general, energy requirements decline with aging due to decreases in lean body mass (lowering basal metabolic rate) and reductions in physical activity. However, requirements for most micronutrients are maintained or increased, creating the challenge of achieving high nutrient density within a reduced caloric framework.

Vitamin D requirements increase with age: the RDA rises from 600 IU/day for adults under 70 to 800 IU/day for those 71 and older. However, many researchers and clinical guidelines suggest these values underestimate actual needs in the elderly, particularly those with limited sun exposure, dark skin pigmentation, or obesity. The reasons for increased requirements include reduced efficiency of cutaneous vitamin D synthesis (7-dehydrocholesterol concentrations in the skin decrease with age), reduced hepatic 25-hydroxylation capacity, reduced renal 1-alpha-hydroxylation capacity, and reduced expression of the vitamin D receptor in target tissues. Vitamin D deficiency (serum 25(OH)D below 50 nmol/L) is extremely common in older adults, with surveys finding rates of 25–60% in various populations. Clinically, vitamin D deficiency contributes to osteoporosis (through impaired calcium absorption), muscle weakness (VDR is expressed in skeletal muscle and mediates direct effects on muscle protein synthesis), falls, and immune dysfunction.

Protein requirements in older adults are the subject of considerable scientific debate. The current RDA of 0.80 g/kg/day was set on the basis of nitrogen balance studies, many of which included younger adults and used short-term assessment periods. A growing body of research indicates that older adults require higher protein intakes to maintain lean body mass and function, due to anabolic resistance — the blunted sensitivity of aging muscle protein synthesis machinery to anabolic stimuli (amino acids, insulin, physical activity). The PROT-AGE Study Group (2013) and the European Society for Clinical Nutrition and Metabolism (ESPEN) have recommended intakes of 1.0–1.2 g/kg/day for healthy older adults and 1.2–1.5 g/kg/day for those with acute or chronic illness. Achieving adequate protein intake in older adults who also have reduced appetites and energy requirements necessitates choosing high-protein, nutrient-dense foods at every eating occasion.
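To make the per-kilogram arithmetic concrete, here is a minimal sketch of these targets (the function name and structure are illustrative study aids, not part of the PROT-AGE or ESPEN publications):

```python
def protein_target_g_per_day(weight_kg: float, status: str = "healthy") -> tuple[float, float]:
    """Daily protein target range (grams), using the intakes cited above:
    1.0-1.2 g/kg/day for healthy older adults, 1.2-1.5 g/kg/day for
    those with acute or chronic illness."""
    ranges = {"healthy": (1.0, 1.2), "ill": (1.2, 1.5)}
    low, high = ranges[status]
    return (round(weight_kg * low, 1), round(weight_kg * high, 1))

# A 70 kg healthy older adult needs roughly 70-84 g/day,
# well above the 56 g implied by the 0.80 g/kg RDA.
print(protein_target_g_per_day(70))         # (70.0, 84.0)
print(protein_target_g_per_day(70, "ill"))  # (84.0, 105.0)
```

The same weight thus maps to substantially different gram targets depending on health status, which is why weight-based prescriptions should be revisited whenever an older adult's clinical condition changes.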


Chapter 2: Vitamin and Mineral Supplements in Aging

Rationale for Supplement Use in Older Adults

The use of dietary supplements in older adults is widespread: surveys suggest that 60–70% of older Canadians use at least one supplement regularly, with calcium, vitamin D, and multivitamin-mineral preparations being the most common. The rationale for supplementation in this population is compelling in some cases and weak in others, and the clinician’s role is to help older adults make evidence-based decisions about supplement use.

The case for vitamin D supplementation is among the strongest in geriatric nutrition. Given the combination of reduced cutaneous synthesis, frequent inadequate dietary intake (few foods are naturally rich in vitamin D, and fortification is more limited in Canada than in the United States), and the demonstrated association between low vitamin D status and adverse outcomes (fractures, falls, muscle weakness, immune dysfunction), vitamin D supplementation at 800–2000 IU/day is recommended by most Canadian and international geriatric guidelines for older adults who cannot meet needs through diet and sunlight alone. In the VITAL trial, vitamin D supplementation (2,000 IU/day) did not significantly reduce the primary endpoints of invasive cancer incidence or major cardiovascular events, but secondary analyses suggested a reduced risk of advanced (metastatic or fatal) cancer, a possible benefit beyond skeletal health that remains under investigation.

Calcium supplementation is more nuanced. While adequate calcium intake is essential for maintaining bone density and preventing osteoporosis, the question of whether supplemental calcium (beyond what is obtained from diet) reduces fracture risk has been controversial. A series of meta-analyses have found that calcium supplementation modestly reduces vertebral fracture risk but has minimal effect on hip fracture risk — the most clinically important osteoporotic fracture. Furthermore, some analyses (though not all) have found associations between high-dose calcium supplementation (particularly at doses above 1,000–1,200 mg/day of supplemental calcium) and increased risk of cardiovascular events, possibly through effects on vascular calcification. The current consensus is to prefer dietary calcium when possible and to use supplementation only when dietary intake is consistently below target, at doses of 500–600 mg elemental calcium per day (because intestinal absorption efficiency decreases at higher single doses). The chemical form of calcium matters: calcium carbonate requires gastric acid for absorption and should be taken with food, while calcium citrate can be absorbed without gastric acid and is preferred in individuals with achlorhydria or those taking proton pump inhibitors.

Vitamin B12 supplementation is strongly warranted in older adults with atrophic gastritis (which impairs release of protein-bound B12 from food matrices) or those taking proton pump inhibitors long-term (which reduce gastric acid and similarly impair B12 release). Crystalline B12 in supplements is absorbed by passive diffusion and does not require intrinsic factor or gastric acid, making it effective in those whose deficiency is due to gastric causes. Intramuscular B12 injections are the gold standard for treating severe deficiency or in those with pernicious anemia (where the autoimmune destruction of parietal cells eliminates both intrinsic factor and gastric acid). Subclinical B12 deficiency — associated with elevated methylmalonic acid and homocysteine, even with normal serum B12 — is common in older adults and associated with cognitive impairment, depression, and neuropathy; treatment with B12 supplementation can reverse these manifestations if initiated before irreversible neuronal damage occurs.

Vegan and Vegetarian Diets in Older Adults

Plant-based diets — ranging from lacto-ovo vegetarian (excluding meat and fish but including dairy and eggs) to vegan (excluding all animal products) — are increasingly adopted by older adults for health, ethical, and environmental reasons. While well-planned plant-based diets can provide adequate nutrition at any life stage, they present specific nutritional challenges that intensify in older adults due to the physiological changes described above.

The critical nutrients requiring attention in plant-based diets for older adults are as follows. Vitamin B12 is present only in animal foods; supplementation is non-negotiable for vegans. Vitamin D comes largely from animal sources and sun exposure; supplementation is typically needed. Calcium is well absorbed from some plant sources such as kale and broccoli, but the high oxalate content of others such as spinach limits absorption; fortified plant milks are valuable. Non-heme iron from plants is less bioavailable than heme iron; consuming it with vitamin C-rich foods enhances absorption. Zinc absorption is inhibited by the phytate in grains and legumes; varied plant-protein choices help. The omega-3 fatty acids EPA and DHA are found almost exclusively in marine foods; algae-derived DHA/EPA supplements are available for vegans and may be particularly important for brain health in older adults. Finally, plant proteins are generally less bioavailable and more limited in essential amino acids than animal proteins; higher total protein intakes and attention to leucine content are needed to achieve an equivalent anabolic stimulus. For older adults following vegan diets, individualized nutritional assessment and targeted supplementation are essential, and working with a registered dietitian is strongly recommended.


Chapter 3: Prevention, Nutritional Assessment, and Food Security

Nutrition Screening and Assessment in Older Adults

Malnutrition is a state of nutrition in which a deficiency or excess (or imbalance) of energy, protein, and other nutrients causes measurable adverse effects on tissue and body form, function, and clinical outcome. In the geriatric context, malnutrition most commonly refers to protein-energy malnutrition (undernutrition), though overnutrition (obesity) and specific micronutrient deficiencies are also important.

Malnutrition is dramatically underdiagnosed in clinical practice. Studies suggest that fewer than 30% of hospitalized malnourished older adults are identified and treated. The consequences of unrecognized malnutrition are severe: prolonged hospital stays, increased complication rates, impaired wound healing, immune dysfunction, reduced response to medications, increased mortality, and accelerated functional decline. Systematic nutrition screening — brief, validated tools applied to all patients or residents at the point of care — is essential for identifying individuals at risk who require further assessment and intervention.

The Mini Nutritional Assessment (MNA) is the most extensively validated nutrition screening and assessment tool for older adults, developed specifically for this population and available in two versions: the MNA Short Form (MNA-SF, 6 items, screening only) and the full MNA (18 items, screening + assessment). The MNA-SF screens for malnutrition risk using questions about recent food intake decline, weight loss, mobility, psychological stress/acute disease, neuropsychological problems, and BMI (or calf circumference if BMI cannot be obtained). A score of 12–14 indicates normal nutritional status, 8–11 indicates risk of malnutrition (requiring further assessment), and 0–7 indicates malnutrition. The full MNA adds questions about living situation, medication use, pressure ulcers, number of full meals eaten, fluid intake, mode of feeding, and subjective self-assessment of nutritional status. Sensitivity of the MNA-SF for identifying malnutrition (confirmed by the full MNA) is approximately 90%, making it an effective screening tool.
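The MNA-SF cut-points above reduce to a simple classification. The sketch below is a study aid only (the function name is mine); actual screening uses the validated form:

```python
def classify_mna_sf(score: int) -> str:
    """Map an MNA-SF total (0-14) to the categories described above:
    12-14 normal, 8-11 at risk, 0-7 malnourished."""
    if not 0 <= score <= 14:
        raise ValueError("MNA-SF totals range from 0 to 14")
    if score >= 12:
        return "normal nutritional status"
    if score >= 8:
        return "at risk of malnutrition"
    return "malnourished"

print(classify_mna_sf(13))  # normal nutritional status
print(classify_mna_sf(9))   # at risk of malnutrition
print(classify_mna_sf(5))   # malnourished
```

Note that "at risk" is an action category: a score of 8–11 triggers the full assessment rather than a diagnosis.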

The Malnutrition Universal Screening Tool (MUST) was designed for use in adults across care settings and uses three criteria: BMI (< 18.5 kg/m² scores 2 points; 18.5–20 scores 1 point; > 20 scores 0 points), unplanned weight loss over the past 3–6 months (> 10% scores 2 points; 5–10% scores 1 point; < 5% scores 0 points), and the presence of an acute illness expected to cause more than 5 days of reduced or no nutritional intake (scores 2 points). Total scores of 0 indicate low risk, 1 indicates medium risk, and 2 or above indicates high risk. A comprehensive nutritional assessment for those identified at risk includes anthropometric measures (height, weight, weight history, mid-arm circumference, calf circumference, skinfold thickness), biochemical measures (serum albumin, prealbumin, transferrin, C-reactive protein, complete blood count, micronutrient levels as indicated), clinical signs (muscle wasting, edema, poor wound healing, skin and hair changes), and dietary assessment (24-hour recall, food frequency questionnaire, or diet history).
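The three MUST criteria combine additively into the risk category. The scoring logic described above can be sketched as follows (names are illustrative; the official BAPEN tool should be used in practice):

```python
def must_score(bmi: float, pct_weight_loss: float,
               acute_illness_no_intake: bool) -> tuple[int, str]:
    """MUST total and risk category from the three criteria above.
    pct_weight_loss = unplanned loss over the past 3-6 months (%)."""
    score = 0
    if bmi < 18.5:
        score += 2
    elif bmi <= 20:              # 18.5-20 kg/m^2
        score += 1
    if pct_weight_loss > 10:
        score += 2
    elif pct_weight_loss >= 5:
        score += 1
    if acute_illness_no_intake:  # > 5 days of reduced/no intake expected
        score += 2
    category = "low" if score == 0 else ("medium" if score == 1 else "high")
    return score, category

print(must_score(19.0, 6.0, False))  # (2, 'high')
print(must_score(23.0, 2.0, False))  # (0, 'low')
```

Because any single criterion scoring 2 already places a patient in the high-risk category, the tool is deliberately sensitive: a borderline BMI plus modest weight loss is enough to trigger comprehensive assessment.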

Food Insecurity and Social Determinants of Nutritional Health

Food insecurity — the state of being without reliable access to sufficient quantities of affordable, nutritious food — is a major but often underrecognized problem among older adults. In Canada, approximately 8–12% of older adults living in the community experience some degree of food insecurity, with rates much higher among those with low incomes, those living alone, and recent immigrants. Food insecurity in older adults is associated with poor dietary quality, micronutrient deficiencies, poorer self-rated health, increased hospitalization, and accelerated functional decline.

The determinants of food insecurity in older adults extend beyond income and include physical access to food (inability to drive, limited public transit, living in food deserts), functional limitations (difficulty standing, carrying, opening packages), cognitive limitations (forgetting to eat, inability to plan meals or shop independently), reduced food literacy and cooking skills (particularly in men who were not primarily responsible for food preparation during their working years), social isolation (eating alone reduces appetite and food intake by approximately 15–20% compared to eating with others), dental and oral health problems (edentulism and poorly fitting dentures limit food choices and texture tolerance), and depression and bereavement (which suppress appetite through multiple mechanisms including reduced serotonin signaling and elevated cortisol).


Chapter 4: Sarcopenia, Frailty, and Falls

Sarcopenia: Definition, Pathophysiology, and Nutritional Management

Sarcopenia is a progressive and generalized skeletal muscle disorder characterized by accelerated loss of muscle mass and function (strength and/or physical performance) that is associated with adverse outcomes including falls, functional decline, frailty, and mortality. The EWGSOP2 (2018 update of the European Working Group on Sarcopenia in Older People) defines sarcopenia as low muscle strength (the primary criterion) plus low muscle quantity/quality, with low physical performance indicating severe sarcopenia.

The pathophysiology of sarcopenia is multifactorial, involving interacting age-related changes in protein metabolism, motor neuron loss, hormonal milieu, inflammation, and mitochondrial function. In the context of protein metabolism, the fundamental mechanism is a shift from protein anabolic to protein catabolic predominance. Aging reduces the sensitivity of muscle protein synthesis (MPS) to anabolic stimuli — particularly postprandial amino acid availability and insulin — a phenomenon termed anabolic resistance. This means that a given dose of dietary protein produces a smaller increment in MPS in older versus younger muscle, necessitating either higher protein doses or higher leucine content per serving to achieve a similar anabolic response. The threshold dose of leucine required to maximally stimulate MPS rises from approximately 2 g in young adults to approximately 3–4 g in older adults, meaning that protein sources with high leucine content (whey protein, dairy, eggs, meat) or supplemental leucine may be particularly valuable for older adults.

Motor neuron loss is an underappreciated contributor to sarcopenia. With aging, there is progressive loss of fast-twitch (type II) motor units: the motor neurons die and are not replaced, and the muscle fibers they innervated are either reinnervated by remaining slow-twitch motor neurons (causing fiber-type conversion from type II to type I) or undergo denervation atrophy. This denervation-reinnervation cycle is thought to be a major driver of age-related muscle fiber loss and the shift toward a slower, more fatigue-resistant but less powerful muscle phenotype in older adults. Nutritional interventions cannot fully compensate for motor neuron loss, which may explain why protein supplementation alone has limited effects on muscle mass and function without concurrent resistance exercise.

The nutritional management of sarcopenia combines protein optimization with micronutrient adequacy and overall energy sufficiency. Regarding protein, current evidence supports intakes of 1.2–1.5 g/kg/day in sarcopenic older adults, with protein distributed relatively evenly across meals (targeting 25–40 g per meal) and including leucine-rich sources. Emerging evidence supports the value of essential amino acid (EAA) supplementation, particularly mixtures enriched in leucine, in circumstances where high protein intake is difficult to achieve from whole foods. Omega-3 fatty acids — particularly EPA and DHA — may sensitize aging muscle to anabolic signals; meta-analyses suggest that omega-3 supplementation (approximately 2–4 g/day) in older adults modestly but significantly increases muscle protein synthesis rates and may augment the hypertrophic response to resistance exercise. The mechanism involves omega-3-mediated changes in membrane phospholipid composition that enhance insulin receptor and mTORC1 signaling. Vitamin D is required for normal muscle function (VDR signaling promotes muscle protein synthesis and suppresses atrophy pathways); correction of vitamin D deficiency in older adults with low vitamin D status has been shown to improve muscle strength and reduce fall rates.
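Combining the daily intake range with the per-meal distribution above gives a quick feasibility check (an illustrative sketch assuming an even split across meals; names are mine):

```python
def per_meal_protein(weight_kg: float, meals_per_day: int = 3,
                     g_per_kg: float = 1.2) -> float:
    """Even per-meal protein dose (g) for a daily intake in the
    1.2-1.5 g/kg/day range recommended above."""
    return round(weight_kg * g_per_kg / meals_per_day, 1)

dose = per_meal_protein(75)    # 75 kg adult, 1.2 g/kg, 3 meals
print(dose, 25 <= dose <= 40)  # 30.0 True -> within the 25-40 g/meal window
```

For lighter individuals or those eating only two substantial meals a day, the same check quickly reveals when the per-meal dose falls below the anabolic threshold, which is the usual argument for leucine-rich sources or EAA supplementation.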

Frailty and Falls

Frailty is a clinical state of increased vulnerability to stressors due to diminished physiological reserve across multiple organ systems. The Fried Frailty Phenotype defines frailty as the presence of three or more of five criteria: unintentional weight loss (≥ 4.5 kg in the past year), self-reported exhaustion, weak grip strength (below sex- and BMI-adjusted thresholds), slow gait speed (below 20th percentile for sex and height), and low physical activity level. One or two criteria indicate pre-frailty; three or more indicate frailty.
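Counting the five Fried criteria yields the classification directly (a sketch; the dictionary keys are my own shorthand for the criteria above):

```python
def fried_frailty(criteria: dict) -> str:
    """Classify by number of Fried phenotype criteria met:
    0 = robust, 1-2 = pre-frail, >= 3 = frail."""
    expected = {"weight_loss", "exhaustion", "weak_grip",
                "slow_gait", "low_activity"}
    if set(criteria) != expected:
        raise ValueError(f"expected exactly these criteria: {sorted(expected)}")
    n = sum(bool(v) for v in criteria.values())
    if n >= 3:
        return "frail"
    return "pre-frail" if n >= 1 else "robust"

print(fried_frailty({"weight_loss": True, "exhaustion": True,
                     "weak_grip": False, "slow_gait": False,
                     "low_activity": False}))  # pre-frail
```

The pre-frail category matters clinically because it identifies the window in which nutritional and exercise interventions are most likely to prevent progression to frailty.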

Nutritional status is intimately connected to frailty status. Weight loss — driven by reduced caloric intake (due to anorexia of aging), increased energy expenditure from chronic inflammation and disease, or both — is both a diagnostic criterion and a major driver of frailty progression. Malnutrition and sarcopenia are considered overlapping but distinct components of frailty: sarcopenia involves loss of muscle specifically, while malnutrition is a broader state of nutrient deficit that accelerates all components of the frailty phenotype. The malnutrition-sarcopenia syndrome (sometimes called cachexia when driven by inflammation from chronic disease) represents the convergence of these processes and is associated with the worst clinical outcomes.

Falls are the leading cause of injury-related death and disability among older adults, with approximately one-third of community-dwelling adults over 65 falling at least once per year. Nutrition contributes to fall risk through multiple pathways: vitamin D deficiency impairs muscle function and neuromuscular control; protein-energy malnutrition reduces muscle mass and strength; dehydration impairs cognitive function and postural stability; and hypoglycemia in diabetic patients receiving insulin or sulfonylureas can cause sudden neurological dysfunction precipitating falls. Nutritional interventions targeting fall prevention should therefore include correction of vitamin D deficiency (targeting serum 25(OH)D above 75 nmol/L), adequate protein intake to support muscle mass maintenance, and careful attention to hydration and glycemic management.


Chapter 5: Obesity, Cardiovascular Disease, and Diabetes in Older Adults

Obesity in Older Adults: A Paradox

Obesity in older adults presents a clinical paradox. While the association between obesity and adverse health outcomes (cardiovascular disease, type 2 diabetes, certain cancers) is well-established across the lifespan, the relationship between BMI and mortality in older adults is modified by age. The obesity paradox — the observation that moderate overweight (BMI 25–30) and even mild obesity (BMI 30–35) are associated with reduced mortality in older adults compared to normal weight in some studies — has generated considerable debate. Several explanations have been proposed: heavier individuals may have greater nutritional reserves that buffer against the weight loss associated with acute illness (“survival of the fattest”); BMI may misclassify individuals with high muscle mass as overweight; and selection bias may eliminate the leanest and sickest from the oldest age groups.

The clinical implications of the obesity paradox are contentious. Most geriatric specialists advocate for a nuanced approach: intentional weight reduction through caloric restriction is generally appropriate for obese older adults with obesity-related comorbidities (particularly type 2 diabetes, knee osteoarthritis, and sleep apnea) who are functionally limited and can reasonably tolerate a weight loss program. However, caloric restriction in older adults invariably produces loss of both fat and lean mass (with the proportion of lean mass loss being greater than in younger adults), potentially accelerating sarcopenia. Any weight loss program in older adults must therefore be combined with resistance exercise and adequate protein intake to minimize muscle loss. In frail older adults, intentional weight loss is generally contraindicated.

Cardiovascular Disease and the Mediterranean Diet in Aging

Cardiovascular disease (CVD) — encompassing coronary artery disease, stroke, heart failure, and peripheral arterial disease — remains the leading cause of death in older Canadians, accounting for approximately 28% of all deaths. The relationship between diet and CVD risk in older adults follows the same principles established for younger populations, but several age-specific considerations apply. The Mediterranean dietary pattern — characterized by high consumption of olive oil, vegetables, fruits, legumes, nuts, whole grains, and fish; moderate consumption of dairy and poultry; and low consumption of red and processed meat and sweets, often accompanied by moderate wine consumption — is the most extensively studied dietary pattern for CVD prevention in older adults.

The PREDIMED trial (Prevención con Dieta Mediterránea) — a large Spanish randomized trial of 7,447 individuals at high cardiovascular risk, including many older adults — demonstrated that a Mediterranean diet supplemented with extra-virgin olive oil or nuts reduced the composite endpoint of cardiovascular events by approximately 30% compared to a low-fat control diet. Subsequent analyses of this trial revealed particularly strong protective associations for stroke and atrial fibrillation. The PREDIMED-Plus trial, a follow-up study incorporating caloric restriction and physical activity alongside the Mediterranean diet, demonstrated additional benefits for weight loss and metabolic syndrome. The Mediterranean diet’s benefits for CVD may be mediated by its anti-inflammatory properties (reduced CRP, IL-6), favorable effects on lipoprotein profiles (higher HDL, reduced LDL oxidation), improved endothelial function (through polyphenol-mediated nitric oxide production), and positive effects on gut microbiome composition.

Type 2 Diabetes Management in Older Adults

Type 2 diabetes affects approximately 25–30% of Canadians over 65, making it the most prevalent metabolic disease in this age group. The management of diabetes in older adults requires a framework that differs from younger populations in several important ways. Glycemic targets should be individualized based on functional status, cognitive function, life expectancy, and frailty: tight glycemic control (HbA1c below 7.0%) may be appropriate for functionally intact older adults with long life expectancy, but the risks of hypoglycemia from this approach — which include falls, fractures, cardiac arrhythmias, and cognitive impairment — may outweigh the benefits in frail individuals, for whom Diabetes Canada (formerly the Canadian Diabetes Association) recommends less intensive targets (HbA1c 7.5–8.5%).
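As a deliberately simplified illustration of this individualization (real guidelines weigh many more factors than a single functional-status flag; this two-way mapping is only a study aid):

```python
def hba1c_target(functionally_intact: bool) -> str:
    """Illustrative target band from the individualization principle above."""
    return "HbA1c < 7.0%" if functionally_intact else "HbA1c 7.5-8.5%"

print(hba1c_target(True))   # HbA1c < 7.0%
print(hba1c_target(False))  # HbA1c 7.5-8.5%
```

In practice the decision also incorporates hypoglycemia history, medication class, cognition, and life expectancy, so a real decision aid would take several inputs rather than one.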

The nutritional management of type 2 diabetes in older adults centers on achieving glycemic control while maintaining adequate nutrient intake and avoiding malnutrition. The primary dietary goals are: achieving and maintaining a healthy body weight (weight loss of 5–10% significantly improves insulin sensitivity in overweight and obese individuals); reducing consumption of refined carbohydrates and added sugars; increasing dietary fiber from vegetables, legumes, and whole grains; distributing carbohydrate intake evenly across meals to attenuate postprandial glucose excursions; and ensuring adequate protein intake to prevent sarcopenia. Very low-calorie diets (< 800 kcal/day) and low-carbohydrate ketogenic diets can produce dramatic short-term improvements in glycemic control — sometimes resulting in remission — in younger obese adults, but their safety and tolerability in older adults, particularly those with frailty or existing malnutrition, require careful individualized assessment.


Chapter 6: Gastrointestinal Conditions and Nutritional Management

The gastrointestinal tract undergoes multiple age-related changes that affect nutrient digestion, absorption, and tolerance. Reduced gastric acid production (hypochlorhydria or achlorhydria), occurring in 10–20% of older adults due to Helicobacter pylori-induced gastritis and autoimmune gastric atrophy, impairs the acid-mediated release of protein-bound vitamin B12, non-heme iron, and calcium from foods. It also weakens the gastric acid barrier against ingested pathogens, increasing the risk of small intestinal bacterial overgrowth (SIBO), which can further impair B12 status because the overgrown bacteria compete for luminal B12 before it reaches its absorption site in the distal ileum.

Gastric motility slows with aging, reflecting degenerative changes in the enteric nervous system and reduced responsiveness to motility hormones. Delayed gastric emptying exaggerates postprandial fullness (early satiety), reduces appetite, and can contribute to anorexia of aging — the multifactorial decline in appetite and food intake that occurs in many older adults and is a major driver of weight loss and malnutrition. Anorexia of aging results from increased sensitivity to the satiety hormone cholecystokinin (CCK), reduced opioid-mediated appetite stimulation, decreased nitric oxide modulation of gastric accommodation, elevated leptin levels relative to body fat, and psychological factors including depression and social isolation.

Constipation is the most common gastrointestinal complaint in older adults, affecting 15–30% of community-dwelling elderly and up to 50% of nursing home residents. Its causes in the elderly are multifactorial: reduced dietary fiber intake, inadequate fluid intake, physical inactivity, polypharmacy (many medications including opioids, calcium channel blockers, anticholinergics, and iron supplements cause constipation), neurological disease affecting enteric nervous system function, and in some cases metabolic causes (hypothyroidism, hypercalcemia). Management focuses on adequate fiber (25–38 g/day from a variety of sources), fluid (1.5–2 liters/day), and physical activity, with osmotic laxatives (polyethylene glycol, lactulose) as first-line pharmacological treatment when lifestyle modifications are insufficient.

Select GI Conditions Affecting Nutrition

Gastroesophageal reflux disease (GERD) is prevalent in older adults, partly because lower esophageal sphincter tone decreases with aging and partly because hiatal hernia becomes more common. Nutritional management involves avoiding trigger foods (high-fat meals, chocolate, coffee, alcohol, citrus, tomato-based foods), eating smaller, more frequent meals, avoiding lying down within two to three hours after eating, and maintaining a healthy body weight (as obesity increases intra-abdominal pressure that promotes reflux). Proton pump inhibitors (PPIs), the standard pharmacological treatment, impair absorption of vitamin B12, magnesium, calcium (particularly from carbonate salts), and iron, and their long-term use in older adults requires monitoring of these nutrient levels.

Inflammatory bowel disease (IBD) — comprising Crohn’s disease and ulcerative colitis — can present de novo in older adults (late-onset IBD) or persist as a chronic condition managed over decades. Nutritional consequences of IBD in older adults include protein-energy malnutrition (from increased nutrient losses, reduced intake due to pain and dietary restrictions, and increased metabolic demands from inflammation), iron deficiency anemia (from intestinal blood loss and impaired absorption), vitamin B12 deficiency (in Crohn’s disease affecting the ileum), vitamin D deficiency (from fat malabsorption and avoidance of dairy), zinc deficiency, and calcium deficiency. Exclusive enteral nutrition — providing all nutrition as a liquid formula delivered by nasogastric tube — can induce remission in Crohn’s disease and allows the bowel to rest while ensuring adequate nutrient intake, serving as both a nutritional and therapeutic intervention.

Colorectal cancer is the third most common cancer in Canada. Dietary factors associated with reduced colorectal cancer risk include high dietary fiber intake, high consumption of fruits and vegetables, adequate calcium and vitamin D, and limited red and processed meat consumption. After a diagnosis of colorectal cancer and surgery (colostomy or ileostomy), significant nutritional adjustments are required. An ileostomy bypasses the colon and results in high-volume liquid output that can cause dehydration and electrolyte imbalances (particularly sodium and potassium losses); a low-fiber diet and adequate fluid and sodium intake are required in the early postoperative period. A colostomy preserves more colonic function but may still require dietary modifications to manage stool consistency and reduce flatus.


Chapter 7: Dementia, Dysphagia, and Nutritional Management

Dementia: Nutritional Considerations

Dementia — the umbrella term for progressive cognitive impairment severe enough to interfere with daily life — affects approximately 15% of Canadians over 80 and is one of the most challenging conditions for nutritional management in older adults. The major forms — Alzheimer’s disease, vascular dementia, Lewy body dementia, and frontotemporal dementia — all lead to progressive impairment in the cognitive and behavioral capacities required for independent nutrition: meal planning, food shopping, food preparation, recognition of hunger and thirst, and self-feeding.

Individuals with dementia are at high nutritional risk through multiple pathways. Reduced food intake is nearly universal: wandering and hyperactivity increase energy expenditure early in the disease; later stages bring impaired recognition of food, difficulty with utensils, impaired swallowing (dysphagia), and loss of appetite. Weight loss is the most commonly reported nutritional complication of dementia and is independently associated with accelerated cognitive decline, increased mortality, and reduced quality of life. Nutritional interventions for persons with dementia include providing high-energy, high-protein foods in preferred flavors and textures; removing distractions during meals; allowing extra time for eating; using adaptive utensils; providing finger foods for those who can no longer use cutlery effectively; and ensuring adequate social support during mealtimes (eating with others has been shown to increase intake by 15–20% even in those with significant cognitive impairment).

The Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diet — a hybrid of the Mediterranean diet and the DASH (Dietary Approaches to Stop Hypertension) diet, specifically optimized for brain health — has been associated in prospective cohort studies with slower cognitive decline and reduced risk of Alzheimer’s disease. The MIND diet emphasizes green leafy vegetables (at least 6 servings per week), other vegetables (at least 1 serving per day), berries (at least 2 servings per week), nuts (at least 5 servings per week), olive oil as the primary cooking fat, whole grains (at least 3 servings per day), fish (at least 1 serving per week), beans (at least 4 meals per week), and poultry (at least 2 servings per week), while limiting red meat, butter and margarine, cheese, pastries and sweets, and fast or fried food. The MIND diet’s neuroprotective mechanisms likely involve reducing oxidative stress and neuroinflammation, improving cerebrovascular function, and promoting a microbiome composition that reduces brain inflammation through the gut-brain axis.
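As a study aid, the serving targets above can be expressed as a simple scoring check. This is a minimal sketch covering only the brain-healthy components listed in the text (the published MIND score also scores the limited foods, omitted here); the dictionary keys and function name are our own.

```python
# Minimal sketch: count how many of the brain-healthy MIND serving
# targets listed above a week of intake meets. (The published MIND score
# also penalizes the limited foods; omitted here.)
MIND_WEEKLY_TARGETS = {          # servings per week, from the text
    "green_leafy_vegetables": 6,
    "other_vegetables": 7,       # at least 1 serving/day
    "berries": 2,
    "nuts": 5,
    "whole_grains": 21,          # at least 3 servings/day
    "fish": 1,
    "beans": 4,                  # at least 4 meals/week
    "poultry": 2,
}

def mind_components_met(weekly_servings: dict) -> int:
    """Return the number of brain-healthy targets met this week."""
    return sum(
        1 for food, target in MIND_WEEKLY_TARGETS.items()
        if weekly_servings.get(food, 0) >= target
    )

week = {"green_leafy_vegetables": 7, "berries": 3, "fish": 2, "nuts": 5}
print(mind_components_met(week))  # meets 4 of 8 targets
```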

Dysphagia: Assessment and Nutritional Management

Dysphagia is the medical term for difficulty swallowing, encompassing problems with any phase of the swallowing process: oral preparation (chewing and manipulating food in the mouth), oral transit (moving the bolus toward the pharynx), pharyngeal phase (the complex neuromuscular sequence that propels the bolus through the pharynx and into the esophagus while protecting the airway), or esophageal phase (peristaltic transit to the stomach). Dysphagia is distinguished from odynophagia (painful swallowing) and globus sensation (lump-in-throat feeling without objective swallowing difficulty).

Dysphagia is extremely common in the populations served by KIN 342 students: approximately 30–40% of LTC residents have clinically significant dysphagia, and prevalence is even higher (50–60%) among those with dementia or post-stroke. The consequences of dysphagia are severe: aspiration (entry of food, liquid, or secretions below the level of the true vocal cords into the subglottic airway) can cause aspiration pneumonia — the most common cause of death in individuals with dysphagia and the leading cause of death among LTC residents — and may be clinically “silent” (occurring without coughing or choking) in up to 40% of cases. Malnutrition and dehydration occur because dysphagia makes eating and drinking laborious, distressing, and time-consuming, leading individuals to reduce intake.

The standard clinical assessment for dysphagia begins with a bedside swallowing evaluation by a speech-language pathologist (SLP), which includes oral mechanism examination, assessment of vocal quality, and controlled trials of food and liquid with varying textures and viscosities. For more detailed assessment, videofluoroscopic swallowing study (VFSS) — X-ray video of swallowing while the patient consumes barium-impregnated foods and liquids — provides direct visualization of bolus movement, airway closure timing, and aspiration. Fiberoptic endoscopic evaluation of swallowing (FEES) uses a flexible laryngoscope to visualize the pharynx and larynx during swallowing and can be performed at the bedside.

Nutritional management of dysphagia involves texture modification of foods and thickening of liquids to a consistency that the individual can manage safely and efficiently given their specific swallowing impairments. The International Dysphagia Diet Standardisation Initiative (IDDSI) provides a globally standardized framework of eight levels spanning drink thickness and food texture (0 = thin liquids through 7 = regular foods), with corresponding test methods for verifying texture properties (the fork drip test, spoon tilt test, and finger test for foods; the IDDSI flow test for liquids). Providing food and liquid at the appropriate IDDSI level is a fundamental standard of care for individuals with dysphagia. However, texture modification significantly compromises the palatability and nutrient density of meals — modified texture foods may be less flavourful, less visually appealing, and contain more water and less energy per serving than unmodified foods — making monitoring of nutritional adequacy essential.
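The eight IDDSI levels lend themselves to a simple lookup. A minimal sketch, using the standard IDDSI level names; the mismatch-checking helper is illustrative and not part of the IDDSI framework itself.

```python
# The eight IDDSI levels as a lookup (level names per the IDDSI
# framework), plus an illustrative guard that flags a mismatch between a
# resident's prescribed level and a served item.
IDDSI_LEVELS = {
    0: "Thin",
    1: "Slightly Thick",
    2: "Mildly Thick",
    3: "Moderately Thick / Liquidised",
    4: "Extremely Thick / Pureed",
    5: "Minced & Moist",
    6: "Soft & Bite-Sized",
    7: "Regular",
}

def check_served_item(prescribed: int, served: int) -> str:
    """Compare a served item's IDDSI level against the prescription."""
    if served == prescribed:
        return f"OK: level {served} ({IDDSI_LEVELS[served]})"
    return (f"ALERT: served level {served} ({IDDSI_LEVELS[served]}) "
            f"differs from prescribed level {prescribed} "
            f"({IDDSI_LEVELS[prescribed]})")

print(check_served_item(prescribed=4, served=6))
```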


Chapter 8: Long-Term Care, Retirement Homes, and Community Nutrition

The Continuum of Care: Institutional Nutrition

Long-term care (LTC) facilities provide 24-hour nursing and personal care for older adults who are unable to live safely in the community. The LTC resident population is characterized by extreme frailty, multiple chronic conditions, polypharmacy, cognitive impairment (60–70% have dementia), and very high rates of malnutrition. Meeting the nutritional needs of this population requires addressing not only the macronutrient and micronutrient composition of meals but also the entire food service system — meal timing, portion sizes, service style, assistance with eating, dining environment, and food preferences.

Assisted living (also called retirement homes, senior living communities, or continuing care retirement communities, depending on the province and level of care) offers a range of services between fully independent living and LTC. Residents typically require some assistance with daily activities but can participate in meal selection. The nutritional quality of meals provided varies enormously between facilities, and regulatory oversight of nutritional standards is less stringent than in LTC. Residents of retirement homes face unique nutritional risks including dependence on institutional catering for most meals (with limited control over food selection), reduced physical activity compared to community living, and social dynamics of communal dining that may affect intake.

The Waterloo KIN 342 course includes laboratory experiences at the CCCARE (Centre for Community, Clinical, and Applied Research Excellence) on the UW north campus, where students conduct practical nutritional assessments with volunteer older adult participants. These experiential learning components — the Mediterranean diet assessment, risk screening with CCCARE clients, the hidden salt/fat/sugar lab, and the modified texture foods lab — provide direct clinical application of the nutritional principles taught in the classroom. The MNA and other validated tools are used in their intended clinical context, and students develop communication skills for discussing sensitive nutritional information with older adults in a person-centred way.

Diet, Medications, and Alcohol in Older Adults

Older adults are the heaviest users of medications, with the average community-dwelling older adult taking 5–7 prescription medications simultaneously — a phenomenon known as polypharmacy. Drug-nutrient interactions in this population are numerous, clinically significant, and frequently overlooked. Several broad categories of interaction are particularly important in geriatric nutrition practice.

Drugs affecting nutrient absorption: Cholestyramine and other bile acid sequestrants bind fat-soluble vitamins (A, D, E, K) along with bile acids, reducing their absorption. Proton pump inhibitors and H2-receptor antagonists reduce gastric acid, impairing absorption of vitamin B12, non-heme iron, calcium (as carbonate), and magnesium. Metformin impairs B12 absorption by uncertain mechanisms (possibly by interfering with calcium-dependent binding of the intrinsic factor–B12 complex to receptors in the ileum). Antacids containing aluminum or magnesium hydroxide can bind dietary phosphorus.

Drugs affecting nutrient metabolism: Methotrexate (used in rheumatoid arthritis and some cancers) inhibits dihydrofolate reductase, depleting cellular tetrahydrofolate and requiring supplemental folic acid. Isoniazid (used in tuberculosis treatment) acts as a vitamin B6 antagonist, requiring supplemental pyridoxine. Phenytoin (an anticonvulsant) induces hepatic enzymes that increase vitamin D and vitamin K catabolism; long-term phenytoin use is associated with reduced bone mineral density. Warfarin is directly antagonized by vitamin K, making dietary vitamin K intake an important variable in maintaining therapeutic anticoagulation.

Alcohol use in older adults carries unique risks. The pharmacokinetics of alcohol change with aging: reduced total body water (increasing peak blood alcohol concentration for a given dose), reduced gastric alcohol dehydrogenase activity (increasing systemic bioavailability), and impaired hepatic metabolism (prolonging alcohol’s effects) all increase older adults’ sensitivity to alcohol. Additionally, alcohol interacts with many of the medications commonly used in older adults: it potentiates the CNS depression of benzodiazepines, antihistamines, and opioids; increases the hypoglycemic effect of sulfonylureas; increases the anticoagulant effect of warfarin; and can precipitate disulfiram-like reactions with metronidazole. Nutritional consequences of heavy alcohol use include impaired absorption of thiamine, folate, B12, zinc, and magnesium; disrupted fat metabolism; increased caloric intake that may displace nutrient-dense foods; and hepatic damage that impairs nutrient processing and storage.
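The effect of reduced total body water on peak blood alcohol concentration can be illustrated with the classic Widmark estimate. The distribution ratios (r) below are rough illustrative assumptions rather than clinical constants, and the calculation ignores first-pass gastric metabolism (which, as noted above, is also reduced in aging).

```python
# Widmark estimate of peak blood alcohol concentration (BAC), showing
# how a lower total-body-water distribution ratio (r) raises peak BAC
# for the same dose. The r values are rough illustrative assumptions;
# first-pass gastric metabolism is ignored.
def peak_bac_g_per_L(alcohol_g: float, weight_kg: float, r: float) -> float:
    """Widmark formula: BAC = dose / (r * body weight), pre-metabolism."""
    return alcohol_g / (r * weight_kg)

dose_g = 2 * 13.6  # two Canadian standard drinks (~13.6 g ethanol each)
young = peak_bac_g_per_L(dose_g, weight_kg=70, r=0.68)  # ~0.57 g/L
older = peak_bac_g_per_L(dose_g, weight_kg=70, r=0.55)  # ~0.71 g/L
print(f"younger adult: {young:.2f} g/L; older adult: {older:.2f} g/L")
```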


Chapter 9: Nutritional Assessment Case Study Integration

Applying Clinical Nutrition Skills

The practical application of nutritional assessment skills — central to the KIN 342 laboratory experiences — involves integrating the assessment frameworks and condition-specific knowledge from the preceding chapters to develop individualized care recommendations for older adults facing complex nutritional challenges. A clinical nutrition encounter with an older adult encompasses multiple domains: anthropometric measurement and interpretation, dietary intake assessment, review of medications for drug-nutrient interactions, physical examination for signs of malnutrition, functional assessment of eating and swallowing, and evaluation of social and environmental determinants of food intake.

Mrs. K. is an 82-year-old woman residing in a retirement home. She has Parkinson's disease (affecting her swallowing and hand dexterity), mild Alzheimer's dementia, type 2 diabetes, osteoporosis, and mild-to-moderate chronic kidney disease (CKD, eGFR 40 mL/min/1.73m²). Her medications include levodopa/carbidopa, metformin, alendronate, and a calcium/vitamin D supplement. She has lost 4 kg over the past 3 months and her MNA-SF score is 9 (risk of malnutrition). Her diet recall reveals low protein intake (approximately 40 g/day), inadequate fiber, excessive sodium (approximately 3,400 mg/day), and suboptimal fluid intake (approximately 900 mL/day).

Assessment reveals several clinical nutrition priorities: (1) protein intake must be increased to 1.0–1.2 g/kg/day to combat sarcopenia and weight loss, but CKD requires caution — at eGFR 40, renal nutrition guidelines suggest that restricting protein intake (roughly 0.6–0.8 g/kg/day) may slow CKD progression; this tension between sarcopenia prevention and CKD management requires individualized clinical judgment, likely trending toward the higher end given that her weight loss is the more immediate concern; (2) levodopa should be taken 30–60 minutes before high-protein meals to avoid competition between large neutral amino acids and levodopa for intestinal and brain transport; (3) dysphagia assessment by an SLP is warranted given Parkinson’s disease; modified texture foods and thickened liquids may be needed; (4) the retirement home dietitian should work with dietary staff to provide high-protein, high-energy foods in textures appropriate for her dysphagia; (5) sodium reduction to below 2,300 mg/day is indicated for blood pressure and CKD management; (6) fluid intake should be increased to at least 1,500 mL/day, with assistance provided to ensure access; (7) dementia-related eating challenges (difficulty with utensils, distraction at mealtimes) should be addressed through adaptive equipment and a supportive mealtime environment.
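The tension in priority (1) comes down to simple arithmetic on competing g/kg/day targets. Mrs. K's body weight is not stated in the case, so the 55 kg used here is purely an illustrative assumption.

```python
# Protein-target arithmetic for priority (1). Mrs. K's weight is not
# given in the case, so 55 kg is an illustrative assumption only.
weight_kg = 55            # assumed for illustration
current_intake_g = 40     # from her diet recall

targets = {
    "CKD-sparing (0.6-0.8 g/kg/d)": (0.6 * weight_kg, 0.8 * weight_kg),
    "sarcopenia (1.0-1.2 g/kg/d)": (1.0 * weight_kg, 1.2 * weight_kg),
}

for label, (lo, hi) in targets.items():
    print(f"{label}: {lo:.0f}-{hi:.0f} g/day (current: {current_intake_g} g)")
# At 55 kg her current 40 g/day falls inside the CKD-sparing range but
# well below the 55-66 g/day sarcopenia-prevention target.
```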

This case illustrates the complexity of clinical nutrition practice with older adults — the frequent coexistence of competing nutritional priorities, the need to integrate pharmacological and nutritional management, and the importance of an interdisciplinary team (including the dietitian, SLP, physician, and personal care workers) in achieving optimal nutritional outcomes. The KIN 342 course prepares students to contribute to this interdisciplinary process by providing a comprehensive understanding of the nutritional needs and risks of older adults across the continuum of care, and by developing practical skills in nutritional assessment and recommendation.

Pressure Injuries and Nutritional Support

Pressure injuries (formerly called pressure ulcers or decubitus ulcers) are localized damage to the skin and underlying tissue, usually over a bony prominence, resulting from sustained pressure or pressure combined with shear forces. They are a significant and largely preventable complication in immobile older adults, particularly those in LTC and acute care settings. The National Pressure Injury Advisory Panel (NPIAP) stages pressure injuries from Stage 1 (non-blanchable erythema of intact skin) through Stage 4 (full-thickness skin and tissue loss exposing muscle, tendon, cartilage, or bone), with two additional categories: unstageable (obscured base) and deep tissue pressure injury (persistent non-blanchable deep red/maroon discoloration).

Nutritional status is a major risk factor for pressure injury development and impairs healing of established injuries. Protein is the nutrient most critical to wound healing: it is required for synthesis of collagen (the primary structural protein of healing wounds), for keratinocyte and fibroblast proliferation, for immune function (preventing infection of open wounds), and for maintaining the oncotic pressure of plasma proteins that prevents tissue edema. The recommended protein intake for pressure injury healing is 1.25–1.5 g/kg/day — higher than requirements for healthy older adults — with attention to distributing intake across meals to maximize muscle protein synthesis (MPS). Vitamin C is the cofactor for prolyl and lysyl hydroxylases required for collagen cross-linking; supplemental vitamin C (200–500 mg/day) is widely recommended during active wound healing, though evidence from controlled trials is limited. Zinc is required for DNA synthesis, protein synthesis, and tissue repair; deficiency impairs wound healing, and zinc supplementation (up to 40 mg elemental zinc/day for a limited period) may benefit healing in deficient individuals. Arginine — conditionally essential in stressed states — is a precursor to nitric oxide (required for angiogenesis and immune function) and collagen synthesis; disease-specific enteral formulas enriched in arginine, vitamin C, and zinc (such as Juven and Arginaid) have been shown in some trials to accelerate healing of Stage 3 and 4 pressure injuries.


Chapter 10: Physiological Changes of Aging and Nutritional Status

Reduced Gastric Acid and Vitamin B12 Absorption

The stomach undergoes profound age-related changes that have far-reaching consequences for nutrient absorption. Atrophic gastritis — chronic inflammation of the gastric mucosa leading to loss of parietal cells and chief cells — becomes increasingly prevalent with advancing age, with some estimates suggesting it affects 10–30% of adults over 60 and up to 40% of those over 80. Parietal cells produce both hydrochloric acid and intrinsic factor; their progressive loss therefore results simultaneously in hypochlorhydria (reduced gastric acid secretion) and, in more severe cases, in intrinsic factor deficiency. The consequences for vitamin B12 metabolism are particularly significant and deserve careful mechanistic explanation.

Vitamin B12 in food exists almost exclusively bound to dietary protein. The release of protein-bound B12 is an acid-dependent process: gastric acid denatures the protein, allowing pepsin (a protease activated by low gastric pH) to cleave the B12-protein bond, liberating free B12 to bind to haptocorrin (R-binder protein) in saliva and gastric juice. In the duodenum, pancreatic proteases degrade haptocorrin-B12 complexes, releasing free B12 to bind intrinsic factor (IF) — a glycoprotein produced by parietal cells. The IF-B12 complex is then absorbed by specific cubilin receptors in the terminal ileum. In atrophic gastritis, if gastric acid is insufficient but intrinsic factor production is relatively preserved (as occurs in mild-to-moderate atrophic gastritis), then protein-bound B12 from food cannot be released, but crystalline B12 (as found in supplements and fortified foods) can still be absorbed via IF-mediated transport. This distinction is clinically critical: individuals with mild-to-moderate atrophic gastritis respond to oral crystalline B12 supplementation, whereas those with severe gastric atrophy or autoimmune gastritis (pernicious anemia) — who lack both gastric acid AND intrinsic factor — require either intramuscular B12 injections or very high doses of oral crystalline B12 exploiting passive diffusion (approximately 1% of a high dose is absorbed by passive diffusion, independent of IF).
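The arithmetic behind high-dose oral B12 therapy follows directly from the ~1% passive-diffusion figure above. The ~1.5 µg per-dose ceiling on IF-mediated absorption used below is a commonly cited approximation assumed here for illustration, as is the helper function itself.

```python
# Back-of-envelope view of oral B12 absorption: ~1% of any oral dose is
# absorbed by passive diffusion (as stated above). The ~1.5 ug per-dose
# ceiling on IF-mediated uptake is an assumed illustrative figure.
def b12_absorbed_ug(oral_dose_ug: float, has_intrinsic_factor: bool) -> float:
    passive = 0.01 * oral_dose_ug                          # ~1% passive
    carrier = min(oral_dose_ug, 1.5) if has_intrinsic_factor else 0.0
    return passive + carrier

# Pernicious anemia (no IF): a 1000 ug oral dose still delivers ~10 ug,
# well above the ~2.4 ug/day adult requirement.
print(round(b12_absorbed_ug(1000, has_intrinsic_factor=False), 1))  # 10.0
print(round(b12_absorbed_ug(1000, has_intrinsic_factor=True), 1))   # 11.5
```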

Atrophic gastritis is a chronic inflammatory condition of the stomach in which the gastric mucosal glands are progressively destroyed and replaced by fibrous tissue or metaplastic intestinal epithelium. The most common causes in older adults are chronic Helicobacter pylori infection (which triggers autoimmune responses against parietal cells) and autoimmune gastritis (in which antibodies directly target parietal cells and intrinsic factor). The hallmarks are hypochlorhydria or achlorhydria, reduced intrinsic factor secretion, elevated serum gastrin (due to loss of acid feedback inhibition), and a reduced serum pepsinogen I/II ratio — all used as diagnostic biomarkers.

The prevalence of vitamin B12 deficiency in older adults is substantial. Population surveys suggest that 5–20% of older adults have low serum B12 levels (below 148 pmol/L), and a far larger proportion have subclinical deficiency defined by elevated methylmalonic acid (MMA) or total homocysteine — more sensitive functional indicators of B12 status than serum B12 alone. MMA is a metabolite that accumulates when B12-dependent methylmalonyl-CoA mutase cannot convert methylmalonyl-CoA to succinyl-CoA; elevated MMA is highly specific for B12 deficiency. Total homocysteine is elevated in both B12 and folate deficiency (and also in B6 deficiency), as all three vitamins participate in homocysteine remethylation. The clinical manifestations of B12 deficiency in older adults include megaloblastic anemia (large, poorly formed red blood cells due to impaired DNA synthesis), subacute combined degeneration of the spinal cord (demyelination of the dorsal and lateral columns, producing progressive peripheral neuropathy, ataxia, and ultimately spastic paraplegia), cognitive impairment and dementia (B12 is required for myelin synthesis and for methylation reactions in neuronal DNA and histone metabolism), and depression and psychiatric symptoms. Because neurological manifestations can precede hematological findings and may not be fully reversible even with treatment, early identification and treatment of B12 deficiency is essential.

Renal Function Decline and Its Nutritional Consequences

Kidney function declines with age even in the absence of frank renal disease. Creatinine clearance — a measure of glomerular filtration rate (GFR) — decreases by approximately 1 mL/min/year after age 40, so that by age 80, the average individual has lost approximately 40% of their young-adult renal function. The clinical implication for nutrition is that many older adults without a formal diagnosis of chronic kidney disease (CKD) are effectively functioning at mild-to-moderate CKD levels, with important consequences for vitamin D metabolism, protein handling, electrolyte balance, and medication clearance.
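The practical consequence of this decline is visible in the Cockcroft-Gault creatinine-clearance estimate, whose explicit age term means that an older adult with a "normal" serum creatinine may still have substantially reduced clearance. A sketch (this equation estimates creatinine clearance, not measured GFR; the example patient values are ours):

```python
# Cockcroft-Gault creatinine-clearance estimate: the explicit age term
# makes the roughly linear age-related decline visible even at a
# "normal" serum creatinine. (Result in mL/min; serum creatinine in
# mg/dL; x0.85 adjustment for women.)
def cockcroft_gault(age: int, weight_kg: float, scr_mg_dl: float,
                    female: bool) -> float:
    crcl = (140 - age) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

# Same 70 kg woman, same serum creatinine of 1.0 mg/dL:
for age in (40, 60, 80):
    print(age, round(cockcroft_gault(age, 70, 1.0, female=True), 1))
# 40 -> ~82.6; 60 -> ~66.1; 80 -> ~49.6 mL/min
```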

Vitamin D activation requires two sequential hydroxylation steps: first, hepatic 25-hydroxylation converts dietary vitamin D3 (cholecalciferol) or skin-synthesized vitamin D3 to 25-hydroxyvitamin D (25(OH)D, calcidiol) — the main circulating storage form; second, renal 1-alpha-hydroxylation by the enzyme CYP27B1 (25-hydroxyvitamin D-1-alpha-hydroxylase) in the proximal tubule converts 25(OH)D to 1,25-dihydroxyvitamin D (1,25(OH)2D, calcitriol) — the biologically active form that binds the vitamin D receptor. With progressive loss of functional renal tubular mass, 1-alpha-hydroxylase activity declines, reducing calcitriol synthesis even when 25(OH)D levels are adequate. This impaired renal activation contributes to the high rates of functional vitamin D deficiency in older adults with CKD and is an additional reason (beyond reduced cutaneous synthesis and dietary insufficiency) for the increased vitamin D needs in aging. In moderate-to-severe CKD (GFR < 30 mL/min), supplementing with calcitriol or other active vitamin D analogs (alfacalcidol, paricalcitol) rather than simple vitamin D3 may be necessary to ensure adequate biological activity.

Anabolic resistance in the context of protein and muscle metabolism refers to the blunted sensitivity of skeletal muscle protein synthesis to anabolic stimuli — specifically postprandial aminoacidemia, insulin, and resistance exercise — observed in older versus younger adults. Rather than a reduction in the maximum achievable muscle protein synthesis rate, anabolic resistance is characterized by a rightward shift in the dose-response relationship between amino acid availability (particularly the essential amino acid leucine) and the muscle protein synthesis rate, such that a larger aminoacidemic stimulus is required to achieve equivalent anabolic signaling through mTORC1.

The mechanistic underpinnings of anabolic resistance in aging muscle are multifactorial. In younger individuals, limited splanchnic extraction of dietary amino acids allows large postprandial increases in plasma essential amino acid (EAA) and leucine concentrations; older individuals, by contrast, show both reduced net protein digestion efficiency and increased splanchnic retention of dietary amino acids after a protein-containing meal, resulting in blunted postprandial plasma leucine peaks. At the cellular level, insulin receptor substrate (IRS)-1 phosphorylation, PI3K/Akt activation, and mTORC1-S6K1 signaling are all attenuated in older muscle in response to both insulin and leucine, reflecting reduced expression and activity of upstream signaling proteins. Intramuscular lipid accumulation (myosteatosis) — common in older adults and exacerbated by inactivity and obesity — induces ceramide and diacylglycerol-mediated inhibition of insulin/IRS-1 signaling, contributing to insulin resistance at the muscle level that reinforces anabolic resistance to postprandial amino acids. Inflammation, particularly elevated IL-6 and TNF-alpha, activates muscle-specific ubiquitin E3 ligases (MuRF1 and MAFbx/atrogin-1) that increase proteasomal protein degradation — tipping the net muscle protein balance toward catabolism even when protein synthesis signaling is intact.

Sensory Changes: Taste, Smell, and Appetite in Aging

Chemosensory function — the ability to detect and discriminate tastes and odors — declines progressively with aging and contributes significantly to the anorexia of aging and reduced dietary quality in older adults. Taste perception depends on taste receptor cells within taste buds located primarily on the tongue, palate, and epiglottis; aging is associated with a reduction in the number of functional taste buds, reduced turnover and regeneration of taste receptor cells, and reduced sensitivity to all five primary taste qualities (sweet, salty, sour, bitter, and umami). Threshold concentrations required to detect and recognize tastes are elevated two- to three-fold in older versus young adults for most taste qualities, meaning that older adults experience food as less flavorful even when they know what they are eating.

Olfactory function declines even more dramatically than taste function with aging. The olfactory epithelium in the nasal cavity undergoes progressive loss of olfactory receptor neurons (ORNs) with normal aging — a process accelerated by repeated upper respiratory infections, smoking, head trauma, and neurodegenerative diseases including Alzheimer’s and Parkinson’s disease. Because the vast majority of what we perceive as “flavor” is actually retronasal olfaction — the detection of volatile compounds from food in the mouth as they reach the olfactory epithelium via the nasopharynx — declining olfactory function has a proportionally greater impact on flavor perception and food enjoyment than does declining taste sensitivity alone. The clinical implication is that strategies to enhance flavor intensity — using strong herbs and spices; enhancing umami through fermented foods, soy sauce, or monosodium glutamate; intensifying aromas through cooking techniques — can meaningfully improve food enjoyment and intake in older adults with chemosensory decline. Flavor enhancement at the institutional level (e.g., adding concentrated flavor compounds to LTC meals) has been shown in controlled trials to increase food intake and body weight in malnourished older adults.

Immune Senescence and Nutritional Modulation

The aging immune system undergoes a complex reorganization — sometimes termed immunosenescence — that simultaneously increases chronic baseline inflammation (inflammaging) and reduces the adaptive immune response’s ability to mount effective defenses against novel antigens, including vaccine antigens and new pathogens. At the cellular level, the thymus undergoes progressive involution beginning in early adulthood; by age 65, thymic output of naive T lymphocytes has virtually ceased, leaving the peripheral T-cell pool to be maintained by homeostatic proliferation of existing memory cells. This results in reduced T-cell repertoire diversity, impaired T-cell help for B-cell antibody production, and reduced cytotoxic T-cell capacity to clear virally infected and malignant cells. Natural killer (NK) cell numbers may increase with age, but NK cell cytotoxic function per cell declines, reducing surveillance against virally infected and cancer cells.

Several nutrients have well-documented roles in immune function and are commonly deficient in older adults, making nutritional support of immune function a priority in geriatric care. Zinc is required for the development and function of virtually all immune cell lineages, including T lymphocytes (zinc is required for thymulin, a thymic hormone), NK cells, macrophages, and B lymphocytes. Zinc deficiency impairs both innate and adaptive immunity and is common in older adults due to reduced dietary intake, reduced absorption efficiency, and increased urinary losses. The recommended dietary allowance for zinc is 11 mg/day for older men and 8 mg/day for older women, but surveys consistently document mean intakes below these levels in older adult populations. Supplemental zinc at modest doses (10–20 mg/day) has been shown in randomized trials in zinc-deficient older adults to improve T-cell function and reduce the incidence of respiratory infections. However, excessive zinc supplementation (above 40 mg/day) impairs copper absorption and itself causes immune dysfunction, making dose selection important. Vitamin C is essential for neutrophil and lymphocyte function, and supplementation may modestly reduce the duration and severity of upper respiratory infections. Selenium supports antioxidant defense via glutathione peroxidases and is required for optimal NK cell and lymphocyte function. Vitamin E (particularly alpha-tocopherol) at supplemental doses (200 mg/day) has been shown in controlled trials to improve vaccine responses and reduce respiratory infection rates in older adults — one of the few instances where supplementation beyond dietary adequacy shows immunological benefit. Protein-energy malnutrition profoundly suppresses both innate and adaptive immunity, and correction of malnutrition may be the single most important nutritional intervention for supporting immune function.


Chapter 11: Energy Requirements and Metabolic Changes in Aging

Estimating Energy Requirements in Older Adults

Total energy expenditure (TEE) in older adults is determined by the same components as in younger adults — basal metabolic rate (BMR), the thermic effect of food (TEF), and activity energy expenditure (AEE) — but the relative contributions and absolute values of these components change substantially with aging. BMR — the energy required to maintain vital physiological functions at rest — decreases with aging primarily because of the loss of metabolically active lean body mass (skeletal muscle is the largest contributor to BMR). The approximately 2–3% decrease in BMR per decade observed in older adults is largely, though not entirely, explained by the accompanying decrease in fat-free mass; when BMR is expressed per kilogram of fat-free mass, the age-related decline is attenuated but not eliminated, suggesting that metabolic rate per unit of lean tissue also decreases slightly with aging. TEF — the energy expended in digesting, absorbing, and processing dietary nutrients — constitutes approximately 10% of energy intake and does not change substantially with aging. AEE — the most variable component — tends to decline with aging as both voluntary exercise and non-exercise activity thermogenesis (NEAT, encompassing fidgeting, posture maintenance, and spontaneous movement) decrease.

The predictive equations most commonly used for estimating BMR in clinical practice — the Harris-Benedict equation (published 1919) and the Mifflin-St Jeor equation (published 1990) — were derived from relatively young populations and demonstrate reduced accuracy in older adults. The Harris-Benedict equations are:

For men: BMR (kcal/day) = 88.362 + (13.397 × weight in kg) + (4.799 × height in cm) − (5.677 × age in years)

For women: BMR (kcal/day) = 447.593 + (9.247 × weight in kg) + (3.098 × height in cm) − (4.330 × age in years)

The Mifflin-St Jeor equations are considered more accurate for most adults and are currently the preferred equations for clinical use:

For men: BMR (kcal/day) = (10 × weight in kg) + (6.25 × height in cm) − (5 × age in years) + 5

For women: BMR (kcal/day) = (10 × weight in kg) + (6.25 × height in cm) − (5 × age in years) − 161

Both equations include age as a term, and the negative age coefficients mean that estimated BMR decreases with advancing age — but both equations are less accurate in older adults (particularly those over 75) when compared against measured values from indirect calorimetry. For frail older adults with significant muscle wasting, these equations tend to overestimate BMR because they use body weight (which may be elevated if edema is present) rather than lean body mass as the primary metabolic variable. The activity factor applied to BMR to estimate total energy requirements ranges from 1.2 (sedentary/bed-bound) to 1.9 (very active), with most community-dwelling older adults falling in the 1.3–1.5 range and most LTC residents in the 1.2–1.4 range.
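The equations above translate directly into a short calculation. The sketch below is illustrative (the function names are my own, not from the text): it implements both predictive equations as written and applies the activity factor, with an optional injury/stress multiplier (1.1–2.0 during acute illness) as discussed later in this chapter.

```python
def bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, sex):
    """Mifflin-St Jeor BMR (kcal/day); sex is 'M' or 'F'."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + 5 if sex == "M" else base - 161

def bmr_harris_benedict(weight_kg, height_cm, age_yr, sex):
    """Harris-Benedict (1919) BMR (kcal/day)."""
    if sex == "M":
        return 88.362 + 13.397 * weight_kg + 4.799 * height_cm - 5.677 * age_yr
    return 447.593 + 9.247 * weight_kg + 3.098 * height_cm - 4.330 * age_yr

def estimated_energy_needs(bmr_kcal, activity_factor=1.3, stress_factor=1.0):
    """Total requirement = BMR x activity factor (1.2-1.9),
    times an injury/stress factor (1.1-2.0) during acute illness."""
    return bmr_kcal * activity_factor * stress_factor

# Example: a 78-year-old woman, 60 kg, 158 cm, lightly active (factor 1.3).
bmr = bmr_mifflin_st_jeor(60, 158, 78, "F")      # ~1,037 kcal/day
needs = estimated_energy_needs(bmr, 1.3)         # ~1,347 kcal/day
```

As the text cautions, equation-derived values can overestimate requirements in frail, sedentary, or edematous older adults, so they should anchor, not replace, individualized assessment.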

Doubly labeled water (DLW) is the gold-standard method for measuring total energy expenditure in free-living conditions. Subjects consume water labeled with both stable isotopes 2H (deuterium) and 18O; 2H is eliminated from the body as water only, while 18O is eliminated as both water and carbon dioxide. By measuring the differential rate of elimination of the two isotopes in urine over 10–14 days, the rate of CO2 production — and from this, total energy expenditure — can be calculated. DLW measurements in free-living older adults have confirmed that predictive equations systematically overestimate TEE in sedentary older adults by 10–20%, highlighting the importance of individualized assessment rather than rigid adherence to equation-derived values.

The practical challenge of energy estimation in older adults is compounded by the need to account for disease states and injury. The injury/stress factor — a multiplier applied to estimated BMR in acutely ill or injured patients — ranges from 1.1 for minor illness to 1.6–2.0 for severe burns, reflecting the hypermetabolic state induced by systemic inflammation and stress hormone release. However, even moderate acute illness in a frail older adult who is already at the lower edge of adequate nutritional intake can precipitate rapid deterioration in nutritional status, making close monitoring and proactive nutritional support essential during acute illness episodes. Many older adults enter hospital with pre-existing malnutrition (documented in 30–50% of hospitalized older adults in many studies), and hospital-related reductions in food intake — from nil-by-mouth orders, diagnostic procedures, pain, nausea, and the general unpleasantness of hospital food — further erode nutritional status rapidly.

Physical Activity and Nutrition in Aging

Physical activity profoundly influences nutritional requirements and nutritional outcomes in older adults. Resistance exercise — any exercise that requires muscles to work against external resistance — is the most effective intervention for preventing and treating sarcopenia and is synergistic with dietary protein in stimulating muscle protein synthesis. The Canadian Society for Exercise Physiology (CSEP) recommends that older adults engage in at least 150 minutes per week of moderate-to-vigorous aerobic activity plus muscle-strengthening activities involving major muscle groups at least twice per week. Meeting these physical activity guidelines maintains lean body mass, preserves functional capacity, reduces fall risk, and — critically for nutritional needs — maintains a higher energy expenditure that allows a more generous total caloric intake and therefore easier achievement of micronutrient adequacy.

The timing of protein intake relative to exercise is an important consideration in designing nutritional strategies for sarcopenia prevention. A series of studies has demonstrated that consuming 20–40 g of high-quality protein (particularly whey protein, which is rapidly digested and has the highest leucine content of common protein sources) within 30–60 minutes of completing a resistance exercise session maximizes the exercise-induced increment in muscle protein synthesis, exploiting the period of enhanced sensitivity to amino acids that follows resistance exercise. This protein-exercise timing principle is relevant across age groups but particularly important for older adults because the window of opportunity may be narrower (blunted by anabolic resistance) and the cumulative muscle mass benefits of optimized protein timing may be more clinically meaningful in a group already losing muscle due to the aging process. Pre-sleep protein consumption — consuming 30–40 g of slow-digesting casein protein (from dairy or supplemental casein) before sleep — has also been shown to increase overnight muscle protein synthesis rates and net muscle mass gains, as overnight amino acid availability is typically limited and muscle protein balance is negative during fasting sleep.


Chapter 12: Macronutrient Metabolism and Aging

Protein: The Leucine Threshold, DIAAS, and Plant vs. Animal Protein

The quality of dietary protein — not only the quantity — determines its capacity to stimulate muscle protein synthesis and meet the body’s needs for essential amino acids. Protein quality is assessed by the completeness of the essential amino acid (EAA) profile, with particular emphasis on leucine (the primary mTORC1 activator), lysine, and the sulfur-containing amino acids methionine and cysteine. Traditional measures of protein quality — the Protein Digestibility Corrected Amino Acid Score (PDCAAS) and, more recently, the Digestible Indispensable Amino Acid Score (DIAAS) — provide frameworks for comparing protein sources.

DIAAS (Digestible Indispensable Amino Acid Score) is the current reference protein quality metric recommended by the FAO (2013). It is calculated as: DIAAS (%) = [mg of digestible dietary indispensable amino acid in 1 g of dietary protein ÷ mg of the same dietary indispensable amino acid in 1 g of the reference protein] × 100, where the limiting amino acid (the one with the lowest score relative to the reference) determines the overall DIAAS. Critically, DIAAS uses ileal digestibility coefficients — measured at the end of the small intestine — rather than fecal digestibility, which is inflated by bacterial fermentation in the colon. This makes DIAAS a more accurate reflection of the amino acids actually available for protein synthesis.
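The DIAAS formula can be sketched in a few lines. The reference values below are modeled on the FAO (2013) amino acid pattern, but treat all numbers here as illustrative placeholders rather than authoritative inputs; the function name and data layout are my own.

```python
def diaas(aa_mg_per_g_protein, digestibility, reference_mg_per_g):
    """DIAAS (%) = 100 x min over indispensable amino acids of
    (ileal-digestible AA per g test protein) / (same AA per g reference protein).
    The amino acid with the lowest ratio is the limiting amino acid."""
    scores = {
        aa: 100 * aa_mg_per_g_protein[aa] * digestibility[aa] / ref
        for aa, ref in reference_mg_per_g.items()
    }
    limiting = min(scores, key=scores.get)
    return scores[limiting], limiting

# Hypothetical wheat-like protein: lysine-poor, with 90% ileal digestibility.
reference = {"lysine": 48, "leucine": 61, "met+cys": 23}   # mg per g protein
profile   = {"lysine": 25, "leucine": 68, "met+cys": 40}
digest    = {"lysine": 0.90, "leucine": 0.90, "met+cys": 0.90}
score, limiting_aa = diaas(profile, digest, reference)     # lysine-limited
```

The low lysine ratio sets the whole score here, mirroring the text's point that wheat's DIAAS of approximately 40–50 is determined by its limiting lysine.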

Typical DIAAS values illustrate the superiority of animal proteins for stimulating muscle protein synthesis. Whole egg has a DIAAS of approximately 113 (meaning it exceeds reference levels for all EAAs); milk protein has a DIAAS of approximately 114; beef has a DIAAS of approximately 111–114. In contrast, most plant proteins have DIAAS values below 100, reflecting limiting amino acids: pea protein has a DIAAS of approximately 82 (limited by methionine + cysteine); wheat has a DIAAS of approximately 40–50 (severely limited by lysine); soy protein isolate has a DIAAS of approximately 90–98 (limited by methionine). These differences have practical implications for older adults following plant-based diets: to achieve an equivalent anabolic stimulus to that obtained from high-DIAAS animal protein, larger amounts of plant protein must be consumed, and complementary protein combinations (legumes with grains, for instance) are needed to correct limiting amino acids. Additionally, leucine content per gram of protein is typically lower in plant sources than animal sources: whey protein contains approximately 11 g leucine per 100 g protein, while pea protein contains approximately 8 g leucine per 100 g, and wheat protein approximately 7 g — meaning that achieving the leucine threshold for maximal mTORC1 activation requires a larger serving of plant protein. The leucine threshold for older adults is approximately 3–4 g per meal; at 7–8 g leucine per 100 g protein, reaching it requires roughly 38–57 g of plant protein per meal depending on the source, compared with roughly 27–36 g of whey protein.
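The serving sizes implied by the leucine threshold follow from simple arithmetic. The helper below (my own illustrative name, not from the text) converts a target leucine dose and a source's leucine density into the grams of protein required.

```python
def protein_for_leucine(target_leucine_g, leucine_g_per_100g_protein):
    """Grams of protein needed to supply the target leucine dose."""
    return 100 * target_leucine_g / leucine_g_per_100g_protein

# Using the leucine densities cited in the text (g leucine per 100 g protein):
whey_g  = protein_for_leucine(3, 11)   # ~27 g of whey protein for 3 g leucine
pea_g   = protein_for_leucine(3, 8)    # ~38 g of pea protein for 3 g leucine
wheat_g = protein_for_leucine(4, 7)    # ~57 g of wheat protein for 4 g leucine
```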

The practical translation of protein quality science to dietary advice for older adults involves identifying high-protein, high-leucine foods that can be incorporated into meals and snacks. Per-serving leucine content: 100 g cooked chicken breast provides approximately 2.3 g leucine; 100 g canned tuna provides approximately 2.5 g leucine; 200 g Greek yogurt provides approximately 1.5 g leucine; 25 g of whey protein powder provides approximately 2.5 g leucine; 200 g cooked lentils provides approximately 1.3 g leucine. These values illustrate that a typical 100 g serving of meat or fish approaches the leucine threshold, while plant protein servings of comparable mass typically fall short and must be combined with other protein sources or consumed in larger amounts to stimulate maximal muscle protein synthesis. For older adults with reduced appetites or early satiety, high-protein, leucine-fortified liquid supplements may provide a practical means of meeting protein needs without requiring large meal volumes.

Carbohydrate Metabolism, Glycemic Index, and the Aging Gut

Carbohydrate metabolism in older adults is characterized by progressive insulin resistance — a reduced sensitivity of peripheral tissues (particularly skeletal muscle and adipose tissue) to the glucose-lowering effects of insulin — resulting from multiple age-related changes including accumulation of intramuscular and intraabdominal lipid, mitochondrial dysfunction, chronic inflammation, and reduced physical activity. Insulin resistance in older adults increases postprandial blood glucose excursions in response to carbohydrate-containing meals, with clinical significance ranging from frank type 2 diabetes (in approximately 25–30% of older Canadians) to prediabetes (impaired fasting glucose or impaired glucose tolerance) in a substantial additional proportion. The glycemic index (GI) — a measure of how quickly blood glucose rises in response to a given carbohydrate-containing food relative to a reference food (glucose or white bread) — and glycemic load (GL) — which adjusts for the amount of carbohydrate in a typical serving — provide a dietary framework for managing postprandial glycemia that is particularly relevant for older adults with insulin resistance or diabetes.

Glycemic index (GI) is a numerical ranking (0–100) of carbohydrate-containing foods based on their effect on blood glucose levels over two hours, compared to a reference food (either glucose = 100 or white bread = 100). Low-GI foods (GI ≤ 55) produce a slower, lower, and more sustained blood glucose rise; high-GI foods (GI ≥ 70) produce a rapid, high glucose spike followed by a rapid decline. Glycemic load (GL) is calculated as GL = (GI × grams of available carbohydrate per serving) ÷ 100, and accounts for both the quality and quantity of dietary carbohydrate.
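These definitions map directly onto a two-line calculation; the helper names below are my own, and the example food is hypothetical.

```python
def glycemic_load(gi, available_carb_g):
    """GL = (GI x grams of available carbohydrate per serving) / 100."""
    return gi * available_carb_g / 100

def gi_category(gi):
    """GI <= 55 is low, GI >= 70 is high, in between is medium."""
    if gi <= 55:
        return "low"
    if gi >= 70:
        return "high"
    return "medium"

# A hypothetical food with GI 72 but only 15 g available carbohydrate/serving:
gl = glycemic_load(72, 15)   # 10.8 — a high-GI food can still carry a modest GL
```

The example illustrates why GL complements GI: a high-GI food eaten in a small carbohydrate portion can produce a smaller glycemic burden than a medium-GI food eaten in a large portion.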

Dietary fiber plays multiple physiologically important roles in aging beyond glycemic management. The two broad categories of dietary fiber — soluble (viscous) and insoluble (non-viscous) — have distinct physiological mechanisms. Soluble fiber (found in oats, barley, legumes, psyllium, and many fruits) forms viscous gels in the gastrointestinal lumen that slow gastric emptying and nutrient absorption, attenuating postprandial glucose and insulin responses; binds bile acids in the lumen and reduces their reabsorption, lowering LDL cholesterol through compensatory upregulation of hepatic LDL receptors; and is fermented by gut bacteria in the colon, producing short-chain fatty acids (SCFAs — acetate, propionate, butyrate) that have multiple beneficial effects including colonocyte energy supply, mucosal barrier maintenance, anti-inflammatory signaling, and satiety hormone (GLP-1, PYY) stimulation. Insoluble fiber (found in wheat bran, whole grains, and many vegetables) increases stool bulk and softness, accelerates colonic transit, and reduces constipation. The gut microbiome — the community of trillions of microorganisms inhabiting the colon — changes substantially with aging: diversity decreases, beneficial fiber-fermenting species (Bifidobacterium, Lactobacillus, Faecalibacterium prausnitzii) decline in relative abundance, and pro-inflammatory species may increase. These age-related microbiome changes are associated with inflammaging, increased intestinal permeability, and reduced SCFA production. Dietary fiber intake — particularly prebiotic fibers that selectively stimulate growth of beneficial bacteria — is the most modifiable dietary determinant of gut microbiome composition and health in older adults.

Fat Metabolism: Omega-3 Fatty Acids, Inflammation, and Eicosanoid Pathways

Dietary fat in aging interacts with the inflammaging process in ways that make fat quality — not just quantity — of paramount importance. The omega-3 polyunsaturated fatty acids (PUFAs) — particularly eicosapentaenoic acid (EPA, 20:5n-3) and docosahexaenoic acid (DHA, 22:6n-3), found predominantly in fatty fish (salmon, mackerel, herring, sardines), fish oil supplements, and algae-based supplements — have potent anti-inflammatory properties mediated through several molecular mechanisms that are particularly relevant to aging-associated chronic inflammation.

At the level of eicosanoid biosynthesis: when cellular membranes are enriched in omega-6 arachidonic acid (AA, 20:4n-6) — as occurs with diets high in omega-6-rich vegetable oils (corn, sunflower, soybean) and low in marine omega-3s — phospholipase A2-mediated membrane AA release provides substrate for cyclooxygenase (COX) enzymes (COX-1 and COX-2) and lipoxygenase (LOX) enzymes (5-LOX, 12-LOX, 15-LOX). COX-1 and COX-2 convert AA to prostaglandins of the 2-series (PGE2, PGI2, thromboxane A2) — potent pro-inflammatory, vasoconstrictive, and pro-platelet-aggregating mediators. 5-LOX converts AA to leukotrienes of the 4-series (LTB4, LTC4, LTD4) — powerful chemoattractants and airway constrictors. When EPA replaces AA in membrane phospholipids (which occurs when dietary EPA intake is high), the same COX and LOX enzymes produce prostaglandins of the 3-series (PGE3, PGI3) and leukotrienes of the 5-series (LTB5) — biologically less potent versions with significantly lower pro-inflammatory activity. Additionally, EPA and DHA are precursors to resolvins (E-series from EPA, D-series from DHA) and protectins (neuroprotectin D1 from DHA) — specialized pro-resolving mediators (SPMs) that actively terminate the inflammatory response and promote tissue repair. This mechanistic framework explains why dietary omega-3 fatty acids consistently reduce circulating markers of inflammation (CRP, IL-6, TNF-alpha) in clinical trials and supports their therapeutic role in conditions driven by chronic inflammation, which include virtually all of the major chronic diseases of aging — CVD, type 2 diabetes, sarcopenia, cognitive decline, and cancer.


Chapter 13: Key Micronutrients in Aging

Calcium: Absorption Decline, Sources, and the Supplementation Controversy

Calcium is the most abundant mineral in the human body, with approximately 99% sequestered in the skeleton as hydroxyapatite. The remaining 1% circulates in the blood and extracellular fluid at tightly regulated concentrations (2.15–2.55 mmol/L) that are critical for muscle contraction, nerve signal transmission, blood coagulation, and intracellular signaling. Calcium homeostasis is maintained by the integrated actions of parathyroid hormone (PTH), calcitriol (active vitamin D), and calcitonin regulating intestinal absorption, renal reabsorption, and bone resorption.

Intestinal calcium absorption declines significantly with aging. In young adults, approximately 35–45% of dietary calcium is absorbed; in older adults, absorption efficiency may fall to 15–20%. The mechanisms include reduced renal production of calcitriol (the primary stimulator of active, transcellular calcium absorption in the duodenum via the TRPV6 calcium channel), reduced expression of intestinal vitamin D receptors, reduced production of calbindin-D9k (the intracellular calcium transport protein whose synthesis is vitamin D-dependent), and reduced intestinal surface area due to villous atrophy in some older adults. Achlorhydria further reduces calcium absorption from insoluble calcium salts like calcium carbonate, which require acid dissolution before ionization and absorption; calcium citrate, which is already soluble and does not require gastric acid for dissolution, is better absorbed under low-acid conditions and is therefore the preferred supplement form for older adults with achlorhydria or on PPIs.

Osteoporosis is a systemic skeletal disorder characterized by low bone mass and microarchitectural deterioration of bone tissue leading to increased bone fragility and fracture risk. It is defined by the World Health Organization using bone mineral density (BMD) measured by dual-energy X-ray absorptiometry (DXA): a T-score at the femoral neck or lumbar spine of −2.5 or below defines osteoporosis; T-score between −1.0 and −2.5 defines osteopenia; T-score above −1.0 is normal. The clinical significance of osteoporosis lies entirely in its fracture consequences: approximately 1 in 3 women and 1 in 5 men will experience an osteoporotic fracture during their lifetime.
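The WHO T-score cut-offs amount to a simple three-way classification; a minimal sketch (the function name is hypothetical):

```python
def bmd_category(t_score):
    """WHO DXA classification at the femoral neck or lumbar spine."""
    if t_score <= -2.5:
        return "osteoporosis"
    if t_score < -1.0:          # between -1.0 and -2.5
        return "osteopenia"
    return "normal"             # T-score of -1.0 or above
```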

The dietary calcium requirement increases after age 50 for women and after age 70 for men, reflecting the declining absorption efficiency and the need to maintain calcium balance (prevent net bone resorption to supply serum calcium needs) despite reduced absorption. Health Canada’s Dietary Reference Intake sets the RDA for calcium at 1,000 mg/day for men 51–70 years, and 1,200 mg/day for women 51+ years and men 71+ years. Achieving the RDA from dietary sources is challenging for many older adults: each cup of milk provides approximately 300 mg of calcium; fortified soy milk provides similar amounts; yogurt provides 200–400 mg per cup; hard cheese provides approximately 200–300 mg per 30 g serving; canned sardines with bones provide approximately 350 mg per 85 g; tofu made with calcium sulfate provides approximately 200–400 mg per 100 g. The recommendation is to achieve calcium targets from food first, and to use supplements only to bridge the gap between dietary intake and the RDA, at doses of 500–600 mg elemental calcium at a time (since the efficiency of calcium absorption decreases at doses above this threshold). The concern about cardiovascular risk from calcium supplementation — while not definitively established — provides additional reason to avoid exceeding the tolerable upper intake level of 2,000–2,500 mg/day total calcium (from all sources combined).
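The food-first, supplement-to-bridge-the-gap advice can be sketched as a small planner (my own illustrative helper): it splits any shortfall relative to the RDA into doses of at most 600 mg elemental calcium, since single-dose absorption efficiency falls above the 500–600 mg threshold.

```python
def calcium_supplement_doses(dietary_mg, rda_mg, max_dose_mg=600):
    """Split the dietary shortfall versus the RDA into doses of at most
    max_dose_mg elemental calcium. Total intake from all sources should
    stay under the UL of 2,000-2,500 mg/day."""
    gap = max(0, rda_mg - dietary_mg)
    doses = []
    while gap > 0:
        dose = min(max_dose_mg, gap)
        doses.append(dose)
        gap -= dose
    return doses

# A woman over 51 (RDA 1,200 mg) consuming ~700 mg/day from food:
plan = calcium_supplement_doses(700, 1200)   # one 500 mg dose bridges the gap
```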

Folate, Vitamin B6, and Homocysteine in Aging

The interrelationship between folate, vitamin B6, vitamin B12, and homocysteine metabolism is of substantial clinical importance in aging. Homocysteine is a sulfur-containing amino acid that sits at a metabolic intersection between the methionine cycle and the transsulfuration pathway. In the methionine cycle, homocysteine is remethylated to methionine by the enzyme methionine synthase (which requires methylcobalamin — the active coenzyme form of B12 — and 5-methyltetrahydrofolate — the folate coenzyme that donates the methyl group). Alternatively, homocysteine can be exported and remethylated by betaine-homocysteine methyltransferase (BHMT), which uses betaine as the methyl donor. In the transsulfuration pathway, homocysteine is irreversibly converted to cystathionine by cystathionine beta-synthase (which requires pyridoxal phosphate — the active coenzyme form of vitamin B6), then further converted to cysteine, which can be used for glutathione synthesis or oxidized to taurine or inorganic sulfate for renal excretion. Deficiencies in folate, B12, or B6 all cause homocysteine to accumulate.

Elevated plasma total homocysteine (tHcy) — typically defined as tHcy above 15 μmol/L (hyperhomocysteinemia) — is associated in extensive epidemiological data with increased risks of cardiovascular disease, stroke, cognitive decline, and dementia. The causal significance of this association has been extensively debated; randomized trials of homocysteine-lowering with B-vitamin supplementation (folate, B12, B6) have generally reduced tHcy levels effectively but have not consistently reduced cardiovascular event rates, suggesting that homocysteine may be a marker rather than a causal mediator of vascular disease risk. For cognitive decline and dementia, the evidence is more suggestive: the VITACOG trial demonstrated that B-vitamin supplementation slowed brain atrophy (a structural marker of neurodegeneration) in older adults with elevated homocysteine and mild cognitive impairment, with the benefit confined to those with omega-3 levels in the upper tertile — a finding suggesting a synergistic interaction between omega-3 fatty acids and B vitamins in brain health.

Iron, Zinc, and Magnesium in the Aging Population

Iron metabolism in older adults presents a different pattern from younger populations. While iron deficiency anemia is common in younger women due to menstrual blood losses, in older adults the causes of iron deficiency shift to gastrointestinal blood loss (from colorectal cancer, angiodysplasia, gastric ulcers, or NSAID-related gastropathy), malabsorption (due to atrophic gastritis, celiac disease, or H. pylori infection), and reduced dietary intake. Diagnosing iron deficiency in older adults requires care because serum ferritin — the standard biochemical marker — is an acute phase reactant whose levels are elevated by inflammation; a malnourished, inflamed older adult may have depleted iron stores but a normal or even elevated serum ferritin. Combining serum ferritin with transferrin saturation (low in iron deficiency) and C-reactive protein (elevated in inflammation) allows more accurate interpretation. Conversely, iron overload in older adults — resulting from hereditary hemochromatosis (particularly HFE gene mutations), repeated transfusions, or excessive supplemental iron intake — promotes oxidative stress through Fenton chemistry (iron-catalyzed hydroxyl radical generation from hydrogen peroxide) and increases infection risk (iron is a required nutrient for many bacterial pathogens). Supplemental iron in older adults who are not iron deficient should be avoided.

Zinc and magnesium are two micronutrients that deserve special attention in older adults due to their prevalence of suboptimal status and their roles in multiple aging-relevant physiological processes. Zinc participates in the catalytic function of over 300 enzymes and is required for DNA synthesis and repair, protein synthesis, wound healing, smell and taste perception (zinc is a cofactor for carbonic anhydrase VI, which is secreted in saliva and required for taste receptor function), and immune function. Age-related zinc depletion occurs through multiple mechanisms: reduced dietary intake (older adults frequently have low red meat, seafood, and legume intakes), reduced absorption efficiency (phytate from grains and legumes chelates zinc and reduces its absorption), increased renal zinc losses in some conditions, and redistribution of zinc from functional pools into metallothionein (an intracellular zinc storage protein induced by inflammation and cortisol). Magnesium is a cofactor for over 300 enzymatic reactions including all ATP-utilizing reactions (since ATP exists primarily as Mg-ATP), DNA and RNA synthesis, protein synthesis, and the maintenance of cell membrane potential. Magnesium is concentrated in whole grains, legumes, nuts, seeds, and green leafy vegetables — food groups that are under-consumed by many older adults. Long-term PPI use causes magnesium malabsorption (PPIs impair TRPM6, the active transcellular magnesium transporter in the small intestine), frequently producing clinically significant hypomagnesemia that can cause cardiac arrhythmias, muscle cramps, and neuromuscular irritability.


Chapter 14: Hydration in Aging

Thirst Impairment and Dehydration Consequences

Adequate hydration is essential for virtually every physiological process, yet older adults are particularly vulnerable to dehydration due to a complex set of age-related changes affecting both fluid intake and renal fluid conservation. The most significant of these is the age-related blunting of thirst sensation: compared to young adults, older adults report significantly less subjective thirst in response to experimental hypertonic challenge (water deprivation, sodium infusion), indicating a reduced hypothalamic osmoreceptor sensitivity to the plasma osmolality signals that normally drive thirst. Older adults who are dehydrated therefore experience less thirst-driven urge to drink, and if left to drink voluntarily, do not adequately restore their fluid balance following water loss. This impaired thirst-fluid intake coupling is exacerbated by the reduced ability of the aging kidney to concentrate urine (due to reduced medullary concentration gradient and reduced responsiveness to antidiuretic hormone/ADH), meaning that older adults lose more water per unit of solute excretion than younger adults when fluid intake is inadequate.

Dehydration in clinical practice is defined as a loss of body water exceeding 1% of body weight, resulting in elevated serum osmolality (hyperosmolality) and reduced intracellular and extracellular fluid volumes. In older adults, even mild dehydration (1–2% body weight loss) produces measurable impairment in cognitive performance (particularly executive function, attention, and psychomotor speed), physical performance, and thermoregulatory capacity. Dehydration is classified as isotonic (equal loss of water and electrolytes, as in diarrhea or hemorrhage), hypotonic (greater loss of electrolytes than water, as with diuretic use or when sweat and gastrointestinal losses are replaced with plain water), or hypertonic (greater loss of water than electrolytes, as in fever, excessive sweating, inadequate fluid intake, or high solute loads — the most common form in nursing home residents).

The clinical consequences of dehydration in older adults are far-reaching and clinically serious. Delirium — an acute confusional state characterized by disturbance in attention, awareness, and cognition fluctuating over the day — is both caused by and complicated by dehydration; dehydration is among the most common precipitants of delirium in hospitalized older adults. Falls are precipitated by dehydration-induced orthostatic hypotension (reduced blood volume impairs the ability of the cardiovascular system to maintain blood pressure on standing) and cognitive impairment (which reduces situational awareness and reaction time). Urinary tract infections (UTIs) — the most common bacterial infection in older adults — are facilitated by dehydration-induced reduction in urine flow, which reduces mechanical flushing of bacteria from the urethra and bladder. Concentrated urine also creates a more favorable growth environment for uropathogens. Adequate fluid intake (producing urine output of at least 1.5 liters per day, corresponding to pale yellow urine) is a simple but effective strategy for UTI prevention. Constipation is worsened by dehydration, as the colon extracts water from the stool to compensate for systemic fluid deficits, producing harder, smaller stools that are more difficult to pass. Acute kidney injury can be precipitated or exacerbated by dehydration in older adults with already-reduced renal reserve, and certain nephrotoxic medications (NSAIDs, aminoglycosides, contrast agents) produce far greater renal injury in the dehydrated state.

Assessment of Hydration Status and Fluid Recommendations

Assessing hydration status in older adults is challenging because many conventional clinical indicators are unreliable in this population. Body weight change — a gold-standard indicator in athletes — is confounded in older adults by fluctuating edema (particularly in those with heart failure, hypoalbuminemia, or venous insufficiency), making it difficult to distinguish fluid loss from other causes of weight change. Serum osmolality above 296 mOsm/kg is a reliable indicator of hyperosmolar dehydration, as is serum sodium above 145 mEq/L (hypernatremia — particularly concerning in older adults because severe hypernatremia is associated with significant mortality). Blood urea nitrogen-to-creatinine ratio (BUN:Cr) above 20:1 suggests prerenal dehydration, though this ratio is also elevated by high protein intake and gastrointestinal bleeding. Urine specific gravity (above 1.020) and urine color (dark yellow to amber) are practical bedside indicators of dehydration, though both can be influenced by medications and other factors.
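The laboratory thresholds above can be collected into a simple screening helper. This is an illustrative sketch using only the cut-offs cited in the text (BUN and creatinine in mg/dL for the 20:1 ratio); it is a teaching aid, not a clinical tool, and the function name is my own.

```python
def dehydration_lab_flags(osmolality=None, sodium=None,
                          bun=None, creatinine=None, urine_sg=None):
    """Return the conventional indicators of hyperosmolar/prerenal
    dehydration that are met; None means 'not measured'."""
    flags = []
    if osmolality is not None and osmolality > 296:
        flags.append("serum osmolality > 296 mOsm/kg")
    if sodium is not None and sodium > 145:
        flags.append("hypernatremia (Na > 145 mEq/L)")
    if bun is not None and creatinine is not None and bun / creatinine > 20:
        flags.append("BUN:Cr > 20:1 (prerenal pattern)")
    if urine_sg is not None and urine_sg > 1.020:
        flags.append("urine specific gravity > 1.020")
    return flags
```

Because each indicator can be confounded (ferritin-like acute-phase effects do not apply here, but edema, medications, protein intake, and GI bleeding do), concordance across several flags is more informative than any single one.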

Fluid recommendations for older adults generally target a minimum intake of 1.5–2.0 liters (6–8 cups) of total fluid per day, with higher amounts warranted during hot weather, febrile illness, or significant exercise. In institutional settings, proactive hydration strategies — regular fluid rounds at scheduled intervals, offering fluids with medications, providing water-rich foods (fruits, soups, gelatin, yogurt), flavoring water to improve palatability, and using lidded cups with straws for those with limited hand function — can meaningfully increase total fluid intake. Thickened fluids (required for those with dysphagia-related aspiration of thin liquids) are less palatable than thin fluids, reducing voluntary intake and increasing dehydration risk; compensating with higher fluid volumes at meals and regular fluid checks is essential for dysphagic individuals. In frail older adults with heart failure or renal failure, fluid restrictions may be necessary and individualized goals should be set in collaboration with the medical team.


Chapter 15: Cardiovascular Disease — Detailed Nutritional Management

The DASH Diet and Sodium Restriction in Heart Disease

The DASH (Dietary Approaches to Stop Hypertension) diet was developed and tested in a landmark series of randomized controlled trials funded by the United States National Heart, Lung, and Blood Institute (NHLBI) in the 1990s. The core DASH dietary pattern emphasizes: abundant fruits and vegetables (8–10 servings per day); low-fat dairy products (2–3 servings per day); whole grains; lean proteins including poultry, fish, and legumes; nuts and seeds; and limited sodium (the follow-up DASH-Sodium trial tested intakes of 3,300 mg, 2,300 mg, and 1,500 mg per day), saturated fat (< 6% of calories), added sugars, and red meat. The DASH diet lowers systolic blood pressure by approximately 8–14 mmHg in hypertensive adults — a reduction comparable to first-line antihypertensive drug therapy — through mechanisms including increased potassium intake (potassium promotes renal sodium excretion and vasodilation), increased magnesium and calcium intake (both modulate vascular smooth muscle tone), increased nitrate intake from vegetables (converted to nitric oxide, a potent vasodilator), reduced dietary saturated fat (improving endothelial function), and the combined anti-inflammatory effect of its micronutrient and phytochemical density.

Sodium restriction is among the most evidence-supported dietary interventions for reducing cardiovascular events in older adults with hypertension and heart failure. The Canadian hypertension guidelines recommend sodium intakes below 2,300 mg/day for adults with hypertension and below 1,500 mg/day for those with heart failure, significantly below the average sodium intake of 3,400 mg/day in most Western populations. The major sources of dietary sodium in older adults include processed and packaged foods (contributing approximately 75–80% of total sodium in the North American diet), restaurant meals, and discretionary salt addition at the table and during cooking. Sodium reduction in LTC settings is particularly challenging because sodium is a major flavor enhancer in institutional food preparation; reducing sodium without compensating with other flavor strategies (herbs, spices, umami sources) often results in reduced palatability and food intake — an important consideration in a setting where malnutrition is already prevalent.

Heart Failure and Cardiac Cachexia

Heart failure (HF) — characterized by the inability of the heart to pump sufficient blood to meet the body’s metabolic demands — affects approximately 10% of Canadians over 75 and represents a major cause of hospitalizations, morbidity, and mortality in older adults. Nutritional management of heart failure is complex because the condition itself profoundly disrupts nutritional status through multiple mechanisms, while nutritional interventions must navigate competing constraints including sodium and fluid restriction, reduced appetite, and the anorexigenic effects of cardiac medications (particularly digoxin and spironolactone).

Cardiac cachexia — the progressive involuntary weight and muscle loss that occurs in advanced heart failure — is a particularly devastating complication affecting approximately 10–15% of HF patients, defined as non-oedematous weight loss of more than 6% of body weight over 6–12 months. The pathophysiology of cardiac cachexia involves elevated pro-inflammatory cytokines (TNF-alpha, IL-6, IL-1) — produced by activated macrophages and released by the failing, ischemic myocardium — that directly suppress appetite, stimulate muscle protein catabolism, and activate adipose tissue lipolysis. Neurohormonal activation — elevated angiotensin II, aldosterone, cortisol, and catecholamines — further drives proteolysis and lipolysis. Gastrointestinal congestion from venous hypertension reduces gut motility and impairs nutrient absorption; ascites causes early satiety by mechanical compression of the stomach. Nutritional support for cardiac cachexia must overcome these metabolic barriers: high-protein (1.2–1.5 g/kg/day), high-energy supplementation is often required, but must be balanced against fluid and sodium restrictions, and realistic weight gain goals are modest in the face of ongoing inflammatory and neurohormonal catabolic drive.
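The two quantitative rules above — the >6% non-oedematous weight-loss criterion and the 1.2–1.5 g/kg/day protein target — can be expressed as simple calculations. A minimal sketch (function names are our own):

```python
# Illustrative calculations applying the cachexia criterion and protein target
# from the text. Not a clinical tool; names are assumptions for the example.

def meets_cachexia_criterion(baseline_kg: float, current_kg: float) -> bool:
    """True if non-oedematous weight loss over the 6-12 month window exceeds 6%."""
    loss_pct = 100.0 * (baseline_kg - current_kg) / baseline_kg
    return loss_pct > 6.0

def protein_target_g(weight_kg: float) -> tuple:
    """Daily protein range (g) at 1.2-1.5 g per kg of current body weight."""
    return (round(1.2 * weight_kg, 1), round(1.5 * weight_kg, 1))

# An 80 kg patient falling to 74 kg over 8 months has lost 7.5%,
# meeting the criterion; their daily protein target is ~89-111 g:
print(meets_cachexia_criterion(80, 74))  # True
print(protein_target_g(74))              # (88.8, 111.0)
```

Note that in practice the weight-loss calculation must use dry (non-oedematous) weights: fluid retention in heart failure can mask substantial lean tissue loss.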


Chapter 16: Type 2 Diabetes — Expanded Management

Glycemic Management, Frailty, and the Risk of Hypoglycemia

The management of type 2 diabetes in older adults requires an individualized, goals-of-care approach that explicitly accounts for the dual risks of hyperglycemia (complications in longer-surviving, functionally intact patients) and hypoglycemia (potentially fatal in frail patients on insulin or sulfonylureas). Hypoglycemia — blood glucose below 4.0 mmol/L — triggers counterregulatory responses (glucagon, epinephrine, cortisol, growth hormone secretion) that attempt to restore euglycemia. In older adults, counterregulatory responses may be blunted (reduced epinephrine response, reduced glucagon response), hypoglycemic symptoms (tremor, sweating, tachycardia, hunger) may be attenuated or absent due to autonomic neuropathy, and the cognitive symptoms of hypoglycemia (confusion, dizziness, drowsiness) may be mistaken for dementia or delirium. The consequences of unrecognized hypoglycemia in an older adult — falls with fractures, cardiac arrhythmias (hypoglycemia activates the sympathetic nervous system, increasing arrhythmia risk), motor vehicle accidents, and acute cognitive injury — are severe and may themselves precipitate further functional decline.

Diabetes Canada's 2018 clinical practice guidelines (the organization formerly known as the Canadian Diabetes Association) categorize older adults by functional status for glycemic target setting. For functionally independent older adults with type 2 diabetes and life expectancy exceeding 10 years, glycemic targets mirror those for younger adults (HbA1c ≤ 7.0%, fasting glucose 4.0–7.0 mmol/L, 2-hour postprandial glucose 5.0–10.0 mmol/L). For functionally dependent older adults with multiple comorbidities or moderate cognitive impairment, less stringent targets are recommended (HbA1c 7.1–8.0%). For end-of-life or very frail individuals, the emphasis shifts entirely to symptom management and avoidance of hypoglycemia, with targets of HbA1c up to 8.5% acceptable. Deprescribing — the systematic reduction or discontinuation of medications whose risks outweigh their benefits in the context of the individual’s current health, prognosis, and goals — is particularly important in older adults with type 2 diabetes: reducing or stopping insulin or sulfonylureas in frail patients is often clinically appropriate and requires careful nutrition coordination, as changes in glycemic medications may require corresponding adjustments in carbohydrate timing and distribution.
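The stratified targets above amount to a simple lookup keyed on functional status. A minimal sketch (the category labels and dictionary structure are our own simplification of the guideline tiers quoted in the text):

```python
# Illustrative mapping of functional-status categories to the HbA1c targets
# quoted in the text. Category names are our own, not guideline terminology.

HBA1C_TARGETS = {
    "functionally_independent": "<= 7.0%",
    "functionally_dependent":   "7.1-8.0%",
    "very_frail_end_of_life":   "up to 8.5% (symptom control; avoid hypoglycemia)",
}

def hba1c_target(category: str) -> str:
    """Return the HbA1c target string for a functional-status category."""
    return HBA1C_TARGETS[category]

print(hba1c_target("functionally_dependent"))  # 7.1-8.0%
```

The point of the stratification is that the "right" target is a property of the person, not the disease: the same HbA1c of 7.8% represents under-treatment in one patient and appropriately relaxed control in another.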

Meal Timing, Carbohydrate Distribution, and Exercise Timing in Diabetic Older Adults

The timing and distribution of carbohydrate intake across the day is a practical strategy for postprandial glycemic management that does not require medication change. Distributing carbohydrate evenly across three meals (rather than consuming a large proportion at one or two meals) reduces the magnitude of individual postprandial glucose excursions. The plate method — a simplified dietary counseling approach in which half the plate is filled with non-starchy vegetables, one quarter with lean protein, and one quarter with carbohydrate-rich foods — provides a practical visual guide for carbohydrate portioning without requiring formal carbohydrate counting, and is particularly well-suited for older adults or those with cognitive limitations who may find detailed carbohydrate counting burdensome.

The optimal timing of physical activity relative to meals is an emerging consideration in diabetes management for older adults. Aerobic exercise reduces blood glucose acutely through increased glucose uptake by contracting muscle via GLUT4 translocation (an insulin-independent mechanism), and this effect is greatest in the 30–120 minutes following a carbohydrate-containing meal — making post-meal walking a particularly effective strategy for blunting postprandial glucose excursions. A 2016 study (Reynolds et al.) demonstrated that 10-minute walks taken after each meal reduced 24-hour postprandial glucose more effectively than a single 30-minute walk, suggesting that brief, frequent post-meal physical activity is an underutilized strategy that is achievable even for older adults with reduced exercise capacity. For those on insulin or insulin secretagogues, however, exercise timing must be coordinated with insulin dosing and meal timing to avoid exercise-induced hypoglycemia.


Chapter 17: GI Conditions — Deep Dive

GERD, PPIs, and Long-Term Nutritional Consequences

Gastroesophageal reflux disease affects a substantial proportion of older adults and is among the most common indications for pharmacological therapy with proton pump inhibitors (PPIs) in this population. While PPIs effectively reduce gastric acid and provide symptomatic relief from GERD and treatment for peptic ulcer disease, their long-term use — often continued for years or indefinitely beyond the initially recommended 4–8 week courses — is associated with a growing list of nutritional and metabolic complications that are particularly concerning in older adults already at risk for the underlying deficiencies.

The long-term nutritional consequences of PPI use include: vitamin B12 deficiency (reduced gastric acid impairs release of protein-bound B12 from foods; estimates suggest that approximately 4% of long-term PPI users develop clinically significant B12 deficiency, with higher rates in older adults), magnesium deficiency (PPIs impair TRPM6-mediated active magnesium absorption in the small intestine; hypomagnesemia from PPI use can cause tetany, seizures, and cardiac arrhythmias, and does not respond to oral magnesium supplementation until the PPI is discontinued), calcium malabsorption (calcium carbonate requires acid dissolution; long-term PPI use is associated with a modest increase in hip fracture risk, likely through impaired calcium absorption from carbonate supplements and carbonate-containing foods, as well as through hypergastrinemia-mediated effects on parathyroid hormone), and iron deficiency (gastric acid promotes conversion of dietary ferric iron to the more absorbable ferrous form, and is required for release of iron from food matrices; long-term PPI use is associated with reduced iron absorption particularly from non-heme plant sources). The clinical implications for practice include: using the lowest effective PPI dose for the shortest duration; periodically attempting PPI deprescribing (with appropriate stepwise tapering to avoid rebound hyperacidity) in stable patients; monitoring serum B12, magnesium, and calcium in long-term PPI users; and choosing calcium citrate over carbonate supplements in those requiring calcium supplementation while on PPIs.

IBS, the Low-FODMAP Diet, and Diverticular Disease

Irritable bowel syndrome (IBS) — characterized by recurrent abdominal pain related to defecation and associated with changes in stool frequency or form — affects approximately 10–15% of the general population and remains prevalent in older adults, though it may be underdiagnosed because symptoms are commonly attributed to other GI conditions or to “normal” aging. The low-FODMAP diet — developed at Monash University, Australia — is the most evidence-based dietary intervention for IBS symptom management, with approximately 50–75% of IBS patients experiencing clinically meaningful symptom reduction.

FODMAPs (Fermentable Oligosaccharides, Disaccharides, Monosaccharides, And Polyols) are short-chain carbohydrates that are poorly absorbed in the small intestine, rapidly fermented by colonic bacteria, and osmotically active. The major categories are: oligosaccharides (fructans in wheat, onions, garlic; galacto-oligosaccharides in legumes), disaccharides (lactose in dairy), monosaccharides (excess fructose in honey, apples, high-fructose corn syrup), and polyols (sorbitol and mannitol in stone fruits, mushrooms, and many sugar-free products). In IBS, the rapid fermentation of unabsorbed FODMAPs produces gas (causing bloating and distension) and osmotically draws water into the colonic lumen (causing diarrhea), triggering pain through visceral hypersensitivity — the heightened pain response to normal intestinal stimuli that characterizes IBS.

The low-FODMAP diet is implemented in three phases. The elimination phase (2–6 weeks) involves strict restriction of all high-FODMAP foods. The reintroduction phase systematically reintroduces individual FODMAP subgroups one at a time to identify which specific FODMAPs trigger symptoms in the individual — because FODMAP sensitivity is variable between individuals, and many people tolerate some subgroups and not others. The personalization phase develops a long-term sustainable eating pattern that restricts only the individual’s identified trigger FODMAPs. A concern in using the low-FODMAP diet in older adults is its restriction of many prebiotic foods (legumes, whole wheat, onions, garlic) that support beneficial gut bacteria; long-term adherence to a strict elimination diet without individualized reintroduction may negatively affect gut microbiome diversity and increase nutritional inadequacy. Working with a registered dietitian trained in the low-FODMAP protocol is strongly recommended.

Diverticular disease — characterized by the formation of diverticula (small pouches) in the wall of the colon, particularly the sigmoid colon — becomes nearly universal with advancing age: approximately 50% of North Americans over 60 and 65% over 80 have diverticulosis. The long-held view that seeds and nuts precipitate diverticulitis (inflammation of diverticula) has been refuted by prospective data; current evidence supports a high-fiber diet for prevention of diverticulitis, not restriction. Acute diverticulitis — characterized by left lower quadrant abdominal pain, fever, and elevated inflammatory markers — is treated with bowel rest and antibiotics, with dietary modification transitioning from clear fluids back to a regular high-fiber diet as inflammation resolves. Complicated diverticulitis requiring surgery may necessitate temporary colostomy, with associated nutritional management considerations as described in the chapter on GI conditions.


Chapter 18: Cognitive Health and Nutrition

The MIND Diet: Mechanism and Evidence

Nutrition’s influence on cognitive aging and dementia risk has emerged as a major focus of geriatric nutrition research over the past two decades, driven by the observation that the same dietary patterns associated with cardiovascular protection — the Mediterranean diet, the DASH diet — are also associated with reduced cognitive decline and dementia incidence. The MIND (Mediterranean-DASH Intervention for Neurodegenerative Delay) diet, published by Morris et al. in 2015, synthesizes the elements from both parent diets that have the strongest neurological evidence and organizes them into 15 dietary components (10 “brain-healthy” and 5 “unhealthy”).

The brain-healthy MIND diet components, with evidence for specific neuroprotective mechanisms, include: green leafy vegetables (at least 6 servings per week) — providing folate (required for methylation reactions critical to neuronal function and epigenetic regulation), lutein and zeaxanthin (carotenoids concentrated in brain tissue and associated with reduced cognitive decline), vitamin K (emerging evidence for roles in neuronal lipid synthesis via carboxylation of Gas6 and Protein S, which regulate synaptic function), and flavonoids; berries (at least 2 servings per week) — uniquely in the MIND diet, berries specifically rather than fruit generally are recommended, reflecting the concentrated flavonoid content (particularly anthocyanins in blueberries and strawberries) that crosses the blood-brain barrier, reduces neuroinflammation, and promotes neuronal signaling through BDNF (brain-derived neurotrophic factor) upregulation; fish (at least 1 serving per week) — providing EPA and DHA, the omega-3 fatty acids concentrated in neuronal membranes that reduce neuroinflammation via resolvins and protectins and support synaptic membrane fluidity; nuts — providing vitamin E (alpha-tocopherol, the primary fat-soluble antioxidant in the brain), monounsaturated fats, and magnesium; and olive oil as the primary cooking fat, providing oleocanthal (an ibuprofen-like anti-inflammatory compound) and oleic acid.

The observational evidence for the MIND diet’s association with cognitive aging is compelling but still primarily epidemiological. The Morris et al. cohort study found that high MIND diet adherence was associated with cognitive function equivalent to being 7.5 years younger compared to low adherence, and with 54% lower odds of Alzheimer’s disease in the highest versus lowest tertile of adherence. Moderate adherence was associated with 35% reduced odds — a finding interpreted to mean that even partial adherence to the MIND diet may be beneficial. The MIND-LEAP trial (2021) and ongoing large randomized trials are attempting to establish causality through dietary intervention. While results of randomized trials are awaited, the epidemiological evidence, mechanistic plausibility, and overall health benefits of the MIND diet components for cardiovascular and metabolic health provide a strong rationale for recommending the MIND diet to older adults concerned about cognitive aging.

Omega-3 Fatty Acids, B Vitamins, and Vitamin E in Dementia Prevention

Despite the mechanistic plausibility of omega-3 fatty acid supplementation for dementia prevention — DHA is the dominant fatty acid in neuronal membranes, and its concentration in specific brain regions including the hippocampus correlates with cognitive function — randomized controlled trials of omega-3 supplementation for prevention of cognitive decline and dementia have produced inconsistent results. The AREDS2 trial (targeting age-related macular degeneration but including cognitive outcomes) and several dedicated RCTs found no significant effect of omega-3 supplementation on cognitive outcomes in cognitively normal older adults. However, subgroup analyses consistently suggest benefit in those with lower baseline omega-3 status — reinforcing the hypothesis that supplementation corrects deficiency rather than providing additional benefit to those already replete. For the majority of older adults who are not consuming fish regularly (the primary dietary source of EPA and DHA), achieving adequate omega-3 status through either regular fish consumption (two or more servings of fatty fish per week) or supplementation (approximately 1–2 g EPA+DHA per day) is a reasonable recommendation for overall brain health, even if the dementia prevention evidence from RCTs is not yet definitive.

Vitamin E illustrates the importance of distinguishing dietary versus supplemental nutrient effects in cognitive health research. Prospective cohort studies consistently find that high dietary vitamin E intake — from nuts, seeds, wheat germ, vegetable oils, and green leafy vegetables — is associated with lower rates of cognitive decline. However, large randomized trials of supplemental alpha-tocopherol (vitamin E supplements, 400–2000 IU/day) have not demonstrated cognitive benefit and, at high doses, have raised safety concerns (all-cause mortality increase at doses above 400 IU/day, and increased hemorrhagic stroke risk). The discrepancy between dietary and supplemental vitamin E effects likely reflects the absence, in vitamin E supplements, of the gamma-tocopherol, tocotrienols, and synergistic polyphenols present in natural vitamin E-rich foods — a broader illustration of the principle that whole-food dietary patterns typically outperform individual nutrient supplements in demonstrating health benefits in clinical trials.


Chapter 19: Dysphagia and Modified Texture Diets — Full IDDSI Framework

The IDDSI Framework: All Eight Levels

The International Dysphagia Diet Standardisation Initiative (IDDSI) framework, published in 2017 and adopted as the global standard for dysphagia diet terminology, provides a continuous framework of eight levels (0–7) for both drinks (levels 0–4) and foods (levels 3–7), with levels 3 and 4 representing the overlap where both foods and drinks can exist. Each level has a standardized descriptive name, defined textural/viscosity characteristics, and validated test methods for verification.

Level 0 — Thin: Normal water viscosity. In the IDDSI flow test (a 10 mL slip-tip syringe allowed to flow for 10 seconds), less than 1 mL remains in the syringe. Suitable for individuals with normal swallowing function or those whose dysphagia does not involve aspiration of thin liquids. Examples: water, clear juice, tea, coffee, milk (unthickened). Individuals with pharyngeal dysphagia and/or aspiration of thin liquids require thickened liquids at Level 1, 2, or 3.

Level 1 — Slightly Thick: Slightly thicker than water. Takes more effort to drink than thin liquids. Leaves 1–4 mL in the syringe after the 10-second IDDSI flow test. Used for individuals who benefit from marginally slowed flow. Examples: some commercial nectars, some thickened beverages.

Level 2 — Mildly Thick: Flows off a spoon, but more slowly than thin liquid. Leaves 4–8 mL in the syringe after the 10-second flow test. Allows some control over flow speed. Examples: some fortified milkshakes, commercial thickened beverages.

Level 3 — Moderately Thick / Liquidised: Can be poured from a spoon but holds a thin layer on the spoon surface; more than 8 mL remains in the syringe after the 10-second flow test. Foods at this level are smooth, without lumps, and easily poured. Requires minimal oral processing. Examples: smooth pureed foods (smooth pureed soup, smooth yogurt drink); can be drunk from a cup or eaten with a spoon.

Level 4 — Extremely Thick / Pureed: Cannot be poured; retains shape on a plate; falls slowly off a spoon. Requires almost no chewing. Can be mashed with tongue. No lumps. Tested with the fork drip test (should not fall through the tines of a fork). Examples: smooth pureed mashed potato consistency, smooth pureed fish or meat.

Level 5 — Minced and Moist: Contains soft, moist small pieces (less than 4 mm for adults, 2 mm for children). Does not require biting but requires some tongue control to manipulate. Can be mashed with tongue alone. Passes through a fork with some effort (particle size criterion). Examples: minced moist meat with gravy, soft moist rice.

Level 6 — Soft and Bite-Sized: Soft, tender foods that can be broken apart with tongue and palate pressure alone (no need for teeth). Pieces should be no larger than 15 mm × 15 mm for adults. Fork-pressable — a piece should squash completely flat under normal fork pressure. Examples: soft ripe banana, soft cooked vegetables, tender fish, scrambled eggs.

Level 7 — Regular: Normal, everyday foods of any texture and piece size. No modifications required. Examples: any food not modified for dysphagia. The designation “Regular” confirms that no texture restriction applies.
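Because the drink levels are defined by a single measurement — the volume remaining in a 10 mL syringe after 10 seconds of flow — classification can be sketched as a threshold function. This is an illustrative sketch of the published IDDSI flow-test thresholds (the function name is our own, and bedside classification should always follow the official IDDSI testing documents):

```python
# Hedged sketch of IDDSI flow-test classification for drink levels 0-3:
# fill a 10 mL slip-tip syringe, let it flow for 10 seconds, and classify
# by the residual volume. Thresholds per the published IDDSI test methods.

def iddsi_drink_level(residue_ml: float) -> int:
    """Map 10-second syringe residue (mL) to IDDSI drink level 0-3."""
    if residue_ml < 1:
        return 0  # Thin
    if residue_ml <= 4:
        return 1  # Slightly Thick
    if residue_ml <= 8:
        return 2  # Mildly Thick
    return 3      # Moderately Thick (Level 4+ uses fork/spoon tests, not flow)

print([iddsi_drink_level(v) for v in (0.5, 3, 6, 9)])  # [0, 1, 2, 3]
```

Level 4 and above cannot be distinguished by the flow test — extremely thick fluids simply do not flow — which is why the framework switches to the fork drip, spoon tilt, and fork pressure tests for the food levels.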

Aspiration, in the context of swallowing, refers to the entry of material (food, liquid, saliva, or refluxed stomach contents) into the larynx below the level of the true vocal cords, potentially entering the trachea and lungs. Silent aspiration occurs when material enters the airway without producing a cough or other obvious clinical sign — estimated to occur in 40% of individuals who aspirate — and is particularly dangerous because it may go unrecognized until aspiration pneumonia develops. Silent aspiration is common in individuals with reduced laryngeal sensation (which declines with aging), those with neurological disease (stroke, Parkinson's, dementia), and those on medications that suppress cough reflexes.

The IDDSI framework mandates standardized testing methods for food levels: the fork drip test verifies that a Level 3 food drips slowly in dollops through fork tines; the spoon tilt test verifies that Level 4 food holds its shape on a spoon and slides off in a cohesive blob when the spoon is tilted, leaving minimal residue; the fork pressure test verifies that a Level 6 food squashes completely under fork pressure applied with the thumb until the thumbnail blanches; and the fork and finger tests confirm the absence of oversized particles and lumps in minced and pureed foods. These objective tests allow food service staff, nurses, and caregivers (not only SLPs) to verify that food preparation meets the prescribed IDDSI level — a critically important quality assurance function in institutional settings where dysphagia diets are prepared in large quantities.

The nutritional challenge of texture-modified diets is significant: Level 4 pureed diets typically contain 20–30% fewer calories per serving than regular meals (because the pureeing process adds liquid and reduces energy density), and the visual monotony and reduced flavor of pureed foods reduce appetite and food intake. Strategies to mitigate this include food fortification (adding protein powder, oil, or other energy-dense ingredients to pureed foods), using molds to present pureed foods in shapes resembling their original form (improving visual appeal), and ensuring adequate seasoning to maximize flavor intensity.


Chapter 20: Long-Term Care and Institutional Nutrition in Detail

Nutritional Standards, Food Fortification, and Mealtime Assistance

The nutritional care of LTC residents is governed in Ontario by the Long-Term Care Homes Act (2007) and its regulations, which specify minimum standards for meal service including the number of meals and snacks per day (three meals plus at least two snacks), minimum caloric targets, provision of meals appropriate for residents’ dietary restrictions and preferences, and requirements for registered dietitian oversight of nutritional care. Despite these regulatory requirements, malnutrition rates in Canadian LTC facilities remain high, reflecting the complex interaction of resident-level factors (severe frailty, advanced dementia, dysphagia, end-of-life decline) and system-level factors (insufficient staffing for mealtime assistance, lack of individualized care, institutional meal timing that may not align with residents’ preferences, and food preparation that prioritizes ease and safety over palatability and sensory appeal).

Food fortification — adding energy and protein to regular foods during preparation — is a practical strategy for increasing nutrient density without requiring residents to eat larger volumes. Common fortification strategies include: adding full-fat cream or butter to pureed vegetables and soups; stirring skim milk powder (providing approximately 30 g protein per 100 g) into mashed potatoes, oatmeal, or smoothies; adding grated cheese to scrambled eggs or pasta; incorporating commercial protein powders into beverages; and drizzling olive oil over dishes. Studies in LTC settings have demonstrated that fortified meals can increase daily energy intake by 200–400 kcal without increasing meal volumes, producing modest but clinically meaningful improvements in weight maintenance and muscle mass in malnourished residents. Fortification programs require careful implementation — kitchen staff training, standardized recipes with fortification specified, and regular monitoring of nutritional outcomes — to be consistently effective.
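The fortification arithmetic can be illustrated with a small calculation. The per-ingredient figures below are approximations for the example (skim milk powder at roughly 30 g protein and 360 kcal per 100 g, butter at roughly 7.2 kcal/g); actual values should come from a nutrient database such as the Canadian Nutrient File, and the function name is our own:

```python
# Illustrative fortification arithmetic. Ingredient values are approximate
# assumptions for the example, not authoritative nutrient data.

def fortified_totals(base_kcal: float, base_protein_g: float,
                     milk_powder_g: float = 0.0, butter_g: float = 0.0):
    """Add skim milk powder (~3.6 kcal/g, ~30% protein) and butter
    (~7.2 kcal/g, negligible protein) to a meal's energy and protein totals."""
    kcal = base_kcal + milk_powder_g * 3.6 + butter_g * 7.2
    protein = base_protein_g + milk_powder_g * 0.30
    return round(kcal), round(protein, 1)

# Fortifying a bowl of oatmeal (250 kcal, 8 g protein)
# with 25 g milk powder and 10 g butter:
print(fortified_totals(250, 8, milk_powder_g=25, butter_g=10))  # (412, 15.5)
```

In this example, two small additions raise the meal's energy by about 65% and nearly double its protein with almost no change in volume — exactly the property that makes fortification valuable for residents with small appetites.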

Mealtime assistance is another powerful but resource-intensive intervention for improving nutritional intake in residents who cannot self-feed adequately. Observational studies document that residents with dementia left to eat independently at communal meals may consume as little as 30–50% of the food placed before them, due to distraction, inability to initiate eating, difficulty with utensils, and forgetting that they have food in front of them. Direct mealtime assistance — sitting at the resident’s level, making eye contact, cueing verbally and physically, offering food consistently, and allowing adequate time — dramatically increases intake but requires trained staff with adequate time. Feeding assistance volunteers, trained family caregivers, and enhanced staffing ratios at meal times have been evaluated in research settings and consistently improve nutritional outcomes; their widespread implementation in LTC is an ongoing challenge given staffing cost constraints. Environmental modifications — adequate lighting, reduced noise and distraction, pleasant dining room settings, social dining rather than in-room meals when possible — also meaningfully increase food intake and enjoyment for cognitively impaired residents.

Oral Health and Its Impact on Nutritional Intake in Older Adults

Oral health — encompassing the condition of natural teeth, dentures, gums, and oral mucosal tissues — is a frequently overlooked but highly significant determinant of nutritional intake and dietary quality in older adults. Edentulism (complete tooth loss) affects approximately 20% of Canadians over 65, with rates much higher in older age cohorts and in lower-income groups. Partial edentulism (loss of some teeth) and poor denture fit (dentures that no longer fit well due to progressive jaw bone resorption following tooth extraction) are even more prevalent. The functional consequences of tooth loss and poor denture fit for nutrition are direct and measurable: individuals with poor oral status avoid hard, crunchy, and chewy foods — raw vegetables and fruits, nuts, lean meats, whole grain breads — in favor of soft, easily managed foods that are frequently lower in fiber, higher in simple carbohydrates, and less nutrient-dense overall. Multiple studies have documented that edentulous older adults have significantly lower intakes of dietary fiber, vitamin C, folate, and beta-carotene and higher intakes of added sugars and saturated fat compared to dentate older adults.

Xerostomia (subjective sensation of dry mouth) and hyposalivation (objective reduction in salivary flow, defined as less than 0.1 mL/min unstimulated salivary flow) are extremely common in older adults — affecting 30–40% of those over 65 — and significantly impair oral nutritional function. Saliva serves multiple functions in the oral phase of swallowing: it lubricates food to form a bolus, initiates starch digestion via salivary amylase, maintains oral mucosal health, and buffers oral pH to protect teeth from acid erosion. Reduced salivary flow impairs all of these functions, causing difficulty chewing and swallowing dry or crumbly foods, altered taste perception, increased dental caries and oral mucosal infections (particularly oral candidiasis), impaired denture retention, and reduced food enjoyment. The most common cause of hyposalivation in older adults is polypharmacy: over 400 medications have anticholinergic side effects that reduce salivary flow, including antihistamines, antidepressants (particularly tricyclics and paroxetine), antipsychotics, antihypertensives, and bladder antimuscarinics. The nutritional management of xerostomia includes consuming moist, soft foods; using sauces and gravies to lubricate dry foods; sipping water with meals; using sugar-free candies or gum to stimulate salivary flow (if any salivary gland function remains); and working with the medical team to identify and substitute medications with anticholinergic burden where possible.


Chapter 21: Bone Health — Osteoporosis and Nutritional Management

The skeleton is not a static mineral reservoir but a metabolically active tissue continuously remodelled by the opposing activities of osteoblasts (bone-forming cells) and osteoclasts (bone-resorbing cells). In healthy young adults, bone resorption and formation are tightly coupled through the RANK-RANKL-OPG signaling system: RANKL (receptor activator of nuclear factor kappa-B ligand), produced by osteoblasts and stromal cells in response to PTH, vitamin D, IL-6, and other signals, binds RANK on osteoclast precursors and stimulates their differentiation and activation. OPG (osteoprotegerin), also produced by osteoblasts, is a decoy receptor that competitively inhibits RANKL, thus reducing osteoclast activity. In young adulthood, when peak bone mass has been achieved, remodelling is in approximate balance. With aging — particularly after menopause in women and progressively from age 50 onward in men — the RANKL/OPG ratio shifts in favor of resorption: estrogen normally promotes OPG production and suppresses RANKL, so its loss at menopause disinhibits osteoclastogenesis and accelerates bone resorption. Additionally, rising PTH levels (secondary to vitamin D deficiency and declining calcium absorption) stimulate osteoclast activity and bone resorption to maintain serum calcium.

Nutritional determinants of peak bone mass — achieved in the late second and early third decade — are primarily calcium and vitamin D intake during childhood and adolescence, as well as protein (which provides the amino acids for collagen synthesis) and physical loading. Because peak bone mass is the “bone bank” from which age-related losses are drawn, optimizing it through adequate childhood and adolescent nutrition is arguably the most important long-term intervention for osteoporosis prevention. However, for adults who have already reached their maximum bone mass, slowing the rate of age-related bone loss — through optimizing calcium and vitamin D intake, maintaining weight-bearing physical activity, avoiding smoking and excessive alcohol, and pharmacological intervention when indicated — becomes the focus of management.

Vitamin D3 (cholecalciferol) is synthesized in the epidermis when 7-dehydrocholesterol is converted to pre-vitamin D3 by UVB radiation (wavelength 290–315 nm), then undergoes thermal isomerization to D3. This cutaneous synthesis is impaired in older adults by: (1) reduced 7-dehydrocholesterol content of the aging dermis (approximately 75% lower in adults over 70 than in young adults); (2) reduced efficiency of thermal isomerization; (3) reduced sun exposure from decreased outdoor time, use of sun-protective clothing and sunscreen, and institutionalization; and (4) darker skin pigmentation (melanin competes with 7-dehydrocholesterol for UVB photons). Together, these factors mean that older adults in northern latitudes like Waterloo, Ontario — where UVB radiation is insufficient for cutaneous synthesis from October through April regardless of sun exposure — require dietary or supplemental vitamin D to maintain adequate serum 25(OH)D year-round.

Alendronate (a bisphosphonate), the most commonly prescribed pharmacological treatment for osteoporosis, works by inhibiting osteoclast activity through farnesyl diphosphate synthase inhibition in the mevalonate pathway, reducing bone resorption. Bisphosphonates have an unusual interaction with dietary calcium: they must be taken on an empty stomach (30–60 minutes before any food, drink other than water, or other medications) because their absorption is dramatically reduced by calcium, other minerals, and food in general. Additionally, esophageal safety requires that patients remain upright for 30 minutes after taking the medication and that they not crush the tablets. These practical requirements complicate medication adherence in older adults with cognitive impairment or functional limitations. Nutritional counseling for patients on bisphosphonates therefore has two components: ensuring adequate dietary calcium and vitamin D to supply the substrate for bone formation, and reminding patients of the timing requirements for medication administration.


Chapter 22: Pressure Injuries, Wound Healing, and Nutritional Support

Arginine, Micronutrients, and Enteral Nutrition in Wound Care

The nutritional management of pressure injuries extends beyond the general principles of adequate protein and energy intake discussed in Chapter 9. Arginine — an amino acid classified as conditionally essential in stress states — warrants particular attention in wound healing. Arginine is the substrate for both nitric oxide (NO) synthesis (via nitric oxide synthase) and polyamine synthesis (via arginase/ornithine decarboxylase). NO is required for angiogenesis (new capillary growth into the wound bed), macrophage bactericidal activity, and smooth muscle vasodilation that perfuses healing tissue. Polyamines (putrescine, spermidine, spermine) are required for cell proliferation — particularly fibroblast and keratinocyte proliferation during re-epithelialization. Arginine availability may become rate-limiting for wound healing in severe pressure injuries, and oral arginine supplementation (4.5–18 g/day) or specialized enteral formulas enriched in arginine (Juven, Arginaid) have been shown in several RCTs to accelerate healing of Stage III-IV pressure injuries.

The choice between oral nutritional supplementation and enteral nutrition (tube feeding) in malnourished older adults with pressure injuries should be guided by the individual’s ability to eat and swallow safely and adequately, their goals of care, and the risk-benefit balance of tube placement. For individuals who can eat but are not achieving adequate intake orally, oral nutritional supplements (ONS) — commercially prepared liquid formulas (e.g., Ensure, Boost, Fortisip) providing 200–400 kcal and 10–20 g protein per 200 mL serving — represent a practical first-line intervention. ONS have been shown in meta-analyses to reduce malnutrition rates, reduce mortality, and improve functional outcomes in malnourished hospitalized older adults. For those who cannot eat safely or adequately — due to severe dysphagia, coma, severe anorexia, or extreme functional limitations — enteral nutrition (EN) via nasogastric (NG), nasojejunal (NJ), or gastrostomy (PEG, percutaneous endoscopic gastrostomy) tube may be considered. However, enteral tube feeding in older adults with advanced dementia is not recommended by major clinical guidelines (including those of the Canadian Geriatrics Society), because evidence consistently shows that tube feeding in this context does not prevent aspiration pneumonia (as oropharyngeal secretions continue to be aspirated regardless of nutrition route), does not improve quality of life, does not reduce pressure injury risk (which is driven primarily by immobility, shear, and moisture rather than nutrition alone), and may increase distress and discomfort in persons who are near end of life.
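The dose arithmetic behind an ONS prescription can be sketched in a few lines. This is an illustrative calculation only — the function name, the 18 g-per-serving figure, and the patient values are hypothetical, chosen from within the 10–20 g protein per serving range quoted above.

```python
import math

# Sketch: how many whole ONS servings are needed to close an estimated
# daily protein gap. Per-serving protein reflects the 10-20 g range quoted
# above; all patient values are hypothetical.
def ons_servings_needed(weight_kg, target_g_per_kg, current_protein_g,
                        protein_per_serving_g):
    """Whole servings of an oral nutritional supplement required to reach
    the daily protein target, given current dietary protein intake."""
    target_g = weight_kg * target_g_per_kg
    gap_g = max(0.0, target_g - current_protein_g)
    return math.ceil(gap_g / protein_per_serving_g)

# Example: 70 kg patient with a pressure injury (target 1.2 g/kg/day),
# currently eating ~50 g protein/day; supplement provides 18 g/serving.
servings = ons_servings_needed(70, 1.2, 50, 18)
print(servings)  # target 84 g, gap 34 g -> 2 servings
```

The same pattern extends to energy: the clinician computes the gap between estimated requirement and observed intake, then rounds up to a practical number of servings spread between meals.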


Chapter 23: Synthesis — Integrating Nutritional Care Across the Continuum

The Interprofessional Team in Geriatric Nutritional Care

Optimal nutritional care for older adults across the continuum of care requires a coordinated interprofessional approach. The registered dietitian (RD) is the primary clinical expert in nutritional assessment, diagnosis, and intervention, and their involvement is the strongest single predictor of nutritional care quality in any setting. However, the RD cannot accomplish comprehensive nutritional care alone: physicians and nurse practitioners diagnose and manage the medical conditions and medications that drive nutritional risk; nurses and personal support workers implement feeding assistance strategies, monitor intake, and recognize early signs of nutritional decline; speech-language pathologists assess and manage dysphagia; occupational therapists assess and address functional barriers to eating (grip strength, range of motion, adaptive equipment); pharmacists identify drug-nutrient interactions and recommend appropriate drug formulations for those with swallowing difficulties; social workers and case managers address social determinants of food insecurity; and family members and caregivers provide meal preparation, shopping, and mealtime support in home settings.

Kinesiology professionals working with older adults contribute to nutritional outcomes through several specific mechanisms. Exercise prescription for sarcopenia prevention must be coordinated with protein timing and dietary recommendations to maximize the synergistic anabolic effect of resistance exercise plus dietary protein. Fitness assessment including grip strength, gait speed, and the Short Physical Performance Battery (SPPB) — measures used in the EWGSOP2 diagnostic criteria for sarcopenia — are within the kinesiology scope of practice and provide data directly relevant to nutritional risk stratification. Falls prevention programs that integrate exercise and nutritional interventions (vitamin D supplementation, protein adequacy, hydration) produce better outcomes than either component alone. Physical activity counseling that accounts for the energy expenditure implications for nutritional requirements — recognizing that increasing physical activity in a malnourished older adult may worsen energy balance if food intake is not simultaneously increased — requires nutritional literacy. The KIN 342 course equips kinesiology students with the nutritional assessment skills and condition-specific knowledge necessary to contribute meaningfully to interprofessional nutritional care for older adults in all practice settings.

Summary of Key DRI Values for Older Adults (71+ Years)

The following summarizes the key nutrient reference values for adults 71 and older according to Health Canada’s Dietary Reference Intakes:

Energy: Approximately 1,800–2,200 kcal/day for women and 2,000–2,600 kcal/day for men depending on activity level (estimated using predictive equations adjusted for physical activity).

Protein: RDA 0.80 g/kg/day per official DRI; clinical guidelines for healthy older adults recommend 1.0–1.2 g/kg/day; for those with sarcopenia or acute illness, 1.2–1.5 g/kg/day.

Carbohydrate: RDA 130 g/day (acceptable macronutrient distribution range 45–65% of total energy); fiber adequate intake 21 g/day for women 51+, 30 g/day for men 51+.

Fat: Acceptable macronutrient distribution range 20–35% of total energy; omega-3 adequate intake (as alpha-linolenic acid): 1.6 g/day for men, 1.1 g/day for women. No DRI has been established for EPA+DHA, though most clinical guidelines for cardiovascular and brain health recommend 1–2 g EPA+DHA/day.

Vitamin D: RDA 800 IU (20 mcg)/day for adults 71+; many clinical guidelines recommend 1,000–2,000 IU/day given the prevalence of deficiency. Tolerable upper intake level: 4,000 IU/day.

Calcium: RDA 1,200 mg/day for women 51+ and men 71+; 1,000 mg/day for men 51–70. Tolerable upper intake level: 2,000 mg/day.

Vitamin B12: RDA 2.4 mcg/day; older adults are advised to meet most of their requirement from crystalline B12 (fortified foods or supplements) rather than food-bound B12.

Folate: RDA 400 mcg DFE (dietary folate equivalents)/day. Tolerable upper intake level for synthetic folic acid: 1,000 mcg/day.

Vitamin B6: RDA 1.7 mg/day for men 51+; 1.5 mg/day for women 51+. Tolerable upper intake level: 100 mg/day.

Iron: RDA 8 mg/day for both men and women over 50 (requirements decrease for women post-menopause).

Zinc: RDA 11 mg/day for men 51+; 8 mg/day for women 51+. Tolerable upper intake level: 40 mg/day.

Magnesium: RDA 420 mg/day for men 51+; 320 mg/day for women 51+.

Vitamin C: RDA 90 mg/day for men; 75 mg/day for women (additional 35 mg/day recommended for smokers). Tolerable upper intake level: 2,000 mg/day.

Vitamin A: RDA 900 mcg RAE/day for men; 700 mcg RAE/day for women. Tolerable upper intake level: 3,000 mcg preformed vitamin A/day (caution with supplements, as excess preformed vitamin A increases fracture risk).

Potassium: Adequate intake 4,700 mg/day; important for blood pressure regulation; most older adults are below this target.

Sodium: Adequate intake 1,500 mg/day for adults 51+; chronic disease risk reduction intake is below 2,300 mg/day; most older Canadians consume 3,000–3,500 mg/day, substantially above recommendations.

Understanding these reference values — not as rigid prescriptive targets but as evidence-based benchmarks against which individual assessment findings are interpreted — is central to competent nutritional practice with older adults. The DRIs represent population-level averages that must be individualized based on disease status, medications, functional capacity, food preferences, and goals of care. The skilled clinician — whether registered dietitian, physician, nurse, or kinesiology professional — uses DRI values as a starting framework and modifies recommendations based on the complete picture of the individual older adult’s needs, resources, and values. This integration of science and person-centred practice is the ultimate goal of KIN 342.
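The interpretive logic described above — comparing an individual's reported intake against DRI benchmarks rather than treating the DRIs as prescriptions — can be sketched as a simple screening comparison. All variable names and the sample 24-hour recall below are hypothetical; the benchmark values are taken from the summary above (women 71+).

```python
# Illustrative sketch of DRI-based intake screening. Not a clinical tool:
# the reported intake is a hypothetical 24-h recall, and real assessment
# would individualize for disease status, medications, and goals of care.

def protein_target_g(weight_kg, g_per_kg=1.0):
    """Weight-based protein target; clinical guidelines for healthy older
    adults suggest 1.0-1.2 g/kg/day (above the 0.80 g/kg/day RDA)."""
    return round(weight_kg * g_per_kg, 1)

# Selected fixed benchmarks for women 71+ (from the DRI summary above)
benchmarks = {
    "calcium_mg": 1200,
    "vitamin_d_iu": 800,
    "vitamin_b12_mcg": 2.4,
    "fiber_g": 21,
}

reported_intake = {  # hypothetical 24-h recall for a 74-year-old woman
    "calcium_mg": 850,
    "vitamin_d_iu": 400,
    "vitamin_b12_mcg": 3.1,
    "fiber_g": 14,
}

weight_kg = 62.0
print(f"Protein target: {protein_target_g(weight_kg)} g/day")
for nutrient, target in benchmarks.items():
    intake = reported_intake[nutrient]
    flag = "below benchmark" if intake < target else "meets benchmark"
    print(f"{nutrient}: {intake} vs {target} -> {flag}")
```

In practice the flagged nutrients become starting points for dietary counseling or supplementation decisions, interpreted alongside the clinical picture rather than acted on mechanically.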


Chapter 24: Special Topics — Alcohol, Medications, and Emerging Evidence

Alcohol Use in Older Adults: Detailed Nutritional and Clinical Considerations

Alcohol occupies a complex position in the nutritional landscape of aging. While moderate alcohol consumption — typically defined as up to one standard drink per day for women and two for men — has been associated in many observational studies with reduced cardiovascular mortality (the so-called “J-curve” relationship between alcohol intake and total mortality), the interpretation of this association has been substantially revised. Mendelian randomization analyses — which use genetic variants as instrumental variables to estimate causal effects of alcohol, bypassing confounding by lifestyle factors — suggest that the apparent cardiovascular benefit of moderate drinking observed in epidemiological studies may largely reflect confounding by healthy user bias (moderate drinkers tend to have many other health-promoting behaviors compared to both abstainers and heavy drinkers) and sick quitter bias (former drinkers who have quit due to illness are classified as non-drinkers, artificially making the abstainer group appear less healthy than moderate drinkers). The World Health Organization, in a position statement updated in 2022, stated that there is no safe level of alcohol consumption when it comes to cancer risk — alcohol is a Group 1 carcinogen (International Agency for Research on Cancer) causally associated with cancers of the oral cavity, pharynx, larynx, esophagus, liver, colorectum, and breast.

For older adults specifically, the pharmacokinetics of alcohol are substantially altered relative to younger individuals, creating heightened vulnerability to adverse effects at any given dose. First, total body water decreases by approximately 10–15% between young adulthood and age 70 (due to loss of muscle mass, which is highly hydrated), while total body fat increases. Since alcohol distributes into body water but not fat, an older adult who consumes the same dose of alcohol as a younger adult of the same weight will achieve a higher peak blood alcohol concentration (BAC) because alcohol is distributed into a smaller volume of body water. Second, gastric alcohol dehydrogenase (ADH) activity — which begins metabolizing alcohol in the stomach wall before it enters the portal circulation — declines with aging (particularly in women), reducing first-pass metabolism and increasing systemic bioavailability of a given oral dose. Third, hepatic ADH and microsomal ethanol-oxidizing system (MEOS/CYP2E1) activities may decline modestly with aging, prolonging the duration of elevated BAC. The net result is that older adults experience the same BAC (and therefore the same degree of psychomotor impairment, cognitive slowing, and balance disruption) at a lower alcohol dose than younger adults — a phenomenon that has led both Canadian and international clinical guidelines to recommend that older adults follow lower alcohol limits than younger adults, with many geriatric specialists recommending no more than 1 standard drink per day and complete abstinence in those with fall risk, cognitive impairment, or relevant medication interactions.
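The volume-of-distribution effect described above can be illustrated with Widmark's classic formula, peak BAC ≈ alcohol dose / (r × body weight), where r is the distribution factor that tracks total body water. The sketch below assumes a textbook r of 0.68 for a young adult man and applies an illustrative ~15% reduction for aging; the numbers are for demonstration, not clinical estimation.

```python
# Widmark sketch: same alcohol dose, same body weight, but a smaller
# water-like distribution space (lower r) yields a higher peak BAC.
# The ~15% reduction in r mirrors the 10-15% decline in total body water
# between young adulthood and age 70 described above. Illustrative only.

def peak_bac_g_per_dl(alcohol_g, weight_kg, r):
    """Estimated peak blood alcohol concentration in g/dL,
    ignoring first-pass metabolism and elimination."""
    grams_per_litre = alcohol_g / (r * weight_kg)  # g ethanol per L of distribution space
    return grams_per_litre / 10                    # convert g/L to g/dL

standard_drink_g = 13.6      # one Canadian standard drink (~13.6 g ethanol)
weight_kg = 70

young_r = 0.68               # textbook Widmark r for a young adult man
older_r = 0.68 * 0.85        # illustrative ~15% smaller distribution space

bac_young = peak_bac_g_per_dl(standard_drink_g, weight_kg, young_r)
bac_older = peak_bac_g_per_dl(standard_drink_g, weight_kg, older_r)
print(f"young: {bac_young:.4f} g/dL, older: {bac_older:.4f} g/dL")
```

The older adult's peak BAC comes out roughly 18% higher on the same dose — before accounting for the reduced first-pass metabolism and slower clearance described above, both of which widen the gap further.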

The nutritional consequences of chronic heavy alcohol use in older adults are extensive and compound existing nutritional vulnerabilities. Thiamine (vitamin B1) deficiency is among the most clinically serious: alcohol directly impairs thiamine absorption, inhibits hepatic thiamine phosphorylation to the active coenzyme form (thiamine pyrophosphate, TPP), and increases renal thiamine excretion. Thiamine deficiency in the context of alcohol use causes Wernicke’s encephalopathy — a neurological emergency characterized by the triad of ophthalmoplegia (eye movement abnormalities), ataxia, and confusion — and if untreated, progresses to Korsakoff’s syndrome, characterized by severe anterograde and retrograde amnesia and confabulation; the resulting inability to form new memories is frequently irreversible. All older adults presenting with confusion and a history of alcohol use should be treated empirically with parenteral thiamine (100 mg IV before glucose is administered, as glucose loads accelerate thiamine depletion) pending assessment. Folate deficiency from chronic alcohol use results from reduced dietary intake (heavy drinkers frequently have poor diets), impaired intestinal folate absorption, reduced hepatic folate storage, and increased renal folate excretion. Alcohol-related folate deficiency contributes to megaloblastic anemia, elevated homocysteine, and — particularly concerning in older adults — increased risk of colorectal cancer (folate is required for normal DNA methylation and repair). Magnesium and zinc deficiencies are common from increased urinary losses driven by alcohol-induced inhibition of tubular reabsorption.

Emerging Nutritional Interventions: Creatine, Beta-Hydroxy-Beta-Methylbutyrate, and Gut Microbiome Modulation

Beyond the established nutritional interventions reviewed throughout these notes, several emerging nutritional strategies for aging are generating substantial research interest and may have clinical applications in the coming decade.

Creatine — a nitrogen-containing compound synthesized endogenously from arginine, glycine, and methionine, and consumed primarily from meat and fish — functions as a rapid energy buffer in skeletal muscle through the phosphocreatine system (creatine phosphate + ADP → creatine + ATP, catalyzed by creatine kinase). Creatine supplementation (3–5 g/day after an initial loading dose of 20 g/day for 5–7 days, or 3 g/day without loading) has been shown in younger adults to increase intramuscular phosphocreatine concentrations, augmenting the capacity for high-intensity exercise and potentiating the hypertrophic response to resistance exercise. In older adults, a series of randomized trials — including a well-designed trial by Candow et al. at the University of Regina — have demonstrated that creatine supplementation combined with resistance exercise produces significantly greater gains in lean body mass, muscle strength, and functional performance than resistance exercise alone. The mechanism in older adults is not limited to energy buffering: creatine may also reduce markers of muscle protein catabolism, increase insulin-like growth factor-1 signaling in muscle, and — intriguingly — reduce levels of myostatin, the TGF-beta family member that inhibits muscle growth. Creatine is generally well-tolerated in older adults, with the primary side effect being a small (1–2 kg) initial increase in body weight from water retention in creatine-loaded muscle (this fluid gain contributes to measured lean mass and may support muscle hydration, but is potentially undesirable in those with heart failure or renal failure).

Beta-hydroxy-beta-methylbutyrate (HMB) — a metabolite of the amino acid leucine, produced in small amounts endogenously from leucine transamination and subsequent oxidation — has been marketed as a muscle-preserving supplement based on its proposed mechanisms of inhibiting proteasomal protein degradation and activating mTORC1. Early trials in older adults produced promising results, and a landmark trial by Stout et al. (2013) in community-dwelling older adults found that HMB supplementation (3 g/day for 12 months) significantly reduced loss of lean body mass. However, more recent meta-analyses have found inconsistent results, with effect sizes smaller than for protein supplementation alone and larger variability across studies. HMB may therefore be most beneficial in conditions of accelerated muscle catabolism (acute illness, prolonged bed rest) rather than as a general muscle-preservation supplement. Because HMB is produced naturally from leucine oxidation, a high-leucine protein diet already generates some endogenous HMB — the additional benefit of HMB supplementation on top of an already high-protein diet may therefore be marginal.

Modulation of the gut microbiome through dietary strategies represents an exciting frontier in aging research. The age-related microbiome changes described earlier — reduced diversity, reduced beneficial fiber-fermenting taxa, increased pro-inflammatory species — are now understood to contribute to inflammaging, metabolic dysfunction, and even cognitive decline through the gut-brain axis. Dietary strategies that favorably modify the aging microbiome include: increasing dietary fiber (particularly prebiotic fibers — inulin, fructooligosaccharides from chicory root, garlic, leeks, asparagus, and bananas; galacto-oligosaccharides from legumes; arabinoxylan from whole wheat — that selectively stimulate growth of beneficial Bifidobacterium and Lactobacillus species and increase SCFA production); consuming probiotic foods (fermented foods including yogurt, kefir, sauerkraut, kimchi, miso, and tempeh containing live beneficial bacteria); and reducing ultra-processed food intake (which is associated with reduced microbiome diversity and increased levels of lipopolysaccharide-producing Gram-negative bacteria that contribute to endotoxemia and systemic inflammation). The clinical evidence for probiotic supplementation in older adults remains heterogeneous, with different probiotic strains producing different effects and the most consistent evidence for specific probiotic strains in reducing the duration and severity of antibiotic-associated diarrhea, preventing Clostridioides difficile infection, and improving specific immune outcomes. The integration of microbiome science into individualized nutritional recommendations for older adults is likely to be a defining feature of geriatric nutrition practice in the coming decade, making familiarity with its foundational concepts an important component of contemporary nutritional training.

Nutritional Considerations in Palliative and End-of-Life Care

The final dimension of nutritional care across the continuum is the care of older adults who are approaching the end of life. In palliative and end-of-life care, the goals of nutritional intervention shift fundamentally from optimization of nutritional status (preventing malnutrition, maintaining muscle mass, managing disease) to comfort and quality of life. As death approaches — whether from advanced dementia, cancer, heart failure, or other terminal conditions — the natural decline in appetite and food intake reflects the body’s progressive withdrawal from metabolic processes rather than a treatable nutritional deficit. Attempting to reverse this decline through aggressive nutritional intervention (tube feeding, parenteral nutrition) in terminally ill older adults has not been shown to prolong life, improve comfort, or enhance quality of life in high-quality clinical research, and may increase suffering through the discomfort of tube placement, restraints to prevent tube removal, aspiration complications, and the distress of forced feeding against the individual’s wishes.

The role of the nutritional care team in palliative settings is to support the patient’s dignity, comfort, and autonomy in relation to food and eating. This means: offering favorite foods in preferred textures and flavors; respecting food refusals without pressure; providing pleasure-focused eating — small amounts of preferred foods for sensory enjoyment even when nutritional adequacy is not achievable; supporting family understanding of the natural process of appetite decline in dying (which families often experience as deeply distressing, interpreting food refusal as starvation that could be prevented if only the patient would eat); and facilitating culturally meaningful food practices in the final days of life. Thirst and dry mouth at end of life can be managed with ice chips, mouth swabs, and small sips of preferred liquids rather than intravenous hydration in most cases, with the goals of comfort rather than physiological hydration maintenance. These nuanced and deeply human aspects of nutritional care at the end of life are among the most important — and least technically mediated — dimensions of geriatric nutrition practice, requiring communication skills, empathy, and an understanding of palliative care goals that complements the scientific knowledge developed throughout this course.
