Associations between moderate alcohol consumption, brain iron, and cognition in UK Biobank participants: Observational and Mendelian randomization analyses


Summary of findings

Alcohol consumption, including at low levels, was observationally associated with higher brain iron in multiple basal ganglia regions. There was some evidence supporting a causal relationship between genetically predicted alcohol consumption and putamen and hippocampus susceptibility, although this did not survive multiple testing correction. Alcohol was associated with higher liver iron, an index of systemic iron load, and genetically predicted alcohol use disorder (AUD) was associated with genetically predicted serum iron markers. Brain iron accumulation in drinkers was only partially mediated via higher systemic iron. Markers of higher brain iron (higher susceptibility) were associated with poorer executive function and fluid intelligence and slower reaction speed.

The accumulation of iron in the brain we observed in moderate drinkers overlaps with findings of an observational study in AUD. Higher putamen and caudate iron levels were described in a small study of males with AUD (n = 20) [9]. These individuals were drinking substantially more than our sample—a mean of 22 standard drinks per day (>37 units daily). In contrast, we observed associations in those drinking just >7 units per week. A recent phenome-wide association study of quantitative susceptibility in the same dataset reported significant associations in basal ganglia regions with higher frequency binge drinking [10]. Regional heterogeneity in iron concentrations is well described, although the aetiology is not understood [47]. The basal ganglia, including the putamen [48], have some of the highest iron concentrations in the brain and suffer the greatest age-related increases [49]. Interestingly, we found significant alcohol-age interactions with susceptibility, suggesting that alcohol may magnify age effects on brain iron. We are mindful, however, that within UKB, changes with age could represent a cohort effect. In this sample, associations with susceptibility and T2* measures were observed at lower alcohol intakes in females. In haemochromatosis, females are relatively protected against the clinical manifestations of iron overload through menstrual blood loss [50]. The majority of our included sample, however, (70%) was postmenopausal, and menopause status did not alter alcohol–brain iron associations. Sex differences in alcohol metabolism therefore may be responsible. These findings do not support current UK “low risk” drinking guidelines that recommend identical amounts for males and females [51]. We found some support for a causal relationship between alcohol consumption and susceptibility in the putamen and hippocampus, and between AUD and putamen susceptibility in MR analysis.
Although these associations did not survive multiple comparisons correction, they are in the same direction as the highly significant observational associations. Associations between genetically predicted alcohol and susceptibility in other regions were not significant. We suspect this results from our limited power to detect small associations despite the sample size, given that the genetic instruments explain less than 1% of the phenotypic variation in alcohol consumption [29]. Furthermore, weak instrument bias, in the direction of the null, may be contributing [52]. Using UKB for our calculations, about one third of SNPs we used to instrument alcohol consumption had F statistics <10 (S11 Table).
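The weak-instrument check described above can be approximated per variant: for a single SNP, the first-stage F statistic is roughly the squared ratio of the SNP-exposure effect estimate to its standard error, with F < 10 conventionally flagging a weak instrument. The sketch below illustrates this calculation only; the SNP IDs and effect sizes are invented for illustration and are not taken from the paper's S11 Table.

```python
# Illustrative sketch of per-SNP instrument-strength screening for MR.
# For one variant, F is approximately (beta / se)^2, where beta and se
# come from the SNP-exposure GWAS. All values below are hypothetical.

def per_snp_f(beta: float, se: float) -> float:
    """Approximate first-stage F statistic for a single genetic instrument."""
    return (beta / se) ** 2

# Hypothetical SNP-alcohol associations: rsid -> (beta, se)
snps = {
    "rs0000001": (0.020, 0.004),
    "rs0000002": (0.008, 0.003),
}

for rsid, (beta, se) in snps.items():
    f = per_snp_f(beta, se)
    flag = "weak (F < 10)" if f < 10 else "adequate"
    print(f"{rsid}: F = {f:.1f} ({flag})")
```

Under this rule of thumb, instruments with F < 10 contribute weak-instrument bias, which in a one-sample setting tends toward the confounded observational estimate and in two-sample MR tends toward the null.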

Our MR results provide evidence for a causal role of AUD in increasing serum iron and transferrin saturation, a sensitive marker of iron overload [53]. While genetically predicted alcohol use was not significantly associated with ferritin, this mirrors findings in early haemochromatosis, where ferritin levels can be normal and transferrin saturation is the earliest marker of iron overload [54]. The associations we found with liver iron, a reliable marker of systemic iron stores, were consistent with the serum results. In fact, in our study, which we believe is the largest investigation of alcohol and liver iron by an order of magnitude [55], iron levels were the most sensitive liver marker of alcohol-related damage. Alcohol suppresses hepcidin production, the major hormone regulating iron homeostasis [56]. This suppression increases intestinal absorption of dietary iron [57] and limits export of iron from hepatocytes. In our causal mediation analysis (CMA), higher systemic iron levels explained only 32% of alcohol’s effects on brain iron, suggesting other mechanisms are also involved. These could include an increase of blood–brain barrier permeability to iron, in turn mediated by the reduced thiamine that commonly occurs in AUD due to a combination of inadequate dietary intake, reduced absorption, and metabolic changes [58,59]. In cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL) patients, iron leakage has been linked to blood–brain barrier permeability [60]. Other possible mechanisms include dopamine surges following alcohol ingestion or chronic inflammatory processes [61]. The alternative possibility is that individuals with higher brain iron drink more alcohol. One potential mechanism for this is that tyrosine hydroxylase, an enzyme in the dopamine synthesis pathway, is iron dependent [62]. Dopamine has been linked to alcohol cravings in dependence [63]. For this reason, we used MR to support or refute the observational analyses.
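The "32% explained" figure corresponds to the proportion-mediated quantity from a product-of-coefficients mediation decomposition: the indirect effect (exposure → mediator × mediator → outcome) divided by the total effect (indirect plus direct). The sketch below shows only the arithmetic of that decomposition; the coefficients are hypothetical values chosen to reproduce a 32% share, not estimates from the paper.

```python
# Illustrative sketch of the proportion-mediated quantity from a
# causal mediation analysis (CMA). Coefficients are hypothetical,
# chosen so roughly 32% of the total effect runs through the mediator.

def proportion_mediated(a: float, b: float, direct: float) -> float:
    """a: exposure -> mediator effect (e.g. alcohol -> systemic iron);
    b: mediator -> outcome effect (e.g. systemic iron -> brain iron);
    direct: exposure -> outcome effect not via the mediator (c')."""
    indirect = a * b                  # effect transmitted via the mediator
    total = indirect + direct         # total effect of the exposure
    return indirect / total

# Hypothetical standardized effects (not from the paper):
print(proportion_mediated(a=0.40, b=0.40, direct=0.34))  # -> 0.32
```

As the discussion notes, this decomposition yields a valid causal interpretation only if the exposure-mediator, mediator-outcome, and exposure-outcome relationships are each unconfounded.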


Higher putamen and caudate susceptibility interacted with age in predicting executive function and fluid intelligence, but not with simple motor tasks. Most, but not all, previous work has highlighted the importance of the putamen to complex motor tasks [64]. Interestingly, both the trail making tests (TMTs) and the fluid intelligence tasks were performed within a time limit, and perhaps represent a measure of motor response linked to cognition, rather than a simple motor response. TMTs appear to be among the most sensitive to aging effects in the UKB cognitive battery [34]. Frontal dysfunction is well described in chronic heavy alcohol use [65]. Several putamen metrics have been associated with executive function, including blood flow [66], structural atrophy [67], and functional connectivity [68]. Iron accumulation in the putamen has also been described in developmental stuttering [69] and CADASIL [70]. Although most studies of dietary iron and cognition have been in children or anemic individuals, there is some evidence that high dietary iron associates with poorer cognition [71]. While sex differences in cognition have been described [72], it is difficult to disentangle differing iron levels from hormonal factors in the aetiology. How iron deposition could result in cognitive deficits requires further investigation. Iron co-localizes in the brain with tau and beta amyloid [73] and can cause apoptosis and ferroptosis [74]. Higher substantia nigra susceptibility associated with slower reaction speed. The substantia nigra plays a vital role in movement regulation, and iron deposition in the substantia nigra has been linked to Parkinson’s disease [75,76], a disorder with marked impairments in reaction speed [77].

To our knowledge, this is the largest study of moderate alcohol consumption and multiorgan iron accumulation. It is also the first study to use MR to investigate causality of alcohol on serum and brain iron.

We did not observe widespread associations between susceptibility or T2* and other cognitive tests or self-reported motor measures. Brain iron is likely to be an early marker of disease, and participants may have been examined too early in the process to detect clinical manifestations. Additionally, we are not likely to have captured the best phenotypes to assess basal ganglia function in the absence of objective motor measurements such as gait speed or a pegboard test. Self-reported walking speed may poorly approximate actual motor function. The cognitive tests were limited in scope, and concerns have been raised about the reliability of the tests used [34]. Healthy selection biases in UKB are well described, and are likely magnified in the imaging subsample, but would, if anything, bias the study towards null results [78]. Furthermore, associations in UKB seem to track with those observed in representative cohorts [79].

Changes in T2* and χ can reflect changes in iron but also myelin [80,81]. One key difference between T2* and χ is that iron (paramagnetic) and myelin (diamagnetic) have the opposite effect on χ in QSM, but the same effect on T2*. Hence, the positive associations we observed between χ and alcohol could theoretically be driven by increased iron or reduced myelin. If the latter, then alcohol would also be positively associated with T2* (reduced myelin leads to longer T2*). In contrast, we observed negative associations between T2* and alcohol. This supports our interpretation that increased iron is driving our results, given one highly plausible assumption, that alcohol does not increase gray matter myelination [82,83].
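The sign argument above can be laid out as a small consistency table: an increase in iron (paramagnetic) raises χ and shortens T2*, whereas an increase in myelin (diamagnetic) lowers χ but also shortens T2*, so only one tissue change matches any given pair of observed signs. The sketch below is purely illustrative of this reasoning and is not code from the study.

```python
# Illustrative sign logic for QSM (chi) versus T2*, following the paper's
# reasoning. Signs encode the direction of change when each tissue
# component INCREASES: iron is paramagnetic (chi up, T2* down); myelin is
# diamagnetic (chi down) but, like iron, shortens T2*.
effects = {
    # tissue: (d_chi, d_T2*) for an increase in that component
    "iron":   (+1, -1),
    "myelin": (-1, -1),
}

def consistent_with(observed_chi: int, observed_t2star: int) -> list:
    """Return tissue changes consistent with an observed sign pattern."""
    matches = []
    for tissue, (d_chi, d_t2) in effects.items():
        for sign in (+1, -1):  # component increase (+1) or decrease (-1)
            if (sign * d_chi, sign * d_t2) == (observed_chi, observed_t2star):
                direction = "increase" if sign > 0 else "decrease"
                matches.append(f"{direction} in {tissue}")
    return matches

# Observed in drinkers: higher chi (+1) together with lower T2* (-1)
print(consistent_with(+1, -1))  # -> ['increase in iron']
```

Reduced myelin would instead produce higher χ with *longer* T2* (the (+1, +1) pattern), which is not what was observed; hence the sign pattern isolates iron, under the stated assumption that alcohol does not increase gray matter myelination.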

Partial volume effects could confound associations between hippocampal susceptibility and alcohol. For example, hippocampal atrophy, previously observed in drinkers [1], could be conflated with changes in χ. However, this would tend to reduce estimated χ. Alcohol was self-reported, but this is the only feasible method to ascertain intake at scale. Serum markers of iron homeostasis were not directly measured in UKB. Although analyses were controlled for the strongest SNPs associated with serum iron, these are likely to explain a low proportion of the variance. MR techniques rely on a number of assumptions that we have tried to test where possible, but residual uncertainty inevitably remains. Estimates were calculated in European individuals, but it is unclear how they generalize to other populations. MR estimates the effect of lifelong exposure, which does not necessarily translate into potential effects resulting from an intervention in adult life. Liver T2* has been useful in some studies to monitor iron overload, but further validation of this biomarker as a diagnostic marker of iron overload is needed [84]. Genetic variants explain a low proportion of the variance in alcohol traits. Therefore, our analysis within the imaging sample, despite its large size, has limited power to detect small effects. The power for the larger serum iron measures was greater. For this reason, although nonlinear relationships between alcohol and health outcomes are of interest, we limited MR analyses to linear models. Mediation analysis is not experimental in design, and relies on intervention–outcome, intervention–mediator, and mediator–outcome effects being unconfounded to permit valid causal inferences. Alcohol exposure prior to study baseline (left truncation) may bias observational estimates [85]. In this study, liver and brain iron were measured at the same time, meaning reverse causation is possible.
However, it is difficult to conceive of a plausible mechanism by which brain iron levels could substantially affect systemic iron.

Never drinkers appeared to have the lowest levels of brain iron. This is in keeping with our earlier work indicating there may be no safe level of alcohol consumption for brain health [20]. Moderate drinking is highly prevalent, so if elevated brain iron is confirmed as a mechanism by which alcohol leads to cognitive decline, there are opportunities for intervention on a population scale. Iron chelation therapy is already being investigated for Alzheimer’s and Parkinson’s diseases [17,18,86]. Furthermore, if reduced thiamine is mediating brain iron accumulation, then interventions to improve nutrition and thiamine supplementation could be extended beyond harmful and dependent drinkers, as is currently recommended [87], to moderate drinkers.


