The Perfect Storm for ADHD Overdiagnosis
Adam Omary and Jeffrey A. Singer
The sharp increase in attention-deficit/hyperactivity disorder (ADHD) diagnoses—nearly doubling among American children between 1997 and 2022, and more than tripling among adults from 2012 to 2023—has been chalked up to better screening, increased awareness, and the corrosive effects of smartphones and social media on developing brains. None of these factors holds up well under scrutiny.
The diagnostic category itself has been steadily widened by the institutions that define it and by a financial structure that rewards every participant for applying the ADHD label. As we have argued in our Cato Institute analysis of how the American healthcare system rewards psychiatric overdiagnosis, subjective diagnostic criteria interact with a payment system that compensates per diagnosis to produce predictable inflation across psychiatric categories. The result is the labeling of ordinary behavior as pathological. ADHD is among the cleanest case studies of that pattern.
Foraging minds in an industrial classroom.
Human cognition was shaped over hundreds of thousands of years in small foraging bands, where attentional flexibility was an asset rather than a liability. A child who scanned the horizon, registered novel stimuli, and shifted focus rapidly between threats and opportunities was a child more likely to survive. Sustained, narrowly channeled attention to a single abstract task for hours at a time was simply not part of the ancestral environment, and the cognitive machinery to produce such concentration on demand was never uniformly selected for. What we now call distractibility is, in another light, vigilance. Most animals, including ancestral humans, evolved to be constantly on the lookout for novelty and threat.
Mass schooling, which emerged in the 19th century in part to prepare children for industrial labor, asks something quite different. It asks 6- and 7‑year-olds to sit still in rows, suppress physical movement, attend to a single voice for extended stretches, and produce written output on a fixed schedule. Most children adapt. The variance in how easily they do so is enormous, and the children at the lower tail of conformity to that demand have come to define the diagnostic category.
Boys end up there more often than girls, for reasons that are not mysterious. Boys, on average, are more physically active, take longer to develop self-control, and are more drawn to rough play. The same pattern shows up in other mammals and tracks the effects of testosterone on brain development. Put boys in a room and tell them to sit still for six hours, and a predictable share of them will fail, not because they are mentally ill but because they are boys. The youngest children in any classroom are also more likely to be diagnosed with ADHD than their older peers, a finding so robust across studies and countries that it points to ordinary developmental variation rather than disease.
The evolutionary frame also provides a more nuanced understanding of the fear that screen time in childhood harms brain development and attention span. The brain is plastic, especially in childhood, and it adapts to the environment it is given. That plasticity is precisely what allowed generations raised under modern industrialized education systems to develop the sustained attention style that schools reward, despite it being so far from our environment of evolutionary adaptedness. But a generation that grows up navigating fast-moving feeds, switching between applications, and processing rapid streams of visual information will predictably develop a different attentional profile than one raised on books and chalkboards.
That does not mean the learning or attention span of youth raised on digital technology is impaired. Heavy media multitaskers and habitual users of touchscreen devices do tend to perform worse on tasks that demand sustained, narrowly focused attention and inhibitory control. But they also tend to perform better on tasks that demand rapid visual search, parallel processing of multiple objects, and flexible reallocation of attention. Action video game play, in particular, has been shown to enhance visual selective attention, processing speed, and the spatial resolution of vision. These effects transfer beyond the trained task, improving general abilities to track several moving things at once, spot relevant objects in a crowded scene, and pick out a target faster when surrounded by distractions.
These cognitive trade-offs elucidate how neural plasticity gives rise to different forms of intelligence. The brain has a finite budget of computational and metabolic resources, and the cortex reallocates them in response to the demands placed on it. The clearest demonstrations come from sensory deprivation: In people who lose their sight, the visual cortex does not simply lie fallow but is recruited for auditory and tactile processing, including Braille reading, with measurable gains in those domains. Congenitally deaf individuals show analogous repurposing of the auditory cortex for vision and smell. Every brain is continuously specializing toward whatever it does most, and different cognitive skillsets have different trade-offs.
A brain trained on rapid feeds and parallel streams gets better at rapid visual search, switching, and parallel processing while getting worse at slow, serial, endogenous focus. A brain trained on long books and chalkboards makes the opposite trade. The picture is not simply that screens damage children’s brains or lower their intelligence. Claims of generalized cognitive harm typically rest on measures of a single attentional style, the one schools happen to demand, and ignore the capacities that grow on the other side of the ledger. Calling the resulting attentional profile ADHD, treating it as a chronic illness, and medicating it accordingly is a category error. The trade-offs are real, but the diagnostic system measures only the deficits because only the deficits are reimbursable.
From hyperkinetic boys to inattentive adults.
The diagnostic category we now call ADHD has been progressively widened almost from the moment it entered the Diagnostic and Statistical Manual of Mental Disorders, the reference text published by the American Psychiatric Association that defines the criteria for every recognized psychiatric condition in the United States. The DSM-II, published in 1968, listed the condition as “hyperkinetic reaction of childhood” and described it in a single sentence, focused on the restless, disruptive child, almost always identified as a boy, who would supposedly grow out of the condition by adolescence. The DSM-III, in 1980, renamed it attention deficit disorder, with or without hyperactivity, and for the first time treated inattention as a stand-alone presentation rather than a symptom of restlessness. That single revision opened the category to a far larger population of children, especially girls, whose attentional patterns had previously been invisible to the diagnostic system.
The DSM-III‑R, in 1987, folded the subtypes back together and introduced the current acronym, ADHD. The DSM-IV, in 1994, separated the disorder again into three presentations—predominantly inattentive, predominantly hyperactive-impulsive, and combined—and explicitly extended the diagnosis into social, academic, and vocational contexts beyond childhood. Studies comparing the DSM-III‑R and DSM-IV criteria directly found that prevalence rose from 9.6 to 17.8 percent under one set of comparisons and from 7.3 to 11.4 percent under another, almost entirely on the strength of newly identified inattentive cases. The DSM‑5, in 2013, raised the age-of-onset requirement from 7 to 12 and lowered the symptom threshold for adults.
Each revision expanded the population eligible for diagnosis, and, with it, the population eligible for stimulant prescriptions, academic accommodations, and disability protections. The trajectory runs in one direction. There is no edition of the DSM in which the criteria for ADHD became more restrictive.
The incentive problem.
Layered atop the definitional and developmental story is a set of economic incentives that quietly lower the threshold for diagnosis. A growing share of these diagnoses now comes from primary care clinicians rather than specialists, reflecting how rapidly ADHD treatment has migrated into routine primary care, and how the expansion of telehealth lowered the friction of obtaining a prescription.
One of the clearest examples of incentives for overdiagnosis comes from how we finance education. When special-education funding is tied to specific diagnoses, schools have a built-in reason to identify more students with ADHD, because the label unlocks additional resources. Researchers have documented systematic differences in diagnosis and treatment that align with funding formulas rather than with underlying disease rates, a pattern consistent with third-party financial incentives shaping who gets labeled. Clinicians do not work in isolation; they respond to expectations from schools, families, and the broader system. Once stakeholders recognize that a diagnosis unlocks services, pressure to apply the label tends to grow.
Primary care clinicians typically practice in fee-for-service systems, where assigning a diagnosis makes the encounter billable and enables reimbursement for follow-up visits and medication management. Patients have their own incentives to seek the diagnosis, including academic accommodations, workplace protections, and access to performance-enhancing stimulants such as Adderall. In an environment where the condition is defined by subjective criteria rather than objective tests, it is unsurprising that some individuals exaggerate or feign symptoms to obtain those benefits.
The pattern is by now familiar. As we documented in our analysis of Medicaid-funded autism therapy, the broadening of autism criteria, combined with open-ended reimbursement, produced an explosion in spending on applied behavior analysis that far outpaced any plausible change in the prevalence of disabling autism. The broadening of ADHD criteria has produced a parallel surge in stimulant prescriptions, and our recent piece against the campaign to formalize “social media addiction” anticipates the same trajectory if that diagnosis is formalized. In each case, subjective diagnosis and financial incentives that reward diagnosis push the boundaries of illness outward.
What this should teach us.
The growth in diagnoses is best understood as the cumulative output of several systems, each behaving in a way its incentives reward. Definitions expand because there is little institutional pressure to keep them tight. Clinicians diagnose because diagnosis is what the system pays for. Schools refer because referrals bring resources. Patients seek labels because labels bring access to special accommodations. The aggregate effect is a steady erosion of the line between ordinary human variation and clinical disease.
That erosion has costs. Children whose ordinary inattentiveness is medicated as a chronic condition, adults who organize their identities around a label, and patients with severely impairing ADHD whose treatment resources are diluted across an ever-larger pool all bear those costs. The path to more reliable diagnoses runs through more reliable incentives: tighter criteria, independent assessments, and payment structures that do not reward expanding the definition of illness. Policymakers should stop structuring schools, insurers, and healthcare systems so that people must acquire a medical diagnosis to receive help, accommodations, or reimbursement.
Children today have greater safety, resource availability, and tools for education than any cohort in human history. The schoolroom that asks kids to sit still for hours, and the diagnostic system that pathologizes the ones who cannot, are the more unusual features of modernity. A more honest accounting would distinguish severely impairing attentional disorders from the wider band of ordinary human variation. It would recognize that the temperaments now most likely to be medicalized are, in a different setting, the temperaments that aided survival and human progress.
Source: https://www.cato.org/commentary/perfect-storm-adhd-overdiagnosis