Why couldn’t Ilia Malinin handle the pressure? Sports psychologists offer their thoughts
The concept of pressure or high expectations isn’t unique to Malinin, though Olympic athletes vary in their mechanisms for dealing with it. “All of this pressure, all of the media, and just being the Olympic gold hopeful was a lot,” he said immediately after the result. “It was too much to handle.” In an interview Tuesday on “TODAY,” in his most extensive comments since his free skate, Malinin admitted he was not mentally prepared for the Olympic spotlight.
“Honestly, it’s not a pleasant feeling. The most honest way to say it is it’s just a lot on you, just so many eyes, so much attention,” Malinin said of the expectations he felt in Milan. “It really can get to you if you’re not ready to fully embrace it, so I think that might be one of the mistakes I made going into that free skate was I was not ready to handle that to a full extent.” Malinin, who has otherwise dominated international competition, has been honest and vulnerable about his mental struggles at the Olympics. But the issues he experienced are not necessarily novel.
“Pressure starts with changes: changes in thinking, attitude and perception,” said Robert Andrews, a mental training consultant and therapist. He previously worked with seven-time gold medalist Simone Biles, who famously had her own mental struggles during the Tokyo Games.
“[Malinin] said he was struggling with the negative thoughts, and that’s going to change internal pressure,” Andrews said. “And when you change internal pressure, the body reacts to that in usually not so good ways.”
Andrews was not working with Biles when she had the “twisties,” a mental block while performing midair feats that Biles said was the result of the emotional toll of competing in the Olympics. But he said there is a through line between Biles and Malinin both struggling on the Olympic stage.
“These meltdowns, or whatever you want to call it, they’re always related to stress,” Andrews said.

Michael Gervais, a sports psychologist who has worked with athletes across four Olympics, said Malinin may have been imagining the potential fallout of a poor performance when he took the ice for his free skate.

“Our brains are designed for survival,” said Gervais, who has also worked in the NFL, most recently with the Super Bowl-champion Seattle Seahawks. “We have a bias for survival, and what that means is our brain is highly equipped, scanning the world for all the dangers,” he said. “So, what he was doing in that moment, his brain was doing what most brains would do, which is scan the world and find all the threats. And there are a lot of threats, not physical, but there are a lot of threats in world championships.”

Other Olympic athletes have their own mechanisms for dealing with that kind of pressure.
Dutch speedskater Jutta Leerdam, for example, told NBC News she can’t focus on outside opinions or pressure. Leerdam, who is also known as internet personality Jake Paul’s fiancée, said she has tried “for years” to reprogram her brain to prevent herself from getting distracted by outside noise.

Rohan Nadkarni, Milan Cortina 2026
After decades of flourishing in science fiction, AI is having its moment. It’s quickly maturing into a mix of our deepest hopes and wildest fears with some truly head-scratching surprises. Both OpenAI and Anthropic are growing at incredible speeds and are set to usher in a new wave of trillion-dollar companies. Whether or not AI will disrupt industries is no longer speculation: It has, it is, and it will.
As a practicing physician, I have to stay current and adapt. Some changes can be slow, and others happen overnight. When the pandemic hit, I immediately became a non-consenting telehealth psychiatrist. More insidiously, my smartphone has become essential for my practice. From two-factor authentication in prescribing controlled medications to HIPAA-compliant document scanning to communicating with staff and patients on the go, the tech in my pocket has simultaneously made me more productive and more distracted. A 2018 study in adolescents showed a correlation between high digital media use and subsequent symptoms of ADHD. That was eight years ago! The technology is even more ingrained in our day-to-day lives now. And then there’s the potential impact on one of a physician and therapist’s greatest assets: critical thinking. I already know this from years of GPS making me geographically incompetent, but the studies showing our cognitive offloading to technology can impair critical thinking are sobering.
Like it or not, though, the technology is here. Even if I’m resistant to, or wary of, AI (which I am), ignoring it means I will be left behind. And my patients will suffer for it. Anything that becomes ubiquitous so rapidly brings with it unforeseen problems. For AI and large language models, problems like hallucinations are well-known. I asked AI not too long ago about myself. The result: “Justin C. Key is a practicing psychiatrist and author (so far, so good) who wrote the movie Get Out” (I didn’t write Get Out). For low-risk asks, like “how should I populate my raised garden,” a little fantasy will not do much harm. But for complicated medical issues? You can see where I’m going. Large language models are also sycophants. Like a good social media algorithm, they’re designed to keep you engaged, and that means appealing to your ego. We’re seeing the implications of this in real time. Some results are funny. Others are tragic.
An emergent consequence of AI that this psychiatrist and sci-fi author did not see coming is induced psychosis. Dr. Joseph Pierre, whom I trained under at UCLA, has been researching and writing extensively about this. I would expect prolonged use of LLMs to reinforce concerning thoughts in those with previously diagnosed psychotic disorders; however, we’re seeing patients with no previous psych history needing multiple hospital stays to come out of their AI-induced delusions. If this can happen, what other downstream effects might emerge? What will we know relatively quickly, and what might it take generations to realize? Much like Covid, another global change that hit hard and fast, I suspect AI’s long-term effects on our society won’t be fully understood until it’s time to write our chapter in the history books.
But we can all hope, right? Mine is that society continues with the model of medicine that saw us through from bloodletting patients to transplanting organs: human-led, with technology as an ever-evolving tool. As a psychiatrist and therapist, bridging a patient’s past to their present is key to extracting important insights from the hard-earned therapeutic relationship and to making sound medical decisions. The idea of an LLM counseling my patient through their suicidal ideation is scary. But the thought of one feeding me, in real time, patterns from a patient’s history, interventions that worked versus those that made things worse, and insights into what the thought of dying has meant to that person over the years, all to inform one human trying to save another, is intriguing. Physician burnout is real, and a 2018 review identified charting and ‘treating the data and not the patient’ as major contributors. I don’t want AI driving the car, but whether I like it or not, it’s going to be in the passenger seat, yapping away. It’s my job to learn how to listen.

Justin C. Key, M.D., February 23, 2026
People who interact with chatbots for emotional support or other personal reasons are likelier to report symptoms of depression or anxiety, a new study finds. Researchers from Mass General Brigham surveyed 20,847 mostly white men and women in the United States about their AI usage and mental health symptoms. In the survey, published Wednesday in JAMA Network Open, 10.3% of participants reported using artificial intelligence “at least daily” and 5% reported using it “multiple times per day.” Of those using an AI program at least daily, nearly half were using it for work and about 11% used it for school. Among daily users, 87.1% reported using it for personal reasons, which could include recommendations, advice or emotional support.
Dr. Roy Perlis, a lead author of the study, said that most people’s exposure to artificial intelligence is through chatbots. The mean age of the participants in the study was 47. Those who used chatbots daily for personal reasons were likelier to experience at least moderate depression, or feelings of anxiety and irritability, compared with people who didn’t use AI.
Participants were asked whether or how often in the past two weeks they had trouble concentrating, sleeping, eating or thought about hurting themselves. Common symptoms of depression include feelings of sadness, low self-esteem, lack of energy and lacking motivation.
Users ages 45 to 64 were likelier to report depressive symptoms with AI use. Previous research has shown that some people turn to AI for emotional support and even romantic relationships. Early studies have shown that chatbots specifically designed for mental health treatment may be useful as an adjunct to therapy. Other studies analyzing general chatbots, such as OpenAI’s ChatGPT, have suggested they may be problematic for people with mental health conditions.
However, the American Psychological Association advises against using AI as a replacement for therapy and psychological treatment. Perlis said the average difference in depression severity between chatbot users and nonusers was small, but warned that some people may struggle more severely than others.

“There’s probably a subset of people where AI use is associated with no change in their mood, or even benefit in their mood,” said Perlis, who serves as vice chair for research in the department of psychiatry at Mass General Brigham. “But that also means there are a subset where AI use is probably associated with worsening of their mood, and for some people, that can be substantially greater levels of depression.”

The researchers observed what’s called a “dose response,” meaning the more frequently someone used AI, the stronger their symptoms were. Using AI for work or school wasn’t associated with symptoms of depression. For people who use AI for personal reasons, Perlis said the nature of their interactions can “run the gamut,” and AI chatbots are a way of having a “social interaction that otherwise would be difficult for them.”

Kaan Ozcan, Jan. 21, 2026, 7:43 PM EST
WASHINGTON — The Department of Health and Human Services is reinstating $2 billion in funds to address substance abuse and mental health, a day after the department said it would cancel them, an administration official confirmed to NBC News. The reinstatement came Wednesday after groups were informed Tuesday of the funding cuts, which were associated with the Substance Abuse and Mental Health Services Administration.
Rep. Rosa DeLauro, the top Democrat on the House Appropriations Committee, attributed the reversal to Health Secretary Robert F. Kennedy Jr. having “bowed to public pressure.”
“These are cuts he should not have issued in the first place,” she said in a statement. “He must be cautious when making decisions that will impact Americans’ health. Our policy must be thoughtful — not haphazard and chaotic. This episode has only created uncertainty and confusion for families and healthcare providers.” NBC News has reached out to the Department of Health and Human Services for further information.
The Substance Abuse and Mental Health Services Administration, or SAMHSA, focuses on mental health, behavioral health and substance use conditions. The administration is part of the Department of Health and Human Services and supports resources such as suicide and crisis phone lines, opioid treatment and the behavioral health response to disasters, among other programs.
The initial cancellation prompted pushback from physicians and behavioral health advocates. The American College of Emergency Physicians said in a news release that it was “deeply concerned” with the initial cuts. “These abrupt cuts threaten to dismantle the fragile continuum of care that helps people access treatment early and stay connected to services,” a statement from Dr. L. Anthony Cirillo, the group’s president, said.
Daniel H. Gillison Jr., the CEO of the support and advocacy group the National Alliance on Mental Illness, said before the funds were reinstated that the planned cuts were “disheartening and cruel, and they threaten the life-saving work of hundreds of organizations that provide critical mental health support across the United States.”

“These abrupt and unjustified cuts will immediately disrupt suicide prevention efforts, family and peer recovery support, overdose prevention and treatment, and mental health awareness and education programming, along with so many more essential services, putting an unknown number of lives at stake,” his statement said.

Berkeley Lovelace Jr. and Megan Lebowitz, Jan. 15, 2026, 9:41 AM EST
