US 'rock star' Paralympic skier wins silver for his late twin brother

Sport

2026-03-10 05:43 Last Updated At:06:11

CORTINA D'AMPEZZO, Italy (AP) — Patrick Halgren, the self-proclaimed "rock star" of the Milan Cortina Paralympics, said he could feel the presence of his late twin brother in his silver medal-winning Para alpine ski run on Monday.

"He made this happen for real. He is the ski god and he has blessed me with speed today,” Halgren said of his brother Lucas Sven Halgren.

Lucas Sven died in a motorcycle accident in New Zealand in 2016, three years after Patrick nearly died and lost most of his left leg in another motorcycle accident.

Patrick has been plastering blue-and-yellow stickers that read “SvendIt” around Cortina d'Ampezzo, a play on “send it,” which is Patrick’s mantra on life and a reference to his brother, who went by Sven.

“He’s the reason I’m here. I’m just a vessel to cram love and combat hate down your throats. He inspired me to live life, but life is fragile. You can die,” Patrick said. "It’s all for him. It’s for my family. It’s for the people that have struggled. He’s dead. I’m here living, talking to you guys. I’m going to have who knows how many women and champagne after this. He liked that, too, but he doesn’t get that, and I do. I recognize that.”

Halgren celebrated in front of his parents, Peter and Kathy, which he said was “pretty cool.”

"But also, it sucks not having Sven here, so ups and downs,” the 33-year-old Halgren said after winning his first medal in his second Paralympics appearance.

“They went to Tijuana, Mexico, for their honeymoon 50 years ago. They picked up their dead kid in New Zealand, and they’ve watched me win the Paralympics at the most beautiful ski valley in the world,” Halgren told the Olympics website. “This is a surreal moment for them. This is an experience that will create a memory lasting a lifetime and such a good moment.”

Halgren said it was Sven who steered him to Para alpine skiing after the 2013 crash that nearly killed him and resulted in the above-the-knee amputation of his left leg.

“I died myself. I was in a coma for a month. I died four times," he said. "They used a defibrillator to start my heart. Blood transfusion. I get it, and I’m lucky that I have that because I know what it’s like. Not many people do.”

Now it's hard to miss Halgren, and not only because of his long braids dyed red, white and blue. Always entertaining and joking with those around him, the outspoken American has taken on a showman's persona at the Games.

At the podium ceremony, he performed an air guitar solo using his crutch. He said it was just “another Monday” for him.

“I am a rock star,” he added. "I always wanted to be this guy, Jim Brown, he was my idol. He was a professional football player, played lacrosse at Syracuse. I did both those sports, and he retired at the peak of his career and became basically the first Black action movie star. I always wanted to be him, and now I am him.”

Halgren said he “learned to be un-irritable, un-embarrassable.”

“It’s about being vulnerable in this life. It’s about trying things and failing. It’s OK to be embarrassed. It’s OK to look weird.”

As he talked to the media, Halgren was congratulated by nearly every rival that passed by. He was second to Switzerland’s Robin Cuche in the men's super-G standing.

“Medals don’t mean anything to me. The love from all the people supporting me is what means anything to me,” he said. “I can feel, I can literally feel all the people who have ever given me well wishes and ‘Thanks’ and ‘Good lucks.’ I can feel them loving me and they’re the reason I won.

“You celebrate the victories the same as the defeats. I’ve been blessed to have to develop my character over the last 11 years, losing my leg and could either roll over and die, or I could become the greatest Patrick Halgren on Earth, and that’s what you’re seeing.”

His future plans?

“I would like to dominate the Earth in every category with one leg.”

Anything else? “My horse is thirsty, I'm out.”

AP Winter Paralympics: https://apnews.com/hub/paralympic-games

Patrick Halgren, of the United States, poses on the podium after winning the silver medal in the alpine skiing men's super-G standing at the 2026 Winter Paralympics, in Cortina d'Ampezzo, Italy, Monday, March 9, 2026. (AP Photo/Emilio Morenatti)

Patrick Halgren, of the United States, celebrates on the podium after winning the silver medal in the alpine skiing men's super-G standing at the 2026 Winter Paralympics, in Cortina d'Ampezzo, Italy, Monday, March 9, 2026. (AP Photo/Emilio Morenatti)

Artificial intelligence company Anthropic is suing to stop the Trump administration from enforcing what it calls an “unlawful campaign of retaliation” over its refusal to allow unrestricted military use of its technology.

Anthropic asked federal courts on Monday to reverse the Pentagon’s decision last week to designate the artificial intelligence company a “supply chain risk.” The company also seeks to undo President Donald Trump's order directing federal employees to stop using its AI chatbot Claude.

The legal challenge intensifies an unusually public dispute over how AI can be used in warfare and mass surveillance — one that has also dragged in Anthropic's tech industry rivals, particularly ChatGPT maker OpenAI, which made its own deal to work with the Pentagon just hours after the government punished Anthropic for its stance.

Anthropic filed two separate lawsuits Monday, one in California federal court and another in the federal appeals court in Washington, D.C., each challenging different aspects of the government's actions against the San Francisco-based company.

“These actions are unprecedented and unlawful," Anthropic's lawsuit says. "The Constitution does not allow the government to wield its enormous power to punish a company for its protected speech. No federal statute authorizes the actions taken here. Anthropic turns to the judiciary as a last resort to vindicate its rights and halt the Executive’s unlawful campaign of retaliation.”

The Defense Department declined to comment Monday, citing a policy of not commenting on matters in litigation.

Anthropic said it sought to restrict its technology from being used for mass surveillance of Americans and fully autonomous weapons. Defense Secretary Pete Hegseth and other high-ranking officials publicly insisted the company must accept “all lawful" uses of Claude, threatened punishment if Anthropic did not comply and condemned the firm and its CEO Dario Amodei on social media.

Designating the company a supply chain risk cuts off Anthropic's defense work using an authority that was designed to prevent foreign adversaries from harming national security systems. It was the first time the federal government is known to have used the designation against a U.S. company. Hegseth said in a March 4 letter to Anthropic that it was “necessary to protect national security,” according to Anthropic's lawsuit.

Trump also said he would order federal agencies to stop using Claude, though he gave the Pentagon six months to phase out a product that’s deeply embedded in classified military systems, including those used in the Iran war.

Anthropic's lawsuit also names other federal agencies, including the departments of Treasury and State, after agency officials ordered employees to stop using Claude.

Anthropic makes several strong First Amendment and due process arguments in a case that has “escalated beyond comprehension,” said Michael Pastor, a professor at New York Law School who previously worked as a New York City general counsel helping to craft its technology contracts.

“I’ve never seen a case like this,” Pastor said. “It would never have struck our minds that, when we were having difficulty in a negotiation, we would threaten the company essentially with destruction.”

Even as it fights the Pentagon’s actions, Anthropic has sought to convince businesses and other government agencies that the Trump administration’s supply chain risk designation is a narrow one that only affects military contractors when they are using Claude in work for the Department of Defense.

Making that distinction clear is crucial for the privately held Anthropic because most of its projected $14 billion in revenue this year comes from businesses and government agencies that are using Claude for computer coding and other tasks. More than 500 customers are paying Anthropic at least $1 million annually for Claude, according to a recent investment announcement that valued the company at $380 billion.

Anthropic said in a statement Monday that “seeking judicial review does not change our longstanding commitment to harnessing AI to protect our national security, but this is a necessary step to protect our business, our customers, and our partners."

The lawsuit positions AI safety and "positive outcomes for humanity” as critical to Anthropic's mission since its founding in 2021 by Amodei and six other former OpenAI employees.

Its usage policy always prohibited "lethal autonomous warfare without human oversight and surveillance of Americans en masse,” the company said in its lawsuit. Anthropic said it has never tested Claude on those applications and doesn't have the confidence its products could “function reliably or safely if used to support lethal autonomous warfare.”

At the same time, it has enabled the military to use Claude in ways that civilians could not, including military operations and in analyzing “lawfully collected foreign intelligence information.”

Until recently, Anthropic was the only one of its tech industry peers approved to supply its AI model to classified military systems. The dispute has led the Pentagon to look at shifting Claude's work to Google's Gemini, OpenAI's ChatGPT and Elon Musk's Grok.

Anthropic's lawsuit alleges the Trump administration's actions are impugning its reputation, “jeopardizing hundreds of millions of dollars” in contracts with other businesses and attempting to “destroy the economic value created by one of the world’s fastest-growing private companies.”

Conversely, the fight has boosted Anthropic's reputation among some customers and tech workers who sided with the company's refusal to bow to pressure from the Trump administration. Amodei's moral stance stood out all the more when his bitter rival, OpenAI CEO Sam Altman, sought to replace the Pentagon's Claude with ChatGPT in a move Altman later admitted was rushed and seemed opportunistic.

Consumer downloads of Claude surged, lifting its popularity for the first time over better-known ChatGPT and Gemini.

How companies set guardrails also continues to have repercussions in the competition to retain AI industry talent. OpenAI's head of robotics, Caitlin Kalinowski, resigned over OpenAI's Pentagon deal.

“This wasn't an easy call,” Kalinowski wrote on social media over the weekend. “AI has an important role in national security. But surveillance of Americans without judicial oversight and lethal autonomy without human authorization are lines that deserved more deliberation than they got.”

Another group of more than 30 leading AI developers at OpenAI and Google, including Google's chief scientist and AI research division head Jeff Dean, filed a legal brief Monday supporting Anthropic.

“National security is not served by reckless designations of the military’s American technology partners as a ‘supply chain risk’ or the suppression of public discourse on AI safety,” said the filing from the workers who said they were acting in their personal capacities.

FILE- Secretary of Defense Pete Hegseth, left, and Under Secretary of Defense for Research and Engineering Emil Michael, right, arrive to look at a display of multi-domain autonomous systems in the Pentagon courtyard, Wednesday, July 16, 2025, in Washington. (AP Photo/Julia Demaree Nikhinson, File)

FILE - Defense Secretary Pete Hegseth stands outside the Pentagon during a welcome ceremony for the Japanese defense minister at the Pentagon in Washington, Jan. 15, 2026. (AP Photo/Kevin Wolf, File)

Pages from the Anthropic website and the company's logos are displayed on a computer screen in New York on Thursday, Feb. 26, 2026. (AP Photo/Patrick Sison)