"Mom, Dad, checking in," one person who appears to be a U.S. service member tells the camera. "I’m good, okay? No need to worry about me."
Surrounded by snow and ice, she continues: "It’s freezing out here and I’m soaked. But for the safety of the American people, and to help make America great again, I’m standing my ground. Do I earn your follow and your thumbs up yet? Stay safe."
That isn’t a U.S. service member or even a real person. The video was generated with artificial intelligence.
It’s one of many fake videos showing similar scenarios — U.S. service members crying or in dire conditions. Some show female soldiers in barren locations, in uniform, sniffling or crying. They address their parents directly. Others show male soldiers tearfully addressing their partners.
Fake videos of service members weeping and seeking empathy from viewers have surfaced during other conflicts, including the Russia-Ukraine war, and have continued during the Iran war, in which 13 U.S. service members had been killed as of April 15.
Creators have a financial incentive to produce emotional content. Viral videos can earn money through social media platforms’ monetization programs, or can direct users to websites that prompt them to make purchases or steal their personal information.
PolitiFact found at least 11 TikTok, Facebook and YouTube accounts that primarily post AI-generated videos of service members, with more than 174,000 followers combined. The fake military videos collectively gained 29.6 million views, with average views per account ranging from 628 to 466,192. Some videos carry labels disclosing that they are AI-generated, but even in those cases, people commenting on the videos don’t seem to realize they’re fake.
PolitiFact contacted Meta, YouTube and TikTok about the accounts. All of the accounts we inquired about were unavailable as of April 15.
A TikTok spokesperson said the platform’s Community Guidelines prohibit AI-generated content that presents misleading information on matters of public importance, such as an active conflict, and that they have removed the accounts we shared.
Facebook removed the accounts we flagged for violating its policies, a spokesperson said, adding that the pages were not monetized.
YouTube also removed a channel we inquired about for violating its spam policies, a spokesperson said.
Shannon Razsadin, chief executive officer of the nonprofit organization Military Family Advisory Network, said military families are encountering such videos and questioning what is real.
"These videos heighten anxiety by presenting scenarios that may not reflect reality, which can compound fear for families already navigating a lot of unknowns," she said.
Mary Bennett Doty, associate director of programs at We the Veterans & Military Families, said such content adds to inflammatory rhetoric and could deepen division.
Videos show emotional service members talking about their families, fallen soldiers
The accounts often use one type of background and script for their videos, typically sticking to videos of only men or only women, or pivoting from one to the other.
(Screenshots from TikTok and YouTube)
One page named "US Soldier Legacy," for example, contained videos of women crying and talking over the sound of jets, with smoke in the background.
In one video posted by a TikTok account named "Usa Soldier Life," with more than 764,000 views on TikTok, a man stands in the foreground with a flag-draped coffin in the distance, saying through tears, "I’m gonna miss you brother, I hope, I hope you know how much we love you. I love you, man. Rest easy."
Other pages primarily show male service members, often holding photos presumably of their loved ones and addressing their partners. One page’s captions say the videos are their last messages to their families.
Accounts often seek to monetize content
Many accounts don’t appear to seek money, but some provide a way for viewers to potentially contact the account holders, such as through a telephone number or a website, for reasons that could include selling items or leading people to a phishing scam.
These accounts follow a trend that uses AI to create synthetic "influencers" and other deepfake content related to politics.
For example, one profile of a female service member named "Jessica Foster" that gained 1 million followers on Instagram while posting images of her with President Donald Trump and other political figures was AI-generated. The account linked to a separate page where the profile sold exclusive fetish content.
Accounts like these can make money through viewers’ engagement with the content, or can direct users to other websites that sell products. Daniel Schiff, a Purdue University assistant professor of technology policy, said people risk exposure to cyberattacks and information theft.
"Accounts may post sympathetic or incendiary information to leverage people's emotions or draw their attention," he said. "Once that account has enough followers, they may post links to external content, which could range from selling clothing to selling intimate content." Schiff said many of these accounts are driven by economic motives.
In one video, the AI-generated character cries and says he’s thinking about home, then, still crying, promotes a "shop link" in the account’s bio description. The bio contained no link. One Facebook page with 31,000 followers called "Brave Marine," featuring similar videos of male soldiers, linked to a website with job listings for a maritime company. A telephone number listed in the website’s registration has previously been connected to fraud campaigns.
The content undermines trust in information sources that military families rely on, Razsadin said.
"Many official entities like military branches, helping agencies or military service organizations like ourselves also use social media to communicate verified content to military families," she said.
How to identify fake videos of service members
If you have doubts about a video’s authenticity, check the account that posted it. If it consistently posts videos with different people saying the same things, it’s one indicator the videos could be AI-generated.
The profile’s creation date and posting volume also can be a signal. Some accounts we saw were created around the time the Iran war began, and have been posting consistently since.
"Many of these accounts are relatively new and engage in fairly uniform patterns of influence-style posting," Schiff said.
Some dubious accounts primarily post attractive young women in uniform. Gregory Daddis, a Texas A&M University history professor who served in the U.S. Army for 26 years, said that even when the women in the videos have muddy or scratched faces, they are still portrayed as attractive.
"Nearly perfectly waxed eyebrows across the board seems telling to me," he said.
The uniforms also can be a giveaway. In one April 12 video, a female service member said, "Dad, it’s almost Christmas. I miss you so much, but for the safety of the American people, I have to hold the line out here. Could you tap the little red plus on my profile to support me? Um, I love you both. Stay safe."
Looking closer at her uniform shows her name is gibberish, and the "U.S." has three periods.
Illegible text and spelling or grammar mistakes are common in these videos. Daddis said the rank insignia in several AI-generated videos is out of place or features inaccurate symbols. On combat uniforms, rank insignia appears on a patch in the middle of the chest, but some videos show it off to the side or missing entirely.
Some videos still have a watermark indicating they were made with AI. One example is Veo, Google’s AI video creator. Watermarks can be cropped out, but another tell that a video was AI-generated is its length: Veo, for one, can typically make videos only up to eight seconds long.
"Be cautious of content that relies heavily on emotion but lacks specifics," Razsadin said.
Staff Writer Maria Briceño contributed to this report.
RELATED: Social media feeds are awash with Iran war misinformation. Here’s how to identify false imagery
Our Sources
Email interview, Mary Bennett Doty, associate director of programs at We the Veterans & Military Families, April 13, 2026
Emailed statement, Shannon Razsadin, CEO of the Military Family Advisory Network, April 14, 2026
Email interview, Gregory Daddis, history professor at Texas A&M University, April 14, 2026
Email interview, Daniel Schiff, assistant professor of technology policy at Purdue University, April 14, 2026
Email exchange with a TikTok spokesperson, April 15, 2026
Email exchange with a Facebook spokesperson, April 15, 2026
Email exchange with a YouTube spokesperson, April 15, 2026
YouTube video by @dutyfirst13, (archived) April 8, 2026
France 24, These videos of Ukrainian soldiers are deepfakes generated from the faces of Russian streamers, November 13, 2025
Reuters, How many people have been killed in the Iran war?, updated April 10, 2026
TikTok, Integrity and Authenticity, August 14, 2025
Task & Purpose, Military families face waves of AI videos meant to sow discord and tug at heartstrings, March 12, 2026
Facebook page, Brave Marine (archived), accessed April 15, 2026
TikTok account, Usa Soldier Life (archived), accessed April 15, 2026
Facebook page, us_militarry (archived), accessed April 15, 2026
TikTok account, cccarmen (archived), accessed April 15, 2026
Facebook page, The Americans (archived), accessed April 15, 2026
Facebook page, USA marines (archived), accessed April 15, 2026
Facebook page, Marvelous Marines (archived), accessed April 15, 2026
YouTube channel, Duty First, accessed April 15, 2026
TikTok account, killua.deals1, accessed April 15, 2026
Facebook page, US Soldier Legacy, accessed April 15, 2026
Facebook page, Battlefield Heroes, accessed April 15, 2026
TikTok video by us_militarry, accessed April 15, 2026
Fast Company, The most popular MAGA influencer you’ve never heard of is an AI foot fetish model, March 11, 2026
Washington Post, Thousands have swooned over this MAGA dream girl. She’s made with AI., March 20, 2026
Sygnia, Inside a Sophisticated Recovery Scam Network: Evidence from a Live Investigation into Legal Services Impersonation, February 5, 2026
U.S. Army, Army Uniform & Appearance Standards, accessed April 15, 2026
YouTube video by @dutyfirst13 (archived), April 12, 2026
Gemini, Break the silence with Veo 3.1, accessed April 14, 2026