MAGA Got Played by AI
The “Emily Hart” story is absurd. Turns out the self-proclaimed watchdogs of “fake news” were not especially hard to fool.
📌 NOTE FOR NEW READERS: This is an independent publication covering 50501, No Kings, and the broader pro-democracy and civic-action ecosystem. Subscribe to join our community.
On Monday, April 21, WIRED published an investigation by reporter EJ Dickson revealing that a popular pro-MAGA Instagram influencer called “Emily Hart” was not a real person.
She was an AI-generated persona, created and operated by a 22-year-old medical student in northern India who goes by the pseudonym “Sam.”
The account, @emily_hart.nurse, presented itself as a twentysomething registered nurse.
Her look was reportedly modeled on Jennifer Lawrence.
Her feed was filled with AI-generated images of her ice fishing, drinking Coors Light, and shooting at a rifle range, paired with emoji-heavy captions pushing pro-Christian, pro-Second Amendment, anti-abortion, and anti-immigration messages.
One of her posts read: “If you want a reason to unfollow: Christ is king, abortion is murder, and all illegals must be deported.”
Another post: “POV: You were assigned intelligent at birth, but you identify as liberal.”
Within a month, the account had more than 10,000 followers, and Sam told WIRED that his Reels were getting millions of views.
If you’re looking for the account now, you won’t find it.
WIRED reported that Instagram had already banned the Emily Hart profile in February for “fraudulent” activity, while the related Facebook page was still active at the time of publication. Follow-up coverage from The Independent and others later reported that the Facebook page also disappeared.
According to WIRED’s reporting, Sam told the outlet he had previously tried making money online through YouTube Shorts and selling study notes to other medical students. None of it worked.
The idea for Emily Hart came while scrolling through Instagram. He initially posted generic AI-generated images of attractive women using Google’s Gemini tool, but they did not gain traction. So he asked the AI for strategic advice.
According to a transcript Sam shared with WIRED, Gemini described the conservative niche as a “cheat code,” noting that “the conservative audience (especially older men in the US) often has higher disposable income and is more loyal.”
Google disputed the political framing in follow-up coverage, saying Gemini is designed not to favor any political ideology and that the prompts were about how to reach an audience with specific political views.
Sam took the advice and built Emily Hart from scratch.
He created the nurse backstory, generated the all-American lifestyle imagery, and posted content daily. He described his approach to WIRED as posting “rage bait” and noted that even when liberals visited the page to leave negative comments, the engagement helped push content to more people.
WIRED reported that Sam said he made a few thousand dollars a month from Fanvue subscriptions and MAGA-themed merchandise, and later said he made a few thousand dollars in just a few days from more explicit content on the same platform. Fanvue, a competitor to OnlyFans, allows AI-generated content on its site.
He also tried a liberal version of the persona. It flopped.
“Democrats know that it’s AI slop, so they don’t engage as much,” he told WIRED.
Emily Hart was not the only synthetic conservative influencer making the rounds.
In March, The Washington Post reported on “Jessica Foster,” an AI-generated persona presented as a patriotic Army soldier and Trump supporter.
That account gained more than a million followers in just four months, posting fabricated images of the character in combat gear, posing next to F-22 fighter jets, and walking the tarmac alongside President Trump.
Fast Company reported that the creator even posted fake images of Foster attending a White House reception for an MLS championship team.
The account funneled followers toward an OnlyFans page, with the bio reading: “Public servant by day, troublemaker by night.” That account has since been removed.
If one person can do this as a side hustle, imagine what coordinated networks with funding and geopolitical intent can build, or may already have built.
Alethea, a threat-intelligence firm, documented a coordinated TikTok network of AI-generated videos showing fake Iranian female fighter pilots, designed to project military strength and co-opt dissident hashtags.
That campaign accumulated over 25 million views in a matter of days.
As the BBC first reported and The Washington Post later discussed, Iran bans women from combat roles entirely, which makes the fabrication easier to identify but no less effective at scale.
Meta says it applies “AI info” labels based on detected signals or self-disclosure, and has shifted its manipulated-media approach to rely more on labels and context rather than removal. But Emily Hart ran for months without any AI label. Jessica Foster hit a million followers before anyone intervened. The enforcement is not matching the scale of the problem.
The internet we are navigating is now one where political identity is farmed, optimized, and sold to people who mistake aesthetic familiarity for authenticity. It’s not enough anymore to ask whether a post is true. We now have to ask whether the person posting it is real and whether the entire account exists to extract clicks, cash, and influence from people who never thought to question it.
The scam is no longer just that people lie online but that online life is being rebuilt so that lying scales better.
And yes, there is something hilarious about a movement that prides itself on seeing through deception getting played this easily. A few prompts, some generated photos, a flag emoji, a cross necklace, and a Coors Light. That was all it took.
But once fake people can build trust, raise money, hold a conversation, and influence real voters, we are not dealing with catfishing anymore. We are dealing with industrialized political fiction.
If nothing else, this story is a reminder for all of us. If an online account looks like it was engineered in a lab to confirm everything you already believe, that is a product. And you are not the customer. You are the revenue stream.
Share this story as a warning. Because what built “Emily Hart” is available to anyone, the next synthetic influencer could target any audience, and the only people who are protected are the ones who learned to ask questions before they hit follow.
Have you noticed more obviously fake political accounts lately, or are they getting harder to spot? What worries you more: fake posts, or fake people building an audience?




