Nick Shirley’s Somali Daycare Fraud Video Is Bullsh*t. Here’s Why It Worked Anyway.
A breakdown of the disinformation tactics used in the viral video.
If you have been on the Internet in the past week, you’ve almost certainly run into content creator Nick Shirley’s video showing a series of Somali-run daycare centers in Minnesota, which he alleges are fraudulently taking millions of dollars in government funding without providing the services they claim to offer. The footage quickly went viral on social media, attracting attention from Elon Musk, the FBI director, the Vice President of the United States, and the White House. Just days after the video was posted on YouTube, the Department of Health and Human Services announced it is freezing all child care payments for Minnesota, while the Department of Homeland Security and the Federal Bureau of Investigation both said they’re surging resources to investigate the matter.

The truth, as usual, got lost somewhere in the hype.
Shirley’s video has now been pretty thoroughly taken apart by numerous news organizations, and many of his core claims have been debunked or at least called into serious doubt. In one case, Shirley arrived when the facility was closed. In another, security footage from the daycare center shows children being dropped off on the same day Shirley was there claiming that no children were anywhere to be seen. In two other cases, Shirley showed up to non-operational child care facilities.
While there have been proven and prosecuted cases of fraud in publicly funded programs in Minnesota, Shirley’s video doesn’t prove that it’s happening at the daycares he visited. So why do so many people think they’re seeing fraud when they watch the footage? Because that’s what he told them they were seeing. Then he rolled out some tried-and-true tactics straight from the disinformation playbook and waited for the algorithm to do its thing.
Let’s walk through how he did it.
Narrative Preemption
Shirley’s first move was to establish a core narrative from the outset, forcing everyone who came after him onto the defensive: debunking something people already believe is much harder than setting the narrative in the first place. Although Shirley was not the first to cover fraudulent government payments in Minnesota, the existing coverage never coalesced around a strong narrative, and it did not tell a story as simple and neatly packaged as the one Shirley offered.
This tactic, often used in influence campaigns, is known as narrative preemption—getting your version of events out first so everything that follows is interpreted through your lens.

Shirley’s video establishes, from the opening moments, a clear thesis: Somali daycare owners are committing fraud at scale, and the system enables it. Viewers are not invited to explore a question; they are walked through a conclusion. The title of his video, after all, is “I Investigated Minnesota’s Billion Dollar Fraud Scandal.” The conclusion was already made before he ever filmed a single childcare center. Once that frame was set, any subsequent details—numbers, anecdotes, interviews—got subconsciously filtered to support it.
This is particularly powerful on platforms like YouTube, where recommendation algorithms reward early engagement and where corrective reporting almost always comes much later, if at all.
Selective Editing
Shirley’s video also featured one of the defining characteristics of orchestrated disinformation, which is the use of selective editing rather than fabrication. The video includes footage taken at real daycare centers; documents and figures from real publicly funded programs; real instances where oversight has fallen short; and real quotes or interactions with real employees of the centers.
None of this requires fabrication or invention. The deception emerges from what is left out: comparative data, explanations of editorial choices, and context about how common (or uncommon) fraud actually is across childcare providers of all backgrounds. He also failed to mention basic details such as what time he visited the facilities and what their operating hours were.

If viewers are shown three questionable examples, but never told how many compliant providers exist—or how fraud rates compare across demographics—the mind fills in the gap. Humans are pattern-seeking; three anecdotes can easily feel like a trend when presented back-to-back. That’s how he created the impression that these childcare facilities were all empty that day, even though security footage shows otherwise.
This is how disinformation operates while remaining technically “real.”
Implied Guilt Through Visual Framing
Visual media allows for implication without assertion—a powerful legal and rhetorical weapon.
In Shirley’s video, visual effects like lingering shots of neighborhoods, mosques, and Somali-owned businesses, slow zooms on paperwork, and reaction shots of the creator looking skeptical or incredulous all work together to imply wrongdoing without stating it outright.
Crucially, the audience experiences the emotions of discovery and suspicion, even if the narrator avoids explicit claims that could be challenged. This is persuasion, not journalism — and it’s a tactic Shirley has used previously. During a trip to Kyiv, Ukraine, in 2024, Shirley used his footage to imply that American tax dollars were being used to buy luxury cars for corrupt Ukrainians. In both instances, Shirley used imagery of expensive vehicles and intentionally-worded captions to suggest that fraud was taking place while maintaining some degree of plausible deniability. Note, too, the use of facial expressions to say what he won’t state explicitly.

Gotcha-Style Interviews
Another recurring tactic is the asymmetrical interview: an unscripted and unprepared individual confronted by a creator who controls the camera, the edit, and the narrative arc.
When daycare operators or community members appear hesitant, confused, or defensive on camera, that reaction is framed as suspicious rather than human. Language barriers, cultural differences, or discomfort with being filmed are not acknowledged; they are instead recast as further evidence that the person is hiding something.

Shirley’s video also presents it as suspicious that daycare operators did not welcome him and his film crew inside when they showed up unannounced, some with faces covered. Even at a business doing everything right, it seems unlikely that Shirley would have received a warm reception given the circumstances. (Having just enrolled two children in preschool within the past few years, I can say that in my experience, appointments needed to be arranged in advance in order to enter a daycare facility to discuss enrollment.)
Credibility Laundering
Shirley also adopted the aesthetics of journalism, effectively laundering the credibility that people assign to investigative journalism and using it to produce propaganda. Because the video appears investigative—featuring on-the-ground footage, documents, and first-person narration—it borrows the visual language of journalism but without having to adhere to journalistic standards.

There is no transparent methodology, no clear sourcing hierarchy, no meaningful right of reply, and no proportionality in the video coverage. Yet to many viewers, the aesthetic of investigative journalism substitutes for the substance.
This is credibility laundering: using the form of accountability journalism to deliver partisan content.
In-Group vs. Out-Group Appeals
One of the most consequential—and least explicitly acknowledged—tactics in Shirley’s video is its reliance on in-group appeals to frame the alleged misconduct as a moral violation against “everyday Americans,” rather than as a narrow policy or oversight issue.
The video repeatedly centers an implicit “us”: taxpayers, workers, parents, people who “play by the rules.” This group is never precisely defined, but it is culturally legible to the audience Shirley courts.
Against that in-group, the video constructs an out-group—Somali daycare owners—who are portrayed not merely as potential bad actors, but as outsiders benefiting from a system they did not earn and do not respect. Fraud—real or alleged—is no longer just illegal behavior; it becomes a symbolic theft from the moral community. That framing activates grievance politics, even among viewers who might otherwise support public childcare funding or social programs.


Critically, this tactic also preempts empathy. Once the audience has accepted the in-group/out-group frame, contextual explanations—language barriers, regulatory complexity, systemic underfunding, or uneven oversight—are easily dismissed as excuses made on behalf of “them,” not explanations that serve “us.” And history shows that when disinformation is framed as an attack on the in-group, corrections feel less like facts—and more like betrayal.
In that sense, the video is not just about fraud. It is about who belongs, who deserves trust, and who is presumed guilty by default. That is what makes this genre of content so potent—and so difficult to counter once it has taken hold.
Appealing to Emotions and Preexisting Biases
One of the things that a lot of people have trouble understanding about disinformation is that it’s not actually about the facts. It’s about presenting a version of reality that feels true, even though it’s not. The easiest way to do this is to show people something that confirms what they already believe or want to believe.
In this case, Shirley appealed to his audience’s preexisting beliefs about immigrants, Muslims, and government waste. Quiet buildings, locked doors, confused or hostile reactions, and repeated references to public money were arranged to match the expectations of people who already view immigrants and Muslims as undeserving recipients of their tax dollars.
None of the footage was fabricated, but instead it was selected and framed to align with the fears and resentments that his viewers already held. That alignment creates emotional validation, which makes viewers less likely to question context, scale, or alternative explanations—a core disinformation tactic.
Yes, Real Footage Counts as Disinformation
Now that we’ve gone through the tactics Shirley used, I want to preemptively address an argument that people always make in cases like this: Disinformation doesn’t mean fake.
More often than not, disinformation looks exactly like what we see here: real footage, real money, real places, and real people. The label of disinformation gets applied because of the way that footage is strung together, the choices that were made about what to include and not include, and the content and style of the narration.
Disinformation is not defined by falsity alone. It is also defined by intent and effect. When real facts are stripped of context, assembled to imply a conclusion that isn’t supported by the evidence presented, and targeted at reinforcing stereotypes or political narratives, the result is a materially false understanding of reality. Think about it like a map: every detail can be geographically correct, but it can still lead you in the wrong direction.
Why Legacy Media and Institutions Are Losing This Fight
Frustratingly, traditional institutions are not well equipped to handle disinformation. When it comes to content like this, most major news organizations are still fundamentally reactive. They typically lose the narrative race and respond to incidents like this with fact checks and defensive messaging that implicitly accepts the initial frame.
Meanwhile, creators like Shirley operate with speed, emotional clarity, and no obligation to proportionality or fairness. By the time officials or journalists respond, the audience has already internalized the story.
Worse, political actors swept up in the controversy often avoid engagement entirely, fearing that rebuttal will amplify the claim—a valid concern in some instances, but not for a story of this magnitude, where silence simply creates a vacuum that disinformation eagerly fills.
How to Get Ahead of This Playbook
If institutions want to counter this genre of disinformation, they must shift from defense to preemption. That means getting ahead of controversy whenever possible: preemptively publishing content in easily digestible formats, and hiring experienced analysts (ahem) to monitor adversarial information spaces and proactively identify emerging disinformation narratives. Detecting deceptive narratives early buys time to publish the content and context that will otherwise be missing when the disinformation hits the mainstream. This could be a game changer in terms of thwarting disinformation campaigns, and it is baffling to me that it is so rarely done, or at least so rarely done effectively.
Getting ahead of the disinformation playbook used by people like Shirley also requires adapting to an attention economy that values entertainment and engaging storytelling. Longform investigative journalism will always play a crucial role here, but if YouTube clips are driving engagement and shaping perceptions, then the corrective narrative should also be told via visual storytelling and multimedia content.
Finally, instead of just publishing fact-checks and counter narratives, it’s also important to expose the tactics being used. Regardless of ideology, people don’t like being lied to or deceived, and it can be very powerful to show people exactly how they’re being misled. This also avoids the rarely-effective strategy of trying to use facts to debunk narratives that appeal to emotions, not logic.
Institutions and defenders of reality need to learn to compete on these new terms—without abandoning their principles—or risk losing even more ground to creators who understand that the most powerful lies are very often carved out of truth.