If Your Time is short
- Within hours and without evidence, misinformers on X, Telegram and other social media and communications platforms coalesced around the idea that the Francis Scott Key Bridge in Baltimore collapsed because of a coordinated attack. Some users blamed Ukraine and Israel.
- We found that X subscribers paying for blue check marks that guarantee greater reach from the platform’s algorithm made nearly all of the most popular posts linking Israel or Ukraine to the bridge collapse.
- Pro-Russia accounts also promoted those narratives. A digital investigations company found that many accounts sharing these anti-Ukraine claims had no followers, suggesting they were created solely to spread the narrative.
At 1:29 a.m. March 26, the Dali, a large container ship, struck the Francis Scott Key Bridge in Baltimore, causing its collapse.
Within hours — as many Americans slept — misinformers on X and other platforms posted wild theories, unsubstantiated claims and speculation about who was to blame for the catastrophe.
Without evidence, misinformers coalesced around the idea that the bridge collapsed because of a coordinated attack. PolitiFact repeatedly saw social media users falsely assign blame to two nations: Israel and Ukraine.
"If you’re pro-Russia and anti-Ukraine, then it was a Ukrainian attack," said Mike Rothschild, a journalist and conspiracy theory expert who has written books about conspiracy theories.
As of April 1, there have been no credible reports or evidence that the ship’s collision with the bridge was linked to terrorism or an attack.
Nevertheless, the invented narratives proliferated — often customized to suit individual posters’ preexisting beliefs and brands, researchers said.
Sara Aniano, a disinformation analyst at the Anti-Defamation League’s Center on Extremism, said these claims often came from people who make it their literal business to spread conspiracy theories.
"A content creator who does makeup tutorials is not much different than the content creator who is selling conspiracy theories," Aniano said. "These theories and these events are the equivalent of their products."
We found that X subscribers paying for blue check marks that guarantee greater reach from the platform’s algorithm were responsible for nearly all of the most popular posts linking Israel or Ukraine to the bridge collapse.
X promotes subscribers’ posts even if they contain unverified or false information. The platform also shares ad revenue with "blue-check" subscribers, letting them earn profit when people interact with their posts.
Misinformation experts said conspiracy theories about the bridge collapse were widespread and reached a larger, more mainstream audience on X.
PolitiFact used advanced searches on X to analyze more than 100 posts and create a timeline of the anti-Ukraine and anti-Israel narratives that emerged immediately following the incident. Misinformation experts also shared some examples with us.
Here’s our timeline of the day’s events and examples of anti-Ukraine and anti-Israel claims:
1:29 a.m. The Francis Scott Key Bridge collapsed.
3:02 a.m. The earliest posts mentioning Ukraine or Israel did not immediately assign blame, but brought both countries into the bridge collapse discussion.
"Collapse of a Bridge in Baltimore after being hit by a ship. US infrastructures emblematic of a declining/collapsing empire," one paid X subscriber posted. "Money spent in endless wars, to finance Nazis in Ukraine and baby killers in Gaza rather than taking care of US citizens."
3:22 a.m. A paid X user with 185,000 followers and a Russian flag emoji in its name asked, "Did Israel just hit the US over not using the Veto power yesterday?"
The U.S. on March 25 abstained from voting on a United Nations Security Council resolution calling for an immediate ceasefire between Israel and Hamas. Three times prior, the U.S. vetoed similar resolutions.
(Screenshots from X.)
4:07 a.m. A paid X subscriber with 24,000 followers posted, "Israel cancels its visit to Washington after the US allows the UN Gaza cease-fire resolution to pass and then the Francis Scott Key Bridge in Baltimore is attacked? This is not a coincidence, nor was it an accident!"
7:21 a.m. Andrew Tate, a conservative internet personality facing rape, human trafficking and gang activity charges in Romania, told his 9 million X followers that the ship "was cyber-attacked," then claimed that "foreign agents of the USA attack digital infrastructure." We rated his claim False.
9:11 a.m. Alex Jones, a conservative radio host with 2.2 million X followers who is known for spreading conspiracy theories, reshared Tate’s post, adding that the incident "looked deliberate."
(Screenshots from X.)
9:17 a.m. A paid X user whose account description includes "tweeting for Palestine" replied to Jones’s repost of Tate’s post, expressing doubt that the ship’s collision with the bridge was a coincidence.
9:53 a.m. Federal and Maryland state officials said during a press conference that no credible information suggested a terrorist attack caused the collapse.
9:57 a.m. "Our supposed ‘friends’ from Ukraine are enjoying the news of 20 Americans missing after the collapse of the Francis Scott Key Bridge in Baltimore," one blue-check X subscriber posted. "They are claiming it’s punishment for not sending them more billions of our tax dollars."
9:59 a.m. A blue-check X account with nearly 56,000 followers that opposes Arizona Gov. Katie Hobbs, a Democrat, said "it’s not plausible" that the bridge collapse was accidental "during an election season and in the middle of two theaters or combat in Ukraine and Israel."
10 a.m. An X subscriber falsely claimed the container ship’s captain was Ukrainian.
"Here is information circulating regarding the container ship that hit the Francis Scott Key Bridge and its alleged drivers," read the post. "One is reportedly from Ukraine. The information does not account for any remote operating. Developing."
Posts shared at 10:11 a.m. and 10:24 a.m. used nearly identical language.
(Screenshots from X.)
The vessel was crewed by 22 Indian nationals, according to the ship’s management company.
10:24 a.m. A blue-check X subscriber whose account features hallmarks of the discredited QAnon conspiracy theory movement posted that something seemed "very off" about the collapse because of a "vessel operator" with "ties to Ukraine."
11:31 a.m. "The captain of the ship that hit the bridge in Baltimore is Ukrainian," wrote one blue-check subscriber with 362,500 followers.
12:46 p.m. President Joe Biden said the incident was "a terrible accident," adding that there was no indication it was caused by "any intentional act."
1:10 p.m. An X subscriber account whose name includes Nigerian and Russian flags posted that the Dali’s captain was "a citizen of Ukraine."
We found three more posts echoing the false Ukrainian captain theory before 3 p.m.
(Screenshot from X.)
2:51 p.m. DC Draino, a blue-check X subscriber who frequently shares misinformation to approximately 1.4 million followers, amplified claims that the collapse was an attack and questioned who was to blame: "Iran for our support of Israel? Russia for Biden’s support of Ukraine? China … just because?"
3:54 p.m. The news website Voice of Europe, which has 182,500 followers on X, posted that the captain "may be a citizen of Ukraine."
Less than 12 hours later, on March 27, the Czech Foreign Ministry announced that it had sanctioned the leader of Voice of Europe for using the site to spread anti-Ukrainian disinformation. As of April 1, Voice of Europe’s website had been taken down. The site’s X account — which has a gold X verification badge signaling that it is "an official organization on X" — temporarily stopped posting.
(Screenshot from X.)
6:45 p.m. "The Baltimore bridge terror attack stems from the United States didn’t veto a U.N. Resolution on the Gaza ceasefire," wrote a blue-check subscriber whose bio includes a Russian flag before the words "defeat NATO." "And the U.S. didn’t send Ukraine the $60 billion."
Some posts sharing anti-Ukraine or anti-Israel sentiment came from accounts that declared support for the conservative "MAGA" movement or used language linked to the QAnon conspiracy theory.
Pro-Russia accounts also promoted these narratives.
The 3:22 a.m. X post that questioned whether Israel "hit" the U.S. came from a user NewsGuard analyst Coalter Palmer described as a "notorious purveyor of misinformation related to the Russia Ukraine war." Palmer pointed to two other X posts in which that user falsely claimed the Bucha massacre was a false flag operation and that Ukraine is a "Nazi state."
Memetica, a digital investigations company that studies disinformation and violent extremism, found that the false Ukrainian ship captain claim was pushed by pro-Russia accounts and QAnon conspiracy theory promoters, said Adi Cohen, the company’s chief operating officer.
Looking at a sample of X posts from 10 a.m. to 5 p.m. March 26, Cohen said Memetica found that 7% of the accounts sharing that narrative had zero followers, suggesting what researchers call "inauthentic amplification" — accounts created solely to boost the narrative.
Cohen said that the Ukrainian captain claim was promoted by a known element of the Russian disinformation ecosystem, SouthFront, a website the State Department described in a 2020 report as "a multilingual online disinformation site registered in Russia."
Most researchers identified Telegram, 4chan and X as the places where this misinformation flourished most, crediting those platforms’ permissive policies about what can be posted and X’s reputation as the go-to platform for discussing breaking news.
It’s hard to definitively say where misinformation was worst, because not every platform shares the same data or is easily searchable, experts said.
Conspiratorial content might have been more contained to fringe platforms once, but such theories are now widespread on platforms including X and TikTok, said the ADL’s Aniano.
Memetica analysts observed conspiratorial content about the bridge collapse right away on all social media platforms, but especially X, Telegram and TikTok, Cohen said.
Misinformers can use events like the bridge collapse as "another plot point in their broader narrative that the mainstream media is not to be trusted, that our government is not to be trusted, that experts like us are not to be trusted, and that there is always an active attack against America happening," Aniano said.
In this image taken from video released by the National Transportation Safety Board, the cargo ship Dali is stuck under part of the structure of the Francis Scott Key Bridge after the ship hit the bridge, March 26, 2024, in Baltimore. (AP)
Once misinformers seize on an event, experts said, they often assign blame to entities — people, groups, countries — that have also been in recent news headlines.
"Given that the conflicts in the Middle East and Ukraine are ongoing and continued funding to support these various efforts remains a major wedge issue in the United States, it makes sense that they would become fodder for conspiracies and false claims," said Valerie Wirtschafter, a Brookings Institution fellow in foreign policy and the artificial intelligence and emerging technology initiative.
Wirtschafter said she suspects this "will likely continue to be the way that these types of narratives take shape — by leveraging prominent and polarizing political topics in times of uncertainty and incomplete information."
PolitiFact Researcher Caryn Baird contributed to this report.
RELATED: Edited Wikipedia entry doesn’t prove Israel caused the Baltimore bridge collapse
RELATED: No, the captain of the container ship that hit the bridge in Baltimore wasn’t Ukrainian
RELATED: Baltimore bridge collapse: A cyberattack, a movie and other false claims about the ship accident
Our Sources
Interview with Sara Aniano, a disinformation analyst at the Anti-Defamation League’s Center on Extremism, March 29, 2024
Interview with Adi Cohen, chief operating officer at Memetica, a digital investigations company that studies disinformation and violent extremism, March 29, 2024
Email interview with Christine Sarteschi, a professor of social work and criminology at Chatham University, March 31, 2024
Email interview with Valerie Wirtschafter, a fellow in the Brookings Institution’s foreign policy program and its artificial intelligence and emerging technology initiative, March 29, 2024
Email interview with Mike Rothschild, a journalist and conspiracy theory expert, March 30, 2024
Email interview with Coalter Palmer, an analyst at NewsGuard, April 1, 2024
Wired, Online Conspiracies About the Baltimore Bridge Collapse Are Out of Control, March 27, 2024
Axios, Misinformation runs rampant after Baltimore bridge collapse, March 26, 2024
Shayan Sardarizadeh post on X, March 26, 2024
NPR, The Baltimore bridge collapse gave conspiracy theorists a chance to boost themselves, March 27, 2024
CNN, How the Baltimore bridge collapse spawned a torrent of instant conspiracy theories, March 28, 2024
X Help Center, Ads revenue sharing, accessed March 28, 2024
Poynter, Why Twitter’s Community Notes feature mostly fails to combat misinformation, June 30, 2023
Elon Musk post on X, Oct. 29, 2023
Mashable, Most users on X never see Community Notes correcting misinformation, Nov. 30, 2023
USA Today, Baltimore bridge collapse was tragic enough. Then came the right-wing conspiracy theorists, March 27, 2024
The Washington Post, Republicans put forth unfounded and sometimes racist theories on bridge collapse, March 28, 2024
Agence France Presse, Baltimore bridge collapse sparks baseless attack theories, March 26, 2024
Politico, Russian influence scandal rocks EU, March 29, 2024
Reuters, Czechs sanction Medvedchuk, website over pro-Russian EU political influence, March 27, 2024
Euronews, European Parliament 'looking into' claims members were paid to spread Russian propaganda, March 29, 2024
Voice of Europe’s website, accessed March 29, 2024
USA Today, How Francis Scott Key Bridge was lost: A minute-by-minute visual analysis of the collapse, March 30, 2024
Governor Wes Moore’s post on Facebook, March 26, 2024
The Associated Press, Exhaustion, dwindling reserves and a commander who disappeared: How Ukraine lost Avdiivka to Russia, March 11, 2024
Reuters, UN Security Council demands immediate Gaza ceasefire after US abstains, March 25, 2024
The Washington Post, Israel cancels diplomatic visit to U.S. after U.N. vote demanding cease-fire, March 25, 2024
The White House, Remarks by President Biden on the Collapse of the Francis Scott Key Bridge, March 26, 2024
U.S. Department of State, GEC Special Report: Pillars of Russia’s Disinformation and Propaganda Ecosystem, August 2020
Gallup, American Views on the Ukraine War in 6 Charts, Nov. 2, 2023
PolitiFact, Lie of the Year 2022: Putin’s lies to wage war and conceal horror in Ukraine, Dec. 13, 2022
Shayan Sardarizadeh post on X, March 26, 2024
PolitiFact, What is a MAGA Republican? Sept. 21, 2022
PolitiFact, QAnon ideas proliferate in 2022 midterms, raising continued concerns of violence, Nov. 7, 2022
3:02 a.m. X post, March 26, 2024
3:22 a.m. X post, March 26, 2024
4:07 a.m. X post, March 26, 2024
7:21 a.m. Andrew Tate X post, March 26, 2024
9:11 a.m. Alex Jones X post, March 26, 2024
9:17 a.m. X post, March 26, 2024
9:57 a.m. X post, March 26, 2024
9:59 a.m. X post, March 26, 2024
10 a.m. X post, March 26, 2024
10:11 a.m. X post, March 26, 2024
10:24 a.m. X post, March 26, 2024
10:24 a.m. X post, March 26, 2024
11:31 a.m. X post, March 26, 2024
1:10 p.m. X post, March 26, 2024
1:32 p.m. X post, March 26, 2024
1:43 p.m. X post, March 26, 2024
2:20 p.m. X post, March 26, 2024
2:51 p.m. X post, March 26, 2024
3:54 p.m. X post, March 26, 2024
6:45 p.m. X post, March 26, 2024