In recent years, Twitter and Facebook removed more than 150 fake personas and media sites created in the United States, Internet researchers Graphika and the Stanford Internet Observatory revealed last month. Although the researchers did not attribute the fake accounts to the US military, two officials familiar with the matter said that US Central Command is among those whose activities are being probed. Like others interviewed for this report, they spoke on the condition of anonymity to discuss sensitive military operations.
The researchers did not say when the takedowns occurred, but people familiar with the matter said they took place over the past two or three years. Some accounts were recent, the researchers said, and included posts from this summer advancing anti-Russian narratives, citing the Kremlin’s “imperialist” war in Ukraine and warning of the conflict’s direct impact on Central Asian countries. Significantly, they found that the fake personas, a tactic used by countries such as Russia and China, did not gain much traction, and that overt accounts actually attracted more followers.
Centcom, which is headquartered in Tampa, has military operations in 21 countries in the Middle East, North Africa and Central and South Asia. A spokesperson declined to comment.
Pentagon press secretary Brig. Gen. Patrick Ryder of the Air Force said in a statement that the military’s information operations “support our national security priorities” and must be conducted in compliance with relevant laws and policies. “We are committed to implementing those safeguards,” he said.
Spokespersons for Facebook and Twitter declined to comment.
According to the researchers’ report, the deleted accounts included a Persian-language media site that shared content reposted from the US-funded Voice of America Persian and Radio Free Europe. Another, the researchers said, was linked to a Twitter handle that in the past had claimed to be operated on behalf of Centcom.
According to the report, one fake account posted an inflammatory tweet claiming that relatives of deceased Afghan refugees had reported that the bodies had been returned from Iran with organs missing. The tweet linked to a video that was part of an article posted on a US-military-affiliated website.
Centcom has not commented on whether these accounts were run by its personnel or contractors. A defense official said that if the organ-harvesting tweet is shown to have come from Centcom, it would be “absolutely a violation of doctrine and training practices.”
Independently of the report, The Washington Post has learned that in 2020 Facebook disabled fictitious personas created by Centcom to counter disinformation spread by China suggesting that the coronavirus responsible for COVID-19 was created in a US Army laboratory at Fort Detrick, Md., according to officials familiar with the matter. The pseudo-profiles – active in Facebook groups that conversed in Arabic, Farsi and Urdu, officials said – were used to amplify truthful information from the US Centers for Disease Control and Prevention about the virus’s origin in China.
The US government’s use of ersatz social media accounts, though authorized by law and policy, has stirred controversy within the Biden administration, with the White House pressing the Pentagon to clarify and justify its policies. The White House, the State Department and even some Defense Department officials are concerned that the policies are too broad, allowing leeway for tactics that, even when used to spread truthful information, risk undermining US credibility, several US officials said.
“Our adversaries are absolutely operating in the information domain,” a second senior defense official said. “There are some who think we should not do anything clandestine in that space. But it would be unwise to cede an entire domain to an adversary. We need strong policy guardrails.”
A spokesman for the National Security Council, which is part of the White House, declined to comment.
Kahl disclosed his review at a virtual meeting convened by the National Security Council on Tuesday, saying he wanted to know what types of operations have been carried out, whom they are targeting, what tools are being used, why military commanders have chosen those tactics, and how effective they have been, several officials said.
The message was essentially, “You have to tell me why you’re doing these kinds of things,” the first defense official said.
Pentagon policy and doctrine discourage the military from lying, but there are no specific rules mandating the use of truthful information in psychological operations. For example, the military sometimes uses imagery and satire for purposes of persuasion, but generally the messages must stick to facts, officials said.
In 2020, Facebook and Twitter officials contacted the Pentagon to raise concerns about fake accounts they had removed and suspected were linked to the military. That summer, David Agranovich, Facebook’s director for global threat disruption, spoke to Christopher C. Miller, then the assistant director for Special Operations/Low Intensity Conflict, which oversees influence operations policy, warning him that if Facebook could sniff out the accounts, so could US adversaries, several people familiar with the conversation said.
“His message,” said one person, “was ‘Guys, you got caught. That’s a problem.’”
Before Miller could act, he was tapped to head a separate agency, the National Counterterrorism Center. Then came the November election, and time ran out for the Trump administration to address the matter, though Miller spent the final few weeks of Donald Trump’s presidency as acting secretary of defense.
With the rise of Russia and China as strategic competitors, military commanders have wanted to fight back, including online. And Congress supported them. Frustrated by perceived legal barriers to the Defense Department’s ability to conduct clandestine activities in cyberspace, Congress passed a law in late 2019 affirming that the military could operate in the “information environment” to defend the United States and push back against foreign disinformation campaigns aimed at undermining its interests. The measure, known as Section 1631, allows the military to carry out clandestine psychological operations without crossing into what the CIA has claimed as its covert authority, alleviating some of the friction that previously hindered such operations.
“The combatant commanders got really excited,” recalled the first defense official. “They were very eager to use these new authorities. The defense contractors were equally eager to land lucrative classified contracts to enable covert influence operations.”
At the same time, the official said, military leaders were not trained to supervise “technically complex operations conducted by contractors” or to coordinate such activities with other stakeholders elsewhere in the US government.
Last year, with a new administration, Facebook’s Agranovich tried again. This time he took his complaint to President Biden’s deputy national security adviser for cyber, Anne Neuberger. According to people familiar with the exchange, Agranovich, who worked at the NSC under Trump, told Neuberger that Facebook was removing fake accounts because they violated the company’s terms of service.
The accounts were easily traced by Facebook, which since Russia’s campaign to interfere in the 2016 presidential election has increased its ability to identify fake personas and sites. In some cases, the company had removed profiles that appeared to be linked to the military and that promoted information deemed false by fact-checkers, a person familiar with the matter said.
Agranovich also spoke to officials at the Pentagon. His message was: “We know what DoD is doing. It violates our policies. We will enforce our policies,” and so “DoD should shut it down,” said one US official informed about the matter.
In response to White House concerns, Kahl ordered a review of military information support operations, or MISO, the Pentagon’s term for psychological operations. According to officials, a draft concluded that policies, training and oversight all needed to be tightened, and that coordination with other agencies, such as the State Department and the CIA, needed to be strengthened.
The review also found that cases in which the military pushed false information were the result of inadequate oversight of contractors and training of personnel, not of systemic problems, officials said.
The Pentagon’s leadership did little with the review, two officials said, until Graphika and Stanford published their report on August 24, which sparked a flurry of news coverage and questions for the military.
The State Department and the CIA have been troubled by the military’s use of clandestine tactics. “Hey, don’t undercut our policies by using fake personas, because we don’t want to be seen as pushing phony grassroots efforts,” the first defense official said, describing their concerns.
One diplomat put it this way: “Generally speaking, we should not adopt the same tactics our adversaries are using, because the bottom line is that we have the moral high ground. We are a society built on a certain set of values. We promote those values around the world, and when we use tactics like these, it undermines our argument about who we are.”
Psychological operations to promote American narratives abroad are nothing new for the military, but the worldwide popularity of Western social media has led to an expansion of tactics, including the use of artificial personas and imagery, sometimes called “deepfakes.” The rationale is that views expressed by what appears to be an Afghan woman or an Iranian student may be more persuasive than the same views openly pushed by the US government.
Officials said most of the military’s influence operations are overt, promoting US policies in the Middle East, Asia and elsewhere. And there are legitimate reasons to use clandestine tactics, such as trying to infiltrate a closed terrorist chat group, they said.
A major issue now for senior policymakers is determining whether the military’s execution of covert influence operations is delivering results. “Is the juice worth the squeeze? Does our approach really have the potential for the return on investment we expected, or is it just creating more challenges?” one person familiar with the debate said.
The Graphika and Stanford report suggests that the covert activity did not have much impact. It noted that “most posts and tweets” reviewed received “no more than a handful of likes or retweets,” and that only 19 percent of the fabricated accounts had more than 1,000 followers. “Tellingly,” the report said, “the two most-followed assets in the data provided by Twitter were overt accounts that publicly declared ties to the US military.”
Covert influence operations have a role in supporting military operations, but it should be a narrow one, with “intrusive oversight” by military and civilian leadership, said Michael Lumpkin, a former senior Pentagon official who handled information operations policy and a former head of the State Department’s Global Engagement Center. “Otherwise, we risk making more enemies than friends.”
Alice Crites contributed to this report.