Hacking, Extortion, Election Interference: These Are the Tools Used by Israel’s Agents of Chaos and Manipulation
Revealed: The cyberweapons a key Israeli company
offers its clients in order to corrupt democracies across the globe
Feb 15, 2023
“Thank you, my love. It was a wonderful night. Waiting
for you whenever you come back,” read the note attached to the sex toy hidden
inside the Amazon delivery box. It was allegedly sent by Shannon Aiken, a
38-year-old Washington woman, to the home of a politician at the height of an
election campaign. His wife received the package. The politician slept in his
office for two days. A crew was secretly sent to film him there, and the embarrassing
footage leaked online.
Aiken appears real. She is active on Facebook and
Twitter and has a real Gmail account and an active WhatsApp, both linked to a real
phone number. Her Amazon account is connected to a credit card, while real funds from her digital wallet paid for the package purportedly containing a dildo, condoms, and lubricants.
However, despite her impressive digital footprint,
Aiken is not real. She is an avatar: a complex, fake digital persona built to
mimic human behavior and avoid detection – not a bot, but a “cyborg.”
It is impossible to know if her story is true. In one
version we were told, the politician returned from a trip to the Gulf prior to
receiving the package; in another, from Europe. But that is exactly the point.
Tal Hanan is the man who told us the different
versions of Aiken’s story and the person in command of an army of fake social
media accounts.
Hanan is the head of a business that offers dark
services to the highest bidder: digital
surveillance, hack-and-leak smear campaigns, influence operations,
disinformation, and election interference and suppression.
Foreign Information Manipulation and Intervention, or
FIMI, is increasingly considered a threat to national security and
international stability. It is usually attributed to states or state agencies
in countries such as Russia or China. Hanan, a private actor, and his team
claim to have been active for more than a decade and boast of meddling in 33
presidential elections – 27 of them successfully.
This is the second story in a global investigation
into the disinformation-for-hire industry.
It was conducted as part of the Story Killers project led by Forbidden Stories,
a Paris-based consortium of international journalists who pursue the work of
assassinated or threatened journalists.
An undercover investigation led by journalists from TheMarker, Radio France, and Haaretz in Israel resulted in a series of secretly recorded meetings with Team Jorge, as the group calls itself. Hanan (who introduced himself as Jorge in our meetings) and his associates thought they were meeting with representatives of a potential client from an African country.
The client’s goal: postponing the elections there,
perhaps indefinitely, without cause. The price: 6 million euros (nearly $6.5
million).
Over the course of six meetings, held online and in
person, the intermediaries – actually, three undercover reporters – were
pitched by Team Jorge. The recorded materials were then investigated over six
months by the project’s members: more than 100 reporters worldwide from outlets including Le Monde, Der Spiegel, Die Zeit, The Guardian, Paper Trail Media,
El Pais, and the investigative journalist group OCCRP.
The Team Jorge brand is at the center of this global
investigation. This report reveals the dangerous mix of technology and
tradecraft the group offers. It meshes the worlds of active intelligence and
psychological warfare into a one-stop-shop that includes hacking for hire,
digital spying and surveillance, never-before-seen technologies for the mass
creation and deployment of fake accounts, and tools for social media
manipulation and the dissemination of disinformation online.
These are sold together, as a service, to the highest
bidder, including political actors – but also private ones like embattled
oligarchs and cryptocurrency dealers – looking to use influence tools to smear
rivals or hype digital currencies.
Election interference – a guide
Every influence operation, Hanan explained during our
first meetings, has three stages: gathering intelligence, constructing a
narrative, and deploying for maximum impact.
For the first stage, Team Jorge offers a dizzying
array of intelligence skills, including hacking and cyber offense. Once targets
are selected (for example, officials in a rival political campaign), Team Jorge
claims they can hack into their emails and messaging apps like Telegram – both
troves of valuable information. At a later stage, the hacked materials can be manipulated, forged, or even leaked. Hanan called this “active intelligence.”
“An email [account] is not just email today, right?
You have contacts, and you have a Drive. Let’s see what he has on his Drive,” Hanan
said as he scrolled through his victim’s Google account.
“Nick” – aka Zohar, Tal’s brother – chimed in during
the face-to-face meeting we had with the team. He explained that sometimes
people keep their passwords in their drafts or in notes that are backed up on
their email, providing more actionable intel. “Sometimes people [save] their
password in [the] Notes [app],” he explained. “So we taking the notes, we
taking the passwords … even to the bank,” he added in his Israeli-accented
English.
The Hanan brothers would go on to explain how their
hacking capabilities work – stunning the experts.
They take advantage of a known loophole in the
international cellular network, in what experts say is a new, creative, and
disconcerting way. With the help of an official telecom provider – usually accessed through nefarious local cooperation – and a little
social engineering, they claim to be able to find almost any phone in the world
and even intercept its data to gain access to many of its apps.
“It's $50,000 a number,” Zohar Hanan said during our
last meeting, admitting the group also offers general hacking-for-hire
services. He suggested different methods for payment, but his brother shouted
out from his office: “Crypto! Crypto!”
Profiler
Team Jorge’s mobile hacking abilities, like most of its intelligence gathering, are based on a program called Profiler.
Profiler is an open-source intelligence and web intelligence (OSINT and WEBINT) tool that builds a full intelligence profile on a target by scraping known online databases. It also seems able to collect data from less legitimate sources: it can match names to phone numbers that should not be public.
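We did not see Profiler’s code, but conceptually this kind of OSINT aggregation amounts to folding records from many databases – public and not so public – into a single profile per target. The following is a minimal Python sketch of that idea only; the source names, fields, and data are hypothetical and do not describe Profiler itself.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class TargetProfile:
        """Aggregated view of a target, assembled from multiple lookups."""
        name: Optional[str] = None
        phone: Optional[str] = None
        emails: set = field(default_factory=set)
        handles: set = field(default_factory=set)
        sources: list = field(default_factory=list)   # where each datum came from

    def merge(profile, record, source):
        """Fold one source's record into the running profile (hypothetical schema)."""
        profile.name = profile.name or record.get("name")
        profile.phone = profile.phone or record.get("phone")
        profile.emails.update(record.get("emails", []))
        profile.handles.update(record.get("handles", []))
        profile.sources.append(source)
        return profile

    # Each record would come from a separate database lookup, public or otherwise.
    p = TargetProfile()
    p = merge(p, {"name": "Jane Doe", "emails": ["jd@example.com"]}, "public_registry")
    p = merge(p, {"phone": "+1 202 555 0100", "handles": ["@jdoe"]}, "non_public_source")
    print(p)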
For example, Tal Hanan claimed, feeding Profiler a target’s phone number allows Team Jorge to find the SIM card’s ID – its “international mobile subscriber identity,” or IMSI. This is then used to geolocate the phone via the international roaming system, and perhaps even to intercept its communications by exploiting that same signaling network.
Unlike spyware such as the NSO Group’s Pegasus, this type of attack leaves no forensic traces on the device itself.
Hanan revealed how Profiler served them in a project
in the Gulf that is best described as political persecution. A local sheik
gave Team Jorge an anonymous Twitter account and, with Profiler’s help, they
managed to trace it back to its actual owner – thus revealing the dissident's
identity.
Global Bank Scan
Another tool in Team Jorge’s intelligence arsenal is
Global Bank Scan. Hanan said this could create a full financial profile on
targets – including secret offshore accounts and wire transfers – by accessing
a global banking database. This is likely false.
According to sources, in more than one instance
clients he supplied with financial data suspected that the information was
bogus.
D-Day: Hard-core release
Intel collection is just the first phase in Team
Jorge’s disinformation operations. Dirt is collected to help their client reach
their goal – whether that be smearing a competitor, suppressing voter turnout, or even wreaking havoc during Election Day in Africa.
“D-Day,” as they term Election Day, can also require
active measures: from cellphone attacks to “killing” the internet as voting
begins, per a fast-cut video clip Team Jorge showed to potential clients.
“Electronic warfare, let’s call it,” said Hanan,
elaborating on what their presentation dubs “voter suppression” services. These
also include “blame game” tactics and something they call “hard-core release.”
These include targeting rivals’ phones, as well as wide-scale cyberattacks against government websites, such as DDoS (distributed denial-of-service) attacks, which overwhelm a site with traffic and knock it offline.
Many types of attack are offered as part of Team Jorge’s “disruption” services – but their underlying goal is nearly always the same: lowering turnout, casting doubt on potentially unfavorable results, and generally undermining trust in the democratic process.
These attacks are “evidence of the bleeding of
state-sponsored tactics of cyberwarfare into the hands of for-hire cyber mercenaries,”
said Dr. Nir Grinberg, an assistant professor at Ben-Gurion University of the
Negev, Be’er Sheva. He previously published one of the definitive studies about
disinformation on Twitter during the 2016 U.S. presidential election.
“The outsourcing of disinformation
campaigns is a new concept,” he said – specifically
“the bundling of cybercrime capabilities for manufacturing an alternative
reality with an elaborate apparatus to promote this disinformation online.”
The combination of intelligence, influence, and
political strategy services is what made Team Jorge’s pitch unique: “It’s not
PR work; it’s intelligence work,” Hanan said during one Zoom presentation. “PR
is just one means to get information out. The key to intelligence is that
there is no one solution … and nothing works 100 percent. Okay. Let me show you
what does work.”
Advanced Impact Media Solutions
At the heart of Team Jorge’s service stands Advanced
Impact Media Solutions (AIMS), an advanced disinformation system
that this investigation is revealing for the first time.
A software system capable of creating and deploying
fake accounts across social media at scale and without detection, AIMS was
developed by Team Jorge to serve as a versatile online influence and social
media manipulation tool.
“What is fake news?” Hanan asked rhetorically during
one meeting. “Fake news is when people do believe it. Not because it’s reality
or not reality. The question is credibility.”
And Hanan has the necessary tools to create and
manipulate credibility.
A mass avatar management system, AIMS allows for real
accounts to be created for nonexistent people. These can then be deployed
either as a swarm – similar to a network of bots – or as single agents.
“Let’s make one together,” Hanan said
enthusiastically, opening AIMS’ “Create Profiles Wizard” – an interface that
creates new avatars per a client’s needs.
With one click, AIMS generates a new name, ethnicity,
nationality, language, hometown, and more for the new avatar, based on the
campaign’s location or needs. After picking a name, AIMS then offers an entire
set of photos for use.
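Conceptually, the wizard resembles locale-aware test-data generation: pick a country, then draw a plausible name, language, and hometown from pools tied to that locale. The following is a minimal sketch of the idea, not AIMS’ actual code, which we never saw; the locale pools here are invented for illustration.
    import random
    from dataclasses import dataclass

    # Tiny, hypothetical locale pools; a real system would draw on far larger datasets.
    LOCALES = {
        "UK": {"first": ["Sophia", "Amelia", "Oliver"], "last": ["Wilde", "Hughes", "Clark"],
               "language": "en-GB", "hometowns": ["Leeds", "Bristol", "Glasgow"]},
        "BR": {"first": ["Ana", "Lucas"], "last": ["Silva", "Santos"],
               "language": "pt-BR", "hometowns": ["Recife", "Curitiba"]},
    }

    @dataclass
    class Avatar:
        name: str
        nationality: str
        language: str
        hometown: str

    def create_avatar(country):
        """Generate a persona shell matched to a campaign's location."""
        pool = LOCALES[country]
        return Avatar(
            name=f"{random.choice(pool['first'])} {random.choice(pool['last'])}",
            nationality=country,
            language=pool["language"],
            hometown=random.choice(pool["hometowns"]),
        )

    print(create_avatar("UK"))   # e.g. Avatar(name='Sophia Wilde', nationality='UK', ...)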
Hanan created a new avatar for us: a woman from the
United Kingdom. He did not like the generic British name his system initially
provided, so he hit a button and generated a new name: “Sophia Wilde – I like the
name.” Her photos, this investigation found, belong to a real woman who had no
idea her likeness was being used to demo influence operations.
AIMS does not use AI to generate its avatars’ photos: AI-generated images can be spotted by social media sites like Facebook and Twitter, which are still scrambling to address “coordinated inauthentic behavior.” Instead, AIMS gives its avatars real pictures belonging to real people, lending them further credence.
“And after you’ve created credibility, what do you do? Then you can
manipulate,” Hanan explained.
Digital footprint
A single click will breathe digital life into the fake
persona, automatically opening a Gmail account under the avatar’s name, in
their native language, and registered to their home country – all to avoid
detection.
The avatar also gets a local cellphone number, which
is used to verify the email account. “All our avatars are SMS-verified,” said
Hanan proudly.
The phone number and email, he claimed, form the basis
for an avatar’s digital identity – and they also get a unique digital
footprint: “[Sophia is using] Chrome on this version of Windows,” Hanan noted.
Additional layers are then added to the avatar:
specific accounts on specific platforms, from Instagram to WhatsApp – all of
which require either an email or phone number (or both) to register.
AIMS, a dropdown menu reveals, can purportedly automatically
create verified accounts on dozens of websites. These go well beyond Twitter
and Facebook and include the communication app Discord and interactive live streaming service Twitch, as well as websites such as Reddit, Amazon, Airbnb, and even gaming sites. It is unclear if this aspect of AIMS is actually
fully automated. However, Team Jorge claims that any site with a login can be
added to enrich an avatar’s identity and grant them more credibility.
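Purely as an illustration – this is an assumed data model, not AIMS’ real one – the layers of such an identity could be tracked in a record like this; the names, numbers, and platforms below are placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class DigitalFootprint:
        """One avatar's identity layers: anchors (email, phone), a stable browser
        fingerprint, and the platforms where accounts have been registered."""
        email: str
        phone: str          # local number, used for SMS verification
        browser: str        # e.g. "Chrome 108 / Windows 10", kept consistent
        accounts: dict = field(default_factory=dict)   # platform -> verified?

        def register(self, platform, verified=False):
            self.accounts[platform] = verified

    sophia = DigitalFootprint(
        email="sophia.wilde@example.com",
        phone="+44 7700 900123",                 # reserved UK test-range number
        browser="Chrome 108 / Windows 10",
    )
    for platform in ("Gmail", "Twitter", "Facebook", "Instagram", "Reddit", "Amazon"):
        sophia.register(platform, verified=True)
    print(sophia.accounts)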
Some even have digital wallets. “If someone looks and
checks our avatar, he will see it even has a crypto wallet ... with Bitcoin,”
Hanan boasted.
He showed us the Shannon Aiken profile again – the
avatar that purportedly sent sex toys to a politician. “She has a credit card,”
he laughed. “What do you think: is it real or not?”
Virgin avatars
Shannon Aiken is one of the “stars” of AIMS – part of
the elite unit in Team Jorge’s army of avatars. Avatars, Hanan explained, are like wine: the older, the better. Aiken’s Gmail and Facebook accounts have been
active for more than two years, he noted proudly.
Single avatars like Aiken can serve as lone agents of
chaos, helping to create a political scandal. However, others are needed to
amplify it into a story through an automated social media campaign.
AIMS’ home screen boasted over 31,000 avatars when we
first saw it last summer. During our last meeting, in December, it showed
almost 40,000.
Under each avatar’s name and photo, a list of icons
indicated the social media platforms on which they are active. Not all avatars
are created equal: There is a star system, ranking the quality of each one in
the system.
Alongside stars like Aiken, the AIMS avatars are
organized according to language and location, so they can easily be grouped and
deployed in different contexts.
AIMS’ avatar management dashboard listed the following
groups: Africa, Arabian, Armenian, Brazilian, Canada, European (mix), Kenya,
Panama, Philippines, Russia, Senegal, South America, Ukraine, USA, and Zimbabwe.
Later we shall see some from Francophone Africa.
There are also avatars active in cryptocurrencies such as Bitcoin and in NFTs – the artificially scarce digital objects created using blockchain technology. Even Aiken was found to have tweeted about crypto coins,
highlighting how the same avatars can serve different campaigns.
“These are technologies and skills that have never
been seen from the inside,” said Prof. Anat Ben-David, a digital media
researcher at the Department of Sociology, Political Science and Communications
at the Israeli Open University. (At the reporter's request, she reviewed the
footage from the Team Jorge presentations.) The disinformation researcher added
that this was the “first time we are seeing a disinformation system's interface
from the inside.”
How many avatars would you need to postpone an
election in Africa without good cause, Hanan asked rhetorically. “Do we need
10,000? No, we need about 1,000,” he responded. Half, he added, could be
“virgin” – an industry term for new avatars created just for us – and half
could be preexisting avatars, depending on “how much inventory we have [from]
other African countries.”
After selecting a group of avatars, Hanan showed us an
“automatic campaign creation” screen and demonstrated how they could now be
deployed en masse. The avatars can push out posts on Facebook, or Like or
comment on each other’s posts to amplify a message. On Twitter, they can
promote a link or try to get a certain hashtag to trend.
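We saw only the interface, not the machinery behind it, but the logic of a swarm is straightforward: a few avatars publish the link, the rest amplify it with Likes, comments, or the hashtag, on staggered timers so the activity does not look synchronized. The following hypothetical sketch only builds such a schedule as data; nothing here touches a real platform, and the avatar names, link, and hashtag are invented.
    import random

    AVATARS = [f"avatar_{i:03d}" for i in range(1, 21)]   # toy inventory of 20

    def build_campaign(link, hashtag):
        """Return a schedule of actions (not executed here): a few avatars post the
        link, the rest Like, comment, or push the hashtag, with jittered delays."""
        posters = random.sample(AVATARS, k=5)
        plan = [{"avatar": a, "action": "post", "target": link,
                 "delay_min": random.randint(0, 120)} for a in posters]
        for a in AVATARS:
            if a in posters:
                continue
            action = random.choice(["like", "comment", "hashtag"])
            target = hashtag if action == "hashtag" else random.choice(posters)
            plan.append({"avatar": a, "action": action, "target": target,
                         "delay_min": random.randint(5, 360)})
        return plan

    for step in build_campaign("https://example.com/leak", "#example")[:5]:
        print(step)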
We asked for a real-life demonstration, which Hanan
provided, leading the investigation to over 1,800 accounts on Twitter and
Facebook that were active in at least 19 campaigns in 18 countries for
political but mostly private clients. These ranged from a campaign against
California Gov. Gavin Newsom to one in favor of a South American corporation.
The content for campaigns is created by a local team
hired by the client but managed by Team Jorge, Hanan explained – a troll farm,
of sorts.
During our final meeting, in December, AIMS had not
just grown its number of avatars. It had also been upgraded to include AI
capabilities. Now, you no longer need to upload a spreadsheet with posts
written by hired hands; AIMS can automatically generate posts based on a set of
keywords. These can be “positive,” “negative” or “neutral” – thus rendering the
troll farm obsolete.
600 fake links
Another tool in Team Jorge’s arsenal is Blogger. This
helps provide the links the avatars push out during campaigns. These can range
from leaked materials to videos and are hosted on seemingly real websites – sites and blogs that Blogger itself creates.
Blogger, Hanan explained, can create “600 links for
the same news” story. That way, he said, “I don’t care” if the links get burned.
“We can create as many links [for] content [as] we want. The blogs themselves
are not important, even though they help with SEO [Search Engine Optimization] –
but we don't go there. We don't need it for that. We needed it to put them on
social media.”
We did not see a working version of the Blogger
system but did see its output: a spreadsheet with hundreds of links leading to
what seemed like the same handful of news stories and videos from an election
in Asia. A number of the campaigns found through suspected avatars posted links
to similar sites that have since been taken offline.
As a result, the campaign can come full circle: the
materials obtained through hacking and active intelligence, including forged
materials, are leaked online. Blogger creates the links and the avatars amplify
them.
Every “10 avatars, every 10 promotions, and every 10
campaigns, I change the link. It’s unlimited, basically,” Hanan said. “The link
is for a video and it can be YouTube or a website we set up. Then we usually do
a leak site – like WikiLeaks, but with a different name.”
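The arithmetic behind that rotation is simple: if the link changes every 10 avatars, every 10 promotions, and every 10 campaigns, a single story quickly needs hundreds of mirror URLs – hence the “600 links for the same news.” A toy sketch of the rotation follows; the URL pattern, domain, and batch size are ours, for illustration only.
    from itertools import count

    def link_pool(base, n):
        """Pre-generate n mirror URLs for the same story (pattern is illustrative)."""
        return [f"{base}/story?m={i}" for i in range(n)]

    def rotating_links(pool, batch_size=10):
        """Hand out the same link for each batch of promotions, then rotate, so no
        single URL gets shared widely enough to be flagged ('burned')."""
        for i in count():
            yield pool[(i // batch_size) % len(pool)]

    pool = link_pool("https://example-leaks.example", 600)
    links = rotating_links(pool)
    assignments = [next(links) for _ in range(25)]            # 25 avatar promotions
    print(assignments[0], assignments[10], assignments[20])   # link rotates every 10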
Facebook groups can also be created. “We create
credibility for this leak site and we put it on social media,” he explained.
Avatars discovered by the consortium of journalists also posted links to websites and news reports offering leaks about the businesspeople and officials who were targeted in the various campaigns.
Similar tactics described by Hanan can apparently now be seen in Kenya,
where an unidentified actor is offering leaked materials purporting to be the
“real” result of last summer’s presidential election, which Team Jorge’s likely
client lost.
“The most disturbing part is the bundle of services,”
says digital media researcher Ben-David. “For those developing these
technologies, elections are not really a democratic process but a battleground
in which any means are justified. These are simply weapons of war, and they’re
being used against democracy.”
Experts explained that, often, the goal of disinformation campaigns is not to advance a specific agenda but rather to create the conditions needed for advancing alternative narratives.
“There is a long shadow over the effectiveness of
social media disinformation campaigns,” said disinformation researcher
Grinberg. “Therefore, it may very well be that the biggest impact of disinformation
campaigns – like the ones reported here – is in creating a facade of
effectiveness that is larger than life, and pushing us closer to questioning
the authenticity of everything we see online.”
‘We’re nothing’
In 2019, the Archimedes Group – an Israeli
disinformation firm – was revealed to be operating dozens of fake Facebook accounts on
behalf of political clients in Africa, Asia, and Latin America. The connection between Israeli IP addresses and accounts elsewhere exposed the accounts as fake.
Consequently, the firm, its groups, and its workers were all banned from the platform.
Team Jorge and AIMS overcame this exposure by using
“residential proxies” – a complex system of connections that mask the location
of their activities while also providing them and their avatars with a local
identity.
Unlike Virtual Private Networks (VPNs), which are used
to change our virtual location, residential proxies use actual devices located
in the target country to avoid detection – at times rerouting traffic in creative ways. In fact, while VPNs are easily detectable, residential proxies
are not. That is why they serve as the “oil of the disinformation industry,”
says Donncha Ó Cearbhaill, an ethical hacker and researcher who worked with the
investigation.
Team Jorge, this investigation believes, works with a
residential proxy provider, in order to provide its avatars with local digital
identities in the form of real IPs that cannot be found by social media sites.
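Mechanically, routing through a residential proxy looks like using any other HTTP proxy; the difference is that the exit IP belongs to an ordinary household connection in the target country. Below is a minimal sketch using Python’s requests library – the gateway address and credentials are placeholders, not a description of Team Jorge’s actual provider.
    import requests

    # Placeholder gateway: a residential-proxy provider hands out an address and
    # credentials, then routes each session through a real household IP in the
    # requested country.
    PROXY = "http://user:password@residential-gateway.example:8000"

    session = requests.Session()
    session.proxies = {"http": PROXY, "https": PROXY}

    # To the target site, the request appears to come from an ordinary local
    # subscriber rather than from a data center or a known VPN range.
    resp = session.get("https://httpbin.org/ip", timeout=15)
    print(resp.json())   # the exit IP the website sees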
Team Jorge also seems to work with a cellular provider
that serves AIMS in a similar manner, likely giving avatars their local numbers
and supplying the infrastructure needed to grant them SMS verification.
The deliberately misleading technical infrastructure
also serves to hide such actors. The process creates another layer of plausible
deniability between, in this case, Team Jorge, the client, and the campaign’s
content, and helps them further mask the truth. Or, as Hanan put it in one
presentation: “My Signal is from Indonesia, the WhatsApp from Hong Kong,
Telegram from Germany. And none of them are my numbers.”
At that stage, Hanan still believed digital trickery
could help hide his identity. “Jorge” was still Jorge. During our last meeting,
the man who we now knew to be Tal Hanan asked if we saw what was written on the
door outside his office in Modi’in. “Nothing. It says nothing. That’s who we
are. We’re nothing.”
Tal Hanan and Zohar Hanan refused to answer questions.
Tal Hanan denied “any wrongdoing.” Zohar Hanan said: “I have been working all
my life according to the law!”