Fake story about assassination attempt on Mahmoud Abbas goes viral

A video supposedly showing an assassination attempt on Mahmoud Abbas, the president of the Palestinian Authority, has been circulating widely on X (formerly Twitter) since November 7. In reality, the video shows a police drug raid carried out that same day on a refugee camp near Ramallah in the West Bank.

If you only have a minute

  • On November 7, a number of X accounts posted a video they claimed showed an assassination attempt on Mahmoud Abbas, the president of the Palestinian Authority.
  • However, the day the video started circulating online, the spokesperson of the Palestinian security services, Talal Dweikat, said the video actually shows a drug raid carried out by the Palestinian Civil Police Force on the Jalazone refugee camp located near Ramallah. A local media outlet also reported this.
  • Our team reached out to the Palestinian authorities but has not yet received a response.

The fact-check, in detail

“WARNING: Palestinian President Mahmoud Abbas was the victim of an assassination attempt. His convoy came under fire,” reads a tweet, translated from French, posted by the X account Arab Intelligence in the middle of the afternoon on November 7. Arab Intelligence says in its bio that it is a news site for information about the Arab world.

The post, which garnered more than 700,000 views before it was taken down, also claimed that one of Abbas’s security agents was shot in the head and killed.

Hundreds of other accounts also shared this rumour – within just a few hours, the news had gone viral internationally.

This is a tweet from the Belarusian news outlet Nexta, which reported that there was an assassination attempt on the President of the Palestinian Authority, Mahmoud Abbas. Observers

A post by Belarusian news outlet Nexta featuring the video has garnered more than 1.9 million views since it was posted on November 7. The story spread quickly on X in Arabic, with some posts garnering more than two million views. Some international media outlets, like the Russian press agency Tass, also reported that Abbas’s convoy was attacked.

Most of these posts featured videos showing an exchange of gunfire between two groups in a town centre. The footage most widely shared shows bullets raining down on a group of armed men standing next to a black pick-up truck.  One of them falls to the ground, seemingly shot.

A police drug raid 

However, none of these videos show an assassination attempt on Mahmoud Abbas. The footage was filmed on November 7 during a police raid on drug traffickers in the Jalazone refugee camp located near Ramallah in the West Bank.

The first posts on X about the drug raids appeared around 11am Paris time on November 7 (here’s one example). That means they were shared online before the false rumour about the assassination started to circulate. A local media outlet in Ramallah, Khabar24, also shared this video on Facebook and X before 12pm Paris time.

Khabar24 said in its posts that a captain in the security forces of the Palestinian Authority was injured by shots fired by a criminal gang in the Jalazone camp during an attempt to arrest a drug trafficker.

This information aligns with the statement posted on Facebook a few hours later by the spokesperson for the Palestinian security forces, Talal Dweikat.

“Six members of the Palestinian security forces were injured, including one seriously, during a raid for a person wanted in drug cases,” Dweikat said in the statement, translated from Arabic.

Our team was able to geolocate the specific site where the police raid took place by analysing several different videos posted on X (like this one and this one) of the incident filmed from different angles.

A stone building (outlined in dark green in the image below) appears in two different videos of the incident, filmed at different angles. We were able to locate this building on Google Maps thanks to its distinctive vertical balconies.

In the first video, filmed from the location marked with a red star (here), you can see a white roof that also appears in the satellite image (marked in light blue). In the second video, filmed from the location marked with a blue star, you can see a roof made of orange tiles (marked in red), a uniquely shaped white building (marked in light green) and a minaret that also appears in the background of the video below (in purple).

In the background is Jalazone as seen on Google Maps. At the right are two screengrabs of videos of the drug raid. The first video (above right) was filmed from the location marked with a red star. The second video was filmed from the location marked with a blue star. In the videos, you can see the distinctive balconies on the main building (marked in dark green), a roof of orange tiles (marked in red), the white roof (marked in light blue) and a minaret that appears in the background (marked in purple). Observers

Our team reached out to the Palestinian Authority but has not yet heard back.

A document with unknown origins

Some accounts on X went further than just sharing rumours about the assassination attempt on Abbas – they also claimed to know who had carried out the attack. French-Algerian journalist Mohamed Sifaoui, along with others, claimed that this (fake) assassination attempt was the work of a Palestinian group known as the Sons of Abu Jandal.

This Palestinian group was unknown up until this point. It claims to be made up of members of the Palestinian Authority’s security forces who have links to Fatah.

In a statement in Arabic dated November 5, this group delivered an ultimatum to Abbas (using his nickname Abou Mazen). Observers

This document says that if the president of the Palestinian Authority didn’t “take a clear position declaring an open confrontation with the [Israeli] occupation”, the group would consider rebelling.

While many questions remain about this document and its authors – including its veracity – that hasn’t stopped some accounts from claiming that this group was behind the (fake) assassination attempt.

Mahmoud Abbas, a president weakened by the conflict in Gaza

As Israel’s offensive in Gaza in response to the October 7 Hamas attack continues, the president of the Palestinian Authority, based in the West Bank, has come under increasing criticism from Palestinians who say he has not taken a hard enough line against Israel.

However, Abbas’s popularity was already low before the war. An opinion poll published in September by the Palestinian Center for Policy and Survey Research (PSR) found that 78% of Palestinians were in favour of 88-year-old Abbas resigning.

Abbas has been in power since 2005. However, since Hamas took control of Gaza in 2007, the Palestinian Authority has governed only parts of the West Bank.




No, this video doesn’t show a Palestinian pretending to be injured in the Israel-Hamas war

Some pro-Israeli social media users are claiming that Palestinians are posting videos pretending to be injured by the bombing campaign carried out by Israel since October 7. On October 26, for example, pro-Israeli accounts started sharing claims that a Palestinian video maker had pretended to be injured. Their so-called proof was two videos – one said to show the man in a hospital bed and the next one, apparently filmed the next day, showing him in perfect health. It turns out, however, that the video of the injured man is from August 2023, months before the war broke out. Moreover, it shows another man – and a real victim of the ongoing conflict.


If you only have a minute

  • Pro-Israel social media accounts have accused a Palestinian video maker, Saleh al-Jafarawi, of being an “actor” paid by Hamas. They have been circulating a video montage that they say shows al-Jafarawi lying injured in a hospital bed on October 25, and then a video supposedly posted the next day where he is walking through the streets of Gaza in good health.
  • The video showing a man in a hospital bed is actually from August 2023, and shows a different person, as demonstrated by several videos shared on TikTok and an article published by an NGO.
  • The state of Israel’s X account posted this misleading montage, though it later deleted the post.  However, the video is still circulating online and some posts featuring the video have garnered millions of views.

The fact check, in detail

This type of accusation is not new. For the past few years, some Israeli websites have been accusing Gaza residents of staging scenes to make themselves look like victims – a practice these sites have dubbed “Pallywood” (a blend of “Palestine” and “Hollywood”). This theory has reemerged online in earnest since the war between Hamas and Israel broke out in early October.

Over the past few days, many pro-Israel social media users have been spreading claims that a Gazan video maker named Saleh al-Jafarawi is an actor working for Hamas and that he has posted videos pretending to be injured by the Israeli bombardments. They’ve used the hashtag #Pallywood alongside these claims.

Al-Jafarawi has indeed been posting videos every day on Instagram to document what has been happening in Gaza since the start of the war. However, these pro-Israel accounts claim that he shared a video of himself in a hospital bed, only to post videos of himself in perfect health, walking the streets of Gaza, the very next day.

A fake news item shared by Israel’s official social media account… then deleted 

“He’s revived!”, reads the caption of this tweet featuring the video posted on October 26 by one anti-Palestinian account. The tweet has since been viewed two million times. The official X account of the state of Israel also shared the fake story about al-Jafarawi the same day in two separate tweets… which it deleted a few hours later.

In these two posts, the state of Israel went into detail in its claims that Saleh al-Jafarawi was an actor and that the hospital scene was staged, explaining, for example, that “most of the machines are disconnected and the ones which are have fake stats”.

These are screengrabs of tweets posted by the official account of the state of Israel on October 26 and then deleted a few hours later. Observers

However, on Friday, October 27, the video was still being circulated by pro-Israel accounts.

One high-profile figure who shared the video montage was Hananya Naftali, a former member of Prime Minister Benjamin Netanyahu’s communications team. Naftali has 385,000 followers on X.

“I don’t watch Netflix because Pallywood propaganda is the actual comedy,” he wrote in his post, which garnered more than one million views.

A video from August 2023 of a young Palestinian with an amputation 

However, al-Jafarawi isn’t the person in the hospital bed video. This video was actually filmed in August 2023, before the conflict began on October 7. It shows a young Palestinian hospitalised after losing his leg.

If you carry out a reverse image search, then you’ll pull up the original video posted on a TikTok account on August 18, 2023.
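As an illustration of how such a check can start, here is a minimal sketch in Python using OpenCV: since reverse image search engines work on still images, a few frames are first extracted from the clip and saved as files that can then be uploaded to a search engine. The file name “viral_clip.mp4” is a hypothetical placeholder, not a file from this investigation.

```python
# Minimal sketch: sample a handful of evenly spaced frames from a video
# so they can be submitted to a reverse image search engine.
import cv2

cap = cv2.VideoCapture("viral_clip.mp4")  # hypothetical file name
total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))

for i, index in enumerate(range(0, total, max(total // 5, 1))):
    cap.set(cv2.CAP_PROP_POS_FRAMES, index)   # jump to the chosen frame
    ok, frame = cap.read()
    if ok:
        cv2.imwrite(f"frame_{i}.jpg", frame)  # save as a searchable image

cap.release()
```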

The account in question shared a number of videos of this bedridden teenager with an amputated leg. One of the posts has a link to a TikTok account belonging to a close friend of the injured young man who gives updates on his friend’s treatment and condition.

A video posted by this account on August 2 shows the same young man in a bed and room identical to those shown in the video posted on August 18.

On the left is the original video posted on August 18 on TikTok that has been used out of context in recent days. On the right is a screengrab of another video published by an account belonging to a friend of the young Palestinian man on August 2. We put the image in vertical format for an easier video comparison. Observers

An image of the same hospital room appears in this article published by the pro-Palestinian NGO International Solidarity Movement on August 25, 2023 – a story that was also picked up by the Indian fact-checking outlet AltNews.

The article explains that the young man, named Mohammed Zendiq, lost his leg after an attack by the Israeli forces on a refugee camp on July 24.

This is a screengrab of an article that the NGO the International Solidarity Movement posted on its site on August 25, 2023. Observers

Saleh al-Jafarawi, a video maker in Gaza

The video of the young man in hospital has nothing to do with Saleh al-Jafarawi.

Al-Jafarawi, an amateur singer and video maker from Gaza with 1.6 million followers on Instagram, has been sharing images of how the war is playing out in Gaza. He has filmed some recent videos in hospitals, where he stands among victims, documenting the brutal consequences of the Israeli bombings on Gazan civilians.

Al-Jafarawi has come under criticism in recent days, especially from pro-Israel accounts that claim he has been supporting Hamas in some of his posts.

‘Pallywood’: recurrent fake news items about events being staged

Al-Jafarawi is not the only person to be falsely accused of staging scenes of Palestinian suffering. A number of fake news items in this genre have been circulating since the start of the war.

The term “Pallywood” was coined in 2005 by the American historian Richard Landes, who teaches in Israel. He used it to describe what he believes is a phenomenon of Palestinians staging scenes of suffering that they hope will be picked up online and by the media to undercut Israeli policy.

However, many of the claims around Pallywood that have been circulating in recent days have been proven false.

The health ministry in Hamas-run Gaza said 8,796 people have been killed in three weeks of Israeli bombardments on the enclave, two-thirds of them women and children.




Was this photo of a dead Israeli baby AI-generated? When AI-detection errors muddle public debate

On October 12, the official account of the state of Israel posted an image of a tiny, charred body, claiming that the image showed a baby killed by Hamas during the attacks carried out on October 7. In the hours after the image was posted, social media users began to comment, saying that the image had been generated by artificial intelligence, according to the AI detection tool AI or Not. However, several experts – as well as the company behind AI or Not – have said these findings were wrong and that the photo is likely real.

If you only have a minute

  • A number of social media accounts, most of them either American or expressly pro-Palestinian, have said on X (formerly Twitter) that the photo of the burned body of a child shared by the state of Israel was generated by AI, based on the results of an AI detection tool called AI or Not.

  • However, AI or Not actually said that the result was a false positive. Several specialists in image analysis agreed, saying that the photo was not AI-generated.

  • A number of people claimed that the image of the charred body had been generated using a photo of a puppy. However, when we talked to a specialist in image analysis, he said the photo of the dog was actually the doctored image.

The fact check, in detail

On October 12, Israeli Prime Minister Benjamin Netanyahu published [warning: disturbing images] photos of the burned bodies of children in mortuary sacks on his X account (formerly Twitter). In the caption, he said that the photos showed “babies murdered and burned by the Hamas monsters” during their attack on October 7. The photos were picked up and reposted by the X account of the state of Israel a few hours later.

However, many American and pro-Palestinian social media users accused the country of having generated one of the images using artificial intelligence.

A number of tweets, including one viewed more than 22 million times, denounced the images, claiming that they had been “created” by Israel, based on the results of the artificial intelligence detector “AI or Not”. These tweets featured a screengrab of the results page, where the tool indicated that the image had been “generated by AI”.

The result was even picked up by the X account of Al Jazeera in Arabic. On October 14, the Qatari media outlet published a video on the topic, which garnered more than 500,000 views.

“These images, according to [Israel], reflect the “brutality of Hamas”… Artificial intelligence has revealed the falsity of the Israeli accusations against members of Al-Qassam [the armed branch of Hamas],” Al Jazeera wrote.

This is a tweet from Al Jazeera in Arabic about the accusations that an image of a charred body was actually generated by AI. It includes the screengrab of the results page of the tool AI or Not. Observers

In these same messages, users also accused Israel of generating this image using a photo of a live puppy in a mortuary sack – one that looks the same as one in the picture of the child’s body.

This photo of a puppy, which some people have said is the original photo that was subsequently doctored, circulated widely, especially on 4chan, a site frequented by the American far-right, starting the evening of October 12.

A number of social media users claimed that this image of a puppy, shared on the 4chan channel, was the origin of the photo shared by Israel. Observers

A false positive for the tool AI or Not

In reality, there are a few clues that the image posted by the Israeli government was not generated by artificial intelligence.

The company that created “AI or Not” actually cast doubt on the results of its own software. In a tweet from October 14, the company said that its software was capable of false positives, meaning that it could falsely conclude that a real photo was generated by AI, especially when the image in question was low quality.

“We have confirmed with our system that the result is inconclusive because of the fact that the photograph was compressed and altered to blur out the name tag,” the company said on X, referring to the tag next to the left hand of the body. “If you run the photo through the software now, it actually indicates that it is ‘likely human’.”

Our team confirmed these results on October 16.

This is a screengrab of the results page when our team ran the photo through AI or Not on October 16. Now, the results page says that the image is “likely human.” Our team added the gray circle to mask the body. Observers

The team at the investigative outlet Bellingcat, which specialises in image analysis, tested out the software back in September 2023.

“The fact that AI or Not had a high error rate when it was identifying compressed AI images, particularly photorealistic images, considerably reduces its utility for open-source researchers,” Bellingcat concluded.
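The compression problem described above is easy to reproduce. The following minimal sketch, using the Pillow library in Python, shows how aggressively recompressing an image strips out fine detail of the kind detection tools analyse; “input.jpg” is a hypothetical placeholder, not the image discussed in this article.

```python
# Minimal sketch: recompress an image at low JPEG quality. Heavy
# compression discards high-frequency detail, one reason AI detectors
# can return false positives on real photos.
import os
from PIL import Image

img = Image.open("input.jpg")  # hypothetical file name
img.save("recompressed.jpg", "JPEG", quality=20)  # aggressive compression

# The file size drops sharply along with the detail.
print(os.path.getsize("input.jpg"), os.path.getsize("recompressed.jpg"))
```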

‘There is no proof that the image shared by the Israeli government was altered’

Moreover, the photo itself doesn’t show signs of being generated by AI, Hany Farid, a specialist in image analysis and a professor at Berkeley, explained to the media outlet 404.

Farid pointed out the accurate shadows and the structural consistencies in the photo.

“That leads me to believe that [the photo] is not even partially AI-generated,” he said.

The same sentiment was expressed by Denis Teyssou, the head of AFP’s Medialab and the innovation manager of a project focused on detecting AI-generated images, vera.ai.

“There are no signs that the image shared by the Israeli government has been doctored,” he told our team on October 16.

He added that the software designed by vera.ai to detect AI-generated images didn’t identify that the image had been doctored – while also specifying the limits of this kind of software.

“The big risk with AI detection tools is if they produce false positives. If there is a false positive, we can no longer trust the tool,” he said.

A ‘doctored’ image of a puppy

When the photo of the body was run through InVID-WeVerify, a tool created by the AFP Medialab to detect AI-generated images, it reached the same conclusion as vera.ai: the photo had not been doctored.

However, the tool did pick up inconsistencies in the image of the puppy.

“It’s likely that this image was falsified using generative methods,” said Tina Nikoukhah, a mathematics researcher at Paris-Saclay University, during an interview with our team on October 16.

It “detected significant traces in the background of the image that didn’t appear on the puppy,” she said. In the image below, you can see these differences marked in colour – dark green on the puppy and light green on the rest of the image.

The photo of the puppy is on the left. On the right is the same photo with the ZERO filter applied by the software InVID-WeVerify. The filter “detected significant traces in the background of the image that didn’t appear on the puppy,” said Tina Nikoukhah. This is demonstrated by the dark green pixels in the centre of the image. Observers

“Considering the nature of these traces, it’s likely that the falsification was made using AI-generation,” she added, referring to software like Midjourney or Stable Diffusion.

These results line up with claims made by an X user, who said that he had created the puppy image.

In a tweet published on October 12, a few hours before the image was shared on 4chan, he said that it took him “five seconds” to create this image from the image shared by Israel.

“Not hard to make a convincing fake photo anymore,” he wrote in his tweet, which has since been deleted. In another tweet, the same user said that he had used the AI-image generator Stable Diffusion. He referred on multiple occasions to his AI-generated image in other tweets.

Photos of burned children shared by Israel without context

Even if the images are real, Israel shared them without any context.

On October 10, Israeli channel i24 News and the Israeli government were accused of having announced, without proof, that 40 babies were decapitated by Hamas in Kfar Aza.

On October 11, US President Joe Biden also said that Israeli children had been “decapitated”. However, the same evening, the White House said that the American president had gotten this information from the Israeli services and didn’t have any additional proof.

The next day, the Israeli government shared the image of the charred remains of children, saying: “Those who deny these events are supporting the barbaric animals who are responsible for them.”

They did not, however, give any context for the images or the circumstances of the death of these children.




Iranian ‘hack’ targets citizens who send videos to foreign broadcasters

Hardline media outlets in Iran claim the country’s security forces hacked the Telegram channel of Iran International, a Persian-language broadcaster that has extensively covered the year-old “Woman Life Freedom” protests. The outlets claim the regime intercepted messages in which Iranian citizens sent amateur images related to the protests to the UK-based broadcaster for publication. The channel denies it was hacked, and a FRANCE 24 review of the supposedly intercepted messages found no evidence that any of the amateur content was ever broadcast by Iran International.

With a news blackout in place in Iran on the protests that followed the death of Mahsa Amini last September, many Iranians have turned to Persian-language media broadcasting from overseas. With independent media barred from working in Iran, such channels rely heavily on amateur images published on social media or sent in by Iranian citizens. Videos filmed by citizens and sent to these media outlets outside Iran have become the main source for many Iranians of independent information about what is happening inside their country.

In what appears to be an attempt to discourage these ties, media affiliated with Iran’s hardline Revolutionary Guard Corps (IRGC) have targeted Iran International, publishing what they say are messages in which Iranian citizens sent amateur videos for publication by the UK-based channel. Launched in the UK in 2017, the channel, which reportedly receives funding from Saudi sources, is one of the favourite destinations for amateur videos shot inside Iran. Iranian authorities have branded it a “terrorist organisation”.



Iran International denies the hacking. “I can state categorically that our Telegram account has not been hacked, or compromised in any way. It never has been. Such claims from the IRGC or its associates are false and are designed to frighten and intimidate people,” spokesperson Adam Baillie told FRANCE 24. “We are characterised by the Iranian authorities as a terrorist channel, which provides quasi-legal cover for threats against our staff and the harassment, often brutal, of their families in Iran.”

The designation of Iran International as a terrorist organisation means that Iranians accused of sending information to the channel could face severe penalties in Iranian courts.

A Fars News Agency alert about contacting Iran International television: “Alert to people who cooperate with enemy media”. © Observers

Alert to people who cooperate with enemy media

Media affiliated with the IRGC, including the Fars News Agency, have published at least six online videos saying an unspecified “group of hackers” intercepted messages sent to Iran International. The videos, posted since mid-September, feature amateur images supposedly sent to the UK-based channel via Telegram, along with screenshots of the senders’ messages and usernames with the account name blurred. The amateur images show protests and other anti-regime initiatives such as strikes by shopkeepers.

One video, published on Telegram on September 15, showed screenshots of messages sent by a user named “Milad” in which he sent a video of an anti-regime protest along with this caption: “Aryashahr (a neighbourhood in Tehran), 17th or 18th Aban (November 8 or 9, 2022). Regime agents savagely beat up a young man.” FRANCE 24 was unable to confirm the sender’s identity or the context of the video, but Iranian web users suggested the claims of a hack were fabricated.


In a video published on X, formerly Twitter, on September 19, demonstrators chant: “The mullahs must go”.

Fars News Agency’s claim is BS

Iranian web users have been skeptical about the claims of a hack. “As someone who has sent many photos and videos [to Iran International], I can confirm Fars News Agency’s claim is BS,” said one tweet posted on September 20.


“If they had hacked the channel, they would have shown off about it by announcing they had hacked it and changing the profile picture,” another user wrote, referring to a common practice when the Iranian security forces hack into anti-regime accounts.


A third user wrote: “Hacking? That’s a joke! The IRGC fanboys can’t do anything more complex than basic HTML coding.”


Hacking Telegram is very difficult

Amin Sabeti is an Iranian cybersecurity expert based in London. He closely follows the activities of hackers close to the Islamic republic’s regime.

“In general, hacking the servers of a messaging app like Telegram is a very difficult task, not just for Iranians, but for any hacker in the world. The screenshots of the user messages supposedly sent to Iran International’s Telegram account are in a format that would only be visible to the owner of the Iran International Telegram account. I closely follow hackers working for the Iranian regime and I have never seen any indication that they are capable of directly hacking Telegram’s servers to access any account.

All the Iranian hackers have done so far is to trap the “end user”, using various techniques like phishing. For example, they send emails to account holders pretending to be from the Telegram company saying that someone is trying to hack your account or change your password.

There are two sides to the question of the safety of Iranians who turn to foreign media such as the BBC or Iran International. Concerning the news organisations, I know that the security measures of these media outlets are really good. They are up-to-date in keeping their accounts secure. That is why we have never had such a case so far.

The only possible problem, however, could be the Iranians who contact these news organisations, because they too need to protect their accounts. They need to update their apps and software, and make sure they do not have malware on their phones. And once they have sent their messages, they need to delete them themselves.”

No trace of the videos on Iran International accounts 

FRANCE 24 analysed the six video reports published by Fars and other IRGC-affiliated Telegram accounts. The IRGC reports featured more than 30 amateur videos supposedly sent to Iran International. The FRANCE 24 team then searched for other publications of the videos on social media, including archives of Iran International’s Telegram, X (formerly Twitter) and Instagram accounts over the last 12 months.

Of the around 30 videos supposedly sent to Iran International by Iranian citizens:

  • None were published on Iran International’s social media accounts, including Telegram, X and Instagram.
  • Reverse image searches found no publication of the videos on other social media accounts.
  • In at least one case, the video could not have been recorded on the date claimed, because the location does not appear as it did during the 2022 protests.

Video supposedly filmed in November 2022 was filmed in 2023

One video, published by Fars News on September 20, featured messages supposedly sent to Iran International in November 2022 by a Telegram user called “Nilo0o”. The supposed user sent a video showing closed businesses on a street with a caption saying: “General strike by the population in Rasht on 17 November 2022.”

The video was filmed in the Golsar neighbourhood in the city of Rasht. It shows a bank, Melal Credit Institution, on Golsar Street between alleys 92 and 96, in a complex called the Blanca Palace.

The video shows a bank, Melal Credit Institution, on Golsar Street in a complex called the Blanca Palace. © Observers

But other information indicates that the Golsar branch of the bank moved to that location in 2023. A video of Golsar Street filmed in January 2023 shows the same location vacant, with a banner giving contact information for the complex.

This photo shows the same location vacant, with a banner giving contact information for the complex. © Observers

Yellow Pages listings indicate that Melal Credit had a branch at a different location on Golsar Street, 500 metres away near alley 109.

This photo shows that Melal Credit had a branch at a different location on Golsar Street, 500 metres away near alley 109. © Observers

A posting by a business at that location in February 2023 said: “I am the new owner at alley 109, pls Bank update your contact info!”

A posting by a business at that location in February 2023 said: “I am the new owner at alley 109, pls Bank update your contact info!” © Observers

The video supposedly intercepted by hackers could not have been filmed in November 2022.

If the regime succeeds in cutting this line, we will have a total information freeze

Bahram [not his real name] is an Iranian journalist who has been arrested or interrogated multiple times in recent years over his reporting on current affairs in Iran. He says that with widespread censorship in Iran, many Iranians turn to overseas broadcasters like Iran International for reliable news.

Iranians now record everything with their mobile phones: strikes, protests, police violence … and send the videos to organisations that will publish them. The amateur videos people send to overseas broadcasters are our only source of information. If the regime succeeds in cutting this line of communication, we will have a total information freeze in our country. We will not know what is going on: we’ll know absolutely nothing.

The regime has done its best to drive us into such a blackout. They have blocked social media, but people use VPN proxy servers to get access.

They have tried to discredit these media or activists through propaganda smear campaigns. Now the latest attempt is to scare people. They’re saying: “If you send them something, we will find you, so don’t send them anything.” However, I am not sure it will ultimately benefit the regime. Maybe in the short term people will hesitate for a few days to send videos to this or that media or activist, but in the long term I think nothing will change. You will not give up your water source, no matter how tiny it is, in a desert.




A flood of misinformation about migrants in Lampedusa

Thousands of people, predominantly from sub-Saharan Africa, have recently arrived on the small Italian island of Lampedusa, reigniting the debate over how the EU and European states should handle illegal immigration. In this context, people online have been sharing three deceptive videos intended to disparage migrants arriving in Italy.


If you only have a minute 

  • One video shared on X (formerly Twitter) claims to show a fight among migrants in Lampedusa. However, a reverse image search reveals that the video dates to 2021. It shows a fight outside a club, nowhere near Lampedusa.

  • Some people have also shared a video showing migrants dancing with NGO staff, claiming the scene took place this weekend in Lampedusa. However, the video was taken in August, in the UK.

  • Finally, one video claims that migrants who made their way into Europe through Lampedusa had started skirmishes in Stuttgart, Germany. The incident did indeed take place last weekend, but there’s no indication that it involved migrants.

The fact-check, in detail

On September 14, around 7,000 migrants landed on the Italian island of Lampedusa in the span of just 48 hours. So far in 2023, nearly 126,000 migrants have arrived in Italy – twice as many as last year.

Against this backdrop, a number of videos have been shared on social media networks targeting migrants.

This fight between ‘migrants’ dates back to 2021 – and isn’t in Lampedusa

“The migrants in Lampedusa, Italy are getting restless,” reads the caption on this video shared on X on September 18. The video shows a group of people in the midst of a violent fight. A group of young men are seen beating another man, who appears to be taking cover behind a policeman before being chased away by the group.

The video had garnered more than 169,000 views on X before it was deleted.

September 18 post on X claiming to show a fight between migrants in Lampedusa. © X / @WallStreetSilv

A simple reverse image search (click here to find out how) reveals that the original video was published on August 10, 2021 by Rossini TV, a regional channel based in Pesaro, central Italy.
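Once a candidate original has been found, the match can also be double-checked programmatically. Here is a minimal sketch in Python using the Pillow and ImageHash libraries, comparing a frame from the viral clip with a frame from the candidate original by perceptual hashing; both file names are hypothetical placeholders.

```python
# Minimal sketch: compare two frames with a perceptual hash.
# Near-duplicate images produce hashes separated by a small
# Hamming distance.
from PIL import Image
import imagehash

viral = imagehash.phash(Image.open("viral_frame.jpg"))          # hypothetical
candidate = imagehash.phash(Image.open("rossini_tv_frame.jpg")) # hypothetical

distance = viral - candidate  # Hamming distance between 64-bit hashes
print(f"hash distance: {distance}")  # roughly under 10 suggests a match
```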


The title of the report states that the video shows a brawl in Marotta, a village near Pesaro.

We searched for details about the incident and found that several local newspapers reported on a brawl outside a Marotta club on August 7, 2021. During the fight, which started inside the club, a Senegalese man was stabbed in the abdomen. Two Italian police officers were also injured while trying to intervene. Four people were arrested, including two Albanians, a Dominican, and a Senegalese person.

The video was published two years ago, and has nothing to do with the current influx of migrants to Lampedusa.

These migrants filmed dancing with volunteers and members of NGOs were not in Lampedusa

With over 3 million combined views on X, a video posted on several accounts claims to show migrants taking selfies and dancing with NGO volunteers just after arriving on the island of Lampedusa.

Screenshot on X, September 16, showing migrants dancing with members of an NGO, allegedly in Lampedusa according to the post’s caption. © X / @stillgray

There are several indications that the scene did not take place in Lampedusa. Firstly, as the person films himself with the dancing NGO members, you can see a red and white logo on an employee’s jacket: it identifies the NGO Care4Calais.

On its website, the organisation explains that its volunteers work with refugees in the UK, France and Belgium. Members of Care4Calais are not currently in Lampedusa.

Going further with a reverse image search, you can find an earlier post featuring the same video. On August 25, 2023, @BFirstParty, the X account of the British political party Britain First, had already published it, accusing Care4Calais in the caption of being a “traitorous” NGO for dancing with refugees at the UK border.

Screenshot taken on August 25 on X, showing the reaction of the British political party Britain First after members of the NGO Care4Calais danced with refugees. © X / @BFirstParty

We contacted Care4Calais, who confirmed that this video does indeed show some of its volunteers dancing with refugees. They also confirmed that the video was not taken this month. The organisation added: “There is no context to the video. As you will be aware, Care4Calais delivers humanitarian aid to refugees in northern France. Whilst distributing that aid, our volunteers interact with refugees with kindness and compassion, often sitting down to share stories (some, as you can imagine, are very harrowing) and in this video they are enjoying a dance.”

Therefore, this video was not taken in Lampedusa, and has nothing to do with the current migrant arrivals on the Italian island.

Clashes don’t involve migrants who arrived via Lampedusa

After the arrival of migrants on the island of Lampedusa, this video was posted on X to denounce the impact of welcoming them to Europe. In a caption, the @Linfo24_7 account claims that the people behind the violent clashes in Stuttgart on Saturday were “illegal immigrants from Lampedusa”.

Screenshot from September 17 of an X post claiming that migrants from Lampedusa have sparked clashes in Germany. © X / @Linfo24_7

A reverse image search reveals that the scene was filmed in Stuttgart on September 16. The violence in Germany followed an Eritrean cultural festival organised by groups close to the Eritrean president, as confirmed by Africa News.

During the day, opponents of the government came to protest against the festival, triggering scuffles between pro- and anti-government Eritrean activists. People close to the opposition were accused of assaulting the police as they intervened to stop the conflict.


There’s no indication that migrants who had just arrived in Lampedusa had travelled to Stuttgart to start riots, or that those involved had arrived via Lampedusa illegally. Furthermore, an article by Sud Ouest explains that, as early as July, a similar conflict had broken out between Eritreans north of Frankfurt.


No, this video doesn’t show Russian ballistic missiles in Niger

A video reported to show trucks transporting Russian ballistic missiles in Niger has been widely circulating amongst West African Facebook and TikTok users since August 11. It turns out, however, that the video was filmed in the Republic of the Congo, not Niger, and shows trucks transporting storage tanks.


If you only have a minute

  • A number of West African social media accounts have been sharing a video showing two trucks carrying large cylinders. The accounts claim that the cylinders are a type of Russian ballistic missile called “Satan 2”.
  • One of these videos has already garnered more than four million views on TikTok.
  • However, in reality, the cylinders are not missiles – they are storage containers, likely for transporting oil.
  • Finally, the video was filmed in Congo, not Niger.

The fact check, in detail

The video, first posted on TikTok on August 11, shows two trucks carrying enormous cylinders with red stars on them. The trucks are driving past a number of buildings.

The audio – the sound of women crying and screaming – seems to have been added to the footage. First posted by an Ivorian TikTok account, the video has since garnered four million views.

Text on the video in French reads: “Delivery of ballistic missiles to Niger, Satan 2 [Editor’s note: a type of extremely powerful intercontinental Russian missile] in Niger”.

The TikTok user who published this video on August 11 claims that it shows Russian “Satan 2” missiles being deployed in Niger. © TikTok

The video was picked up and shared by a Facebook account that often comments on news in West Africa. The account also seems to be in favour of the military coup that took place in Niger. Posted on August 11, the video has since been shared a thousand times.

On August 11, the same video was picked up by a Facebook account. © Facebook

In the comments section, many people said the footage was likely fake.

Many people who commented on the footage shared on TikTok on August 11 said that they thought the trucks were likely carrying water tanks (in French, citernes). © TikTok

No, this video doesn’t show ‘Satan 2’ missiles

While the red star on the tanks may look like a Russian symbol, we know that “Satan 2” missiles are not transported on the back of trucks like the ones shown in the video. The public got a glimpse of how Russia transports these missiles during tests carried out at the Russian Plesetsk Cosmodrome back in 2018.

This video shows “Satan 2” missiles being transported to the Plesetsk Cosmodrome in 2018. Russian Ministry of Defence

In this photo, you can see that the “Satan 2” missile, also known as “RS-28 Sarmat”, is usually transported using a specialised vehicle. These same vehicles were on display during a military parade that took place on May 9, 2022.


This BBC report on the Russian military parade that took place on May 9, 2022, commemorating victory over the Nazis shows the vehicles used to transport Russian ballistic missiles like the “RS-28 Sarmat” (footage begins at 2:17).

These vehicles don’t look anything like a semi-truck. Moreover, the exterior of the missile – which is khaki green and includes several metallic components – looks nothing like the objects in the video.

In the first image (at left), there is no sign of the metal equipment visible on the actual missiles (at right). Nor is there any sign of the warhead shape or the khaki green protective covering. © Observers

Moreover, even when its khaki protective covering is removed, the missile doesn’t look like what is being transported on the trucks in this video. You can see the missile, without protection, in this Bloomberg video that shows a test of the “RS-28 Sarmat” that took place in April 2022.


With its dark warhead end and white body, the “Satan 2” missile doesn’t look like the cylinder being transported by a semi-truck in the video that has been widely circulating.

But if this video doesn’t show a “Satan 2” missile, then what does it show?

Typical features of fuel storage tanks

At the top of the cylindrical objects being transported by the trucks, there are two rounded protuberances.

In this screengrab, taken from the video, you can see two protuberances on either side of the cylinders. © TikTok

These are openings that allow for the liquid stored in the tanks to be pumped out. They are typical of liquid storage tanks that will be buried.

There are a number of different storage tanks used for storing oil. You can see that they are very similar to the cylinders that appear in the video: a cigar shape with two openings at the top. © Observers

The tanks, typically made of fibreglass or metal, are often used to store oil in liquid form.

The shape of the last truck, which you can see in the upper section of the video, in the background, looks like the kind of truck used to transport oil, like the one shown in the photo below. © Observers

An employee at Sanergrid, a company that specialises in manufacturing this type of storage tank, shared these images with our team.

The expert said that the video likely showed a subterranean containment pit or another type of oil storage container.  These containment pits are often used to hold pollutive liquids in case there is a spill from an electrical transformer.

Video taken in Pointe-Noire, in the Republic of the Congo

Even though the video doesn’t show much of the location where it was filmed, there is enough to figure it out. At one point in the footage, you can see a blue-green wall in the background. Black letters on the wall spell out “Betsaleel”. After that comes what looks like the beginning of the French word “maternelle”, which could indicate a primary school (called an école maternelle in French).

During two short moments in the video, you can read the words on the blue-green wall. First, you can see the name “Betsaleel”, followed by “matern…”, which seems like the start to the French word “maternelle”, which might indicate a primary school (école maternelle). You can also see the words “anglais” (English) and “complet” (full). © Observers

In the comments section, a number of people say that the video was filmed in Pointe-Noire in the Republic of the Congo. We did a Google search for “betsaleel” and “Pointe-Noire” and pulled up information on the “Complex School Betsaleel College Primary Maternal” in Pointe-Noire, which offers primary through secondary education.

We took a look at the street where the school is located on Google Street View. When we compared it to the video, it turned out to be the same place.

In these images, available on Google Street View, you’ll recognize the word “Betsaleel” from the video, as well as the blue-green wall. The square black light (here outlined in yellow) also appears in both images, helping us to identify that they were filmed in the same location. © Observers

This screengrab, also taken from Google Maps, shows the buildings across the street from Betsaleel school, including a modern-looking building, a series of columns (in pink) and a brown kiosk (in blue). © Observers

Our team contacted Betsaleel School. They confirmed that the video did indeed show the outer wall of their establishment. Thus, we can say with confidence that the video was not filmed in Niger, but in the port city of Pointe-Noire in the Republic of the Congo.

In conclusion, this video doesn’t show “Satan 2” missiles in Niger. It actually shows storage tanks used for storing liquid fuel in Pointe-Noire, in Congo Brazzaville. Moreover, the “Satan 2” missiles are still in a testing phase: to our knowledge, they have not been deployed abroad.


Watch out for these images fuelling a conspiracy theory about the Hawaii wildfires

In the wake of the fires that tore across the Hawaiian island of Maui on August 8, a number of images have been circulating on social media. The unrelated videos have been fuelling a conspiracy theory, born in the 2000s, that says wildfires are caused by laser weapons known as “directed energy weapons”.


If you only have a minute

  • On August 8, devastating fires broke out on the Hawaiian island of Maui, ravaging the major city of Lahaina.
  • Since then, several videos purporting to show the island before, during and after the fires have been posted on Facebook and elsewhere. They are all unrelated to this tragedy.
  • What all these images have in common is that they fuel a conspiracy theory that the Maui fires were caused by “directed energy weapons”.

The fact-check, in detail

A video with more than 100,000 combined views on X (formerly Twitter) shows a huge blast of light that seems to travel some distance, resulting in smoke and fire.

It was first posted on August 13, with the caption, “Maui was attacked by directed energy weapons (dews)”. The video was reposted the next day by an account that claimed, “What happened in Maui was more than just wildfires … It appears directed energy weapons may have been used and possibly why there was such a sudden and tragic loss of life!!”

A video showing a blast of light was shared on Twitter on August 13, 2023. © Twitter

According to these users, the tragic fires that ravaged Maui were in fact set intentionally, by a laser weapon. Directed energy weapons are a very real type of weapon, using a laser beam or microwaves. They can perforate, damage or disrupt an object’s electronic systems from a distance. But these systems are mainly designed for defence against drones and high-speed missiles. There is no evidence that such weapons have ever been used to cause fires.

The cause of the Maui fires, which have claimed more than 100 lives since August 8, remains unknown for the time being.

The viral video is blurry, making it hard to see exactly what is happening, or where it might have occurred.

In fact, a higher-definition version of this video exists. It was posted on YouTube in December 2018 by local television channel WWL-TV, which serves New Orleans, Louisiana. The caption on the video says it was a cellphone video taken by a viewer “down Williams Boulevard” in Kenner, Louisiana.

The blast of light is in fact an electrical explosion that traveled through power lines and caused sparks to fly. “Thousands” of Kenner residents lost power as a result, according to WWL-TV.

The explosions were caused by severe weather and high winds, according to this post from December 2018.

A blast of light appearing … in Chile

Another video shared on Facebook on August 14 shows a large beam that seems to hit a building in an urban area, supposedly the clear signature of a “directed energy weapon”.

A number of accounts shared the video, including this French-speaking user, who wrote, “What’s happening in Maui, Hawaii?”

On August 14, this account, which usually focuses on African news, published a montage of images of the Maui disaster. In the middle is this excerpt showing a beam hitting a building. © Facebook

But this video has nothing to do with the fires in Maui, as confirmed by AP in this article. The video actually comes from a TikTok post dated May 26, 2023. The person who posted it said that it was taken in the Macul district of Santiago, Chile. When it was reposted to support the “directed energy weapons” conspiracy, the video was enlarged and flipped, making it harder to see what was really going on.

A capture of the original TikTok, published on May 26. © TikTok

But what could have caused the beam seen in the original video? According to a report on Chilean television, the explosion was caused by a branch hitting an electrical transformer.

The beam itself is simply a refraction from the camera lens. In fact, if you play the video frame-by-frame, it’s possible to see that the explosion occurs before the beam appears, rather than the other way around.

In the video posted on TikTok, we first see the explosion (left, 0:00), then the beam appears (0:01). © TikTok
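Readers who want to reproduce this kind of frame-by-frame check can dump the opening frames of a clip to image files and inspect them one at a time. Below is a minimal sketch in Python using OpenCV; the file name “clip.mp4” is a hypothetical placeholder for a locally saved copy of the video.

```python
# Minimal sketch: read a video frame by frame, in playback order, and
# save the opening frames so the sequence of events (explosion first,
# beam second) can be inspected image by image.
import cv2

cap = cv2.VideoCapture("clip.mp4")  # hypothetical file name
n = 0
while n < 60:  # roughly the first two seconds at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imwrite(f"frame_{n:03d}.png", frame)
    n += 1
cap.release()
```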

An industrial incident at a refinery

On X (formerly Twitter), another account claims to have proof that directed energy weapons were the cause of the Maui fires. “They’re using Direct Energy Weapon (sic) to try and advance their climate agenda”, this post, in French, explains.

The post contains a low-quality image that appears to show a beam causing an explosion. Another post with the same photo and a caption in English claims: “I can confirm this, this was #DEW (Direct Energy Weapon) They have been using these is (sic) Canada Australia and other places.”

This Twitter account, which regularly publishes conspiracy content about the fires in Hawaii, believes that this image, posted on August 11, is proof of the use of directed energy weapons. © Twitter

Once again, the image has been debunked. Snopes, an American fact-checking outlet, was able to find the original context of this scene. It is in fact an incident that took place in January 2018 at a refinery in the city of Canton, in the US state of Ohio. It was reported in the local press. An Internet user also shared this photo of the event in the comments of a Facebook post by The Canton Repository.

A screenshot of the first occurrence of this image, posted on Facebook in 2018. © Facebook

Again, there is no connection with the Maui fires. Similar claims attributing wildfires to government conspiracies or high-tech weapons have proliferated in recent months. Last June, we debunked a claim that called into question the cause of fires in Canada.

Read more: No, these satellite images aren’t proof that the Canadian wildfires are a conspiracy

The idea that forest fires are caused by laser weapons, known as the “DEW theory” for “Directed Energy Weapon”, is not new.

According to Mick West, an American journalist specialising in fact-checking, it “emerged in the early 2000s, particularly after the attacks of September 11, 2001”.

At the time, certain conspiracy theories claimed – wrongly – that the collapse of the Twin Towers had been caused by laser weapons. The same theory was later applied to forest fires.




Animals that are too cute to be true: how to detect AI-generated images

Whether it is a baby sloth hanging onto someone’s thumb, a tiny colourful peacock or baby penguins taking a selfie, these insanely cute images of animals have been shared thousands of times on social media in recent months by people unable to resist them. But it turns out that all three of these images were generated by AI. We’ve written up a list of tips so that you won’t be duped just because something is wildly cute.

Issued on:

6 min

What do these three images have in common? They’ve all gone viral since early May, for one. They also all feature tiny, unbelievably cute animals.

This image of a baby sloth cupped in someone’s hand, for example, garnered more than 265,000 views and was shared more than 3,000 times on Twitter.

The image of this cute baby sloth was actually generated using artificial intelligence. © Twitter

On Instagram, a photo of a tiny baby peacock has garnered more than 5,800 “likes” since April 28, when it was posted by an account called “Beautiful nature”.

This teeny, tiny peacock is a fake image generated by artificial intelligence – if you couldn’t tell. © Instagram / Birdlovers_

However, the image that got the most engagement online is one of baby penguins seemingly taking a selfie. Since it was posted on July 7, it has garnered more than 46 million views on Twitter.

This image of baby penguins garnered more than 40 million views … turns out, it was generated by artificial intelligence. © Twitter / @shouldhaveanima

These three images have more in common than just being cute: they were all generated by artificial intelligence. That means these aren’t real photos – and these animals aren’t real either.

How plain old observation is often your best tool for spotting fakes 

Janne Ahlberg, founder of the site Hoaxeye, which identifies fake images circulating on social media, took a look at these three images.

Ahlberg told our team that the easiest way to spot an image generated by artificial intelligence is to look for “artifacts” or issues with the digital image. Essentially, the tools that generate images using artificial intelligence don’t always get it right and often leave behind errors.

Take a look at the baby sloth picture, for instance. Two errors stand out right away. First of all, there is an issue with the nail on the person’s thumb – it looks like it has been cut in two. There is also something odd going on in the bottom left of the image.

Also, there’s a factual error. Sloths only have two or three fingers, which isn’t the case with the baby animal in this picture.

As you can see, the nail on the thumb looks a bit dodgy – one of our first clues that this might have been generated using artificial intelligence. © Observers

Something else gives away the image of the tiny peacock. There are two points of focus in the image – the little bird’s head and its feet – while the rest of the image is blurry. A real photo can’t have two different focus points.

And, again, there is a factual error here, too: baby peacocks are usually brown or grey and, to be honest, pretty ungainly looking. It takes them a few years to develop their colourful feathers.

There are two points of focus in this image – the peacock’s head and its feet – while the rest of the image is blurry. A photograph can’t actually have two points of focus. © Observers

As for the little penguins, if you take a closer look, you’ll see that something isn’t quite right with one of the birds in the background – it seems to be made up of two images that aren’t perfectly aligned. This is a sign that the artificial intelligence wasn’t able to generate this part of the image.

You can see an error with one of the penguins in the background – the image of its head isn’t aligned. © Observers

A tool to help detect fakes – but use with care

There are several online tools that you can use to detect images generated by artificial intelligence. Our favourite as of July 2023 is Hive Moderation. If you upload an image to the site, it will give you a percentage indicating how likely it is that the image was generated by artificial intelligence.

We ran these three photos through Hive Moderation, which told us that there was between a 99.6 percent and 99.9 percent likelihood that the images were generated by artificial intelligence. Good call, HiveMod.

A tool called Hive Moderation concluded that these three images were generated by artificial intelligence. © Hive Moderation / Observers
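
For checking images in bulk, this kind of lookup can also be scripted. The sketch below shows the general pattern using Python’s requests library; the endpoint URL, authentication header and response field are illustrative assumptions about how a typical detection API works, not Hive Moderation’s documented interface.

```python
# Hedged sketch: querying an AI-image detector over HTTP. The endpoint,
# header and response shape are illustrative assumptions, not Hive
# Moderation's documented API.
import requests

API_URL = "https://api.example-detector.com/v1/classify"  # placeholder
API_TOKEN = "YOUR_API_TOKEN"                              # placeholder

def ai_generated_score(image_path: str) -> float:
    """Return the detector's estimated probability that an image is AI-generated."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"image": f},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape: {"ai_generated": 0.996}
    return response.json()["ai_generated"]

if __name__ == "__main__":
    for path in ["sloth.jpg", "peacock.jpg", "penguins.jpg"]:  # placeholders
        print(path, ai_generated_score(path))
```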

The site claims to be 99 percent accurate in determining whether images are AI-generated. However, like any machine-learning system, the tool is only as reliable as the images it was trained on, and it can be fooled.

A number of social media users, many of them based in Japan, proved they were able to fool the tool: it gave very different responses when the images were modified even in minor ways. It therefore can’t be considered a foolproof tool for analysis.

The photo on the upper left is real but has been modified with the app FaceApp. Hive Moderation, however, concluded that it was generated by artificial intelligence. The same photo, on the bottom right, has been slightly altered in a different way and, bizarrely, the tool no longer says it was generated by artificial intelligence. © ken5bt

So why are people sharing fake cute animal images?

Everyone on the internet – or nearly everyone – seems to love a good animal photo. But that doesn’t explain why people are trying to pass off images of cute animals generated by AI as the real deal.

Our team asked expert Janne Ahlberg of the site Hoaxeye:

I guess that, first of all, these cute animals have become popular simply because people seem to like them. And because they work, a lot of people copy the technique.

The accounts that share AI-generated images trying to pass them off as real have different goals. But a lot of them just want to get a lot of likes and followers.

Some accounts are more financially motivated. Some of the accounts are hoping to be sold off once they have a lot of followers [editor’s note: on the black market, people will pay hundreds of dollars to get thousands of followers].

This is a screengrab of a website where it is possible to buy several thousand followers for about a hundred dollars. Our team blurred out the names of the accounts. © Observers

Back in January 2018, we interviewed Ahlberg for another article (see below), asking why so many accounts were trying to pass off fake videos of incredible natural phenomena as real. In 2023, the subject might be different, but the idea remains the same.

For more on this topic: Debunked: When amazing nature shots are a bit too good to be true




No, Elijah Wood did not address Volodymyr Zelensky with an offer to cure his alleged addictions

A video has been circulating online which allegedly shows “Lord of the Rings” actor Elijah Wood giving the Ukrainian president advice on how to tackle his alleged drug and alcohol addictions. He can be seen recording himself and talking to a so-called “Vladimir”. However, the video has been heavily edited and his agent has confirmed it was not published by the actor.

Issued on: Modified:

5 min

If you only have a minute:

  • A video has been shared online that allegedly shows “Lord of the Rings” star Elijah Wood urging Ukrainian President Volodymyr Zelensky to seek treatment for drug and alcohol use.

  • However, the video was edited using a voice recording from an unknown source in which Elijah Wood can be heard speaking to a man named “Vladimir”.

  • Elijah Wood’s agent has confirmed that the actor does not have a public Instagram account and that he did not post the video.

  • The video has a QR code on it that sends users to a fake Netflix series criticising the International Olympic Committee.

The fact-check in detail:

At the beginning of the video, the actor can be heard saying “Vladimir, hi, it’s Elijah! I hope you’re well and in good health.” He then goes on to say: “you have serious problems with drugs and alcohol. I hope you’re taking care of yourself. We know people who can help you.” The video has the graphics of an Instagram story. It has been seen more than 2 million times on Twitter in English and more than 10,000 times in French.

Several users who shared the video claim that the actor is addressing Ukrainian President Volodymyr Zelensky, with an offer to cure his alleged drug and alcohol addictions. Others added that the Instagram account in question has been closed by Meta, the company that owns the social network. This post was also shared on various Russian-speaking Telegram accounts, such as this one, where it has racked up more than 760,000 views.


A number of Russian-language media outlets, such as Tsargrad.tv and 5tv.ru, referred to the video, but they did not identify the source.

Pro-Russian accounts regularly accuse Volodymyr Zelensky of being a drug and alcohol addict, and often manipulate videos to illustrate these claims.

An edited video from a fake Instagram account

Several visual clues suggest that the video was not posted to Instagram and that it was edited.

First, the actor makes no explicit mention of Volodymyr Zelensky or the situation in Ukraine. He addresses the video to “Vladimir”, as opposed to “Volodymyr”. The video is also heavily edited to cut out several sentences and obscure the exact context.

The video features three other elements that enabled us to debunk it: the logo of American media outlet TMZ, the tag “@zelenskiy_official” and a link to the website www.hazeldenbettyford.org, which helps people who suffer from substance abuse.

However, the video does not appear on any of TMZ’s social networks and their logo looks different on their other videos. Our editorial team contacted TMZ, which responded that “This is clearly something that someone posted on social networks and that has nothing to do with TMZ. We have nothing to do with this video.”

Elijah Wood does not have an official, public Instagram account; in March 2023, he said that he used a private account. However, the account that the video was published from looks like a verified account with over 1.5 million followers. We took a freeze-frame of the video, which reveals the full name of the account (otherwise partly hidden by the TMZ logo): elijah.wood.kingring.

The account name “elijah.wood.kingring” can be seen when freezing the fake Instagram story. © Observers

The elijah.wood.kingring account does exist on Instagram, but it has no followers and only follows three accounts, including that of Volodymyr Zelensky.

elijah.wood.kingring’s Instagram account in July 2023 © Instagram / @elijah.wood.kingring

The FRANCE 24 Observers team contacted JoAnne Colonna, Elijah Wood’s agent, who explained that “Elijah does not have an Instagram account. He is going to make a comment on his Twitter account about this video.” The tweet in question had not been published at the time of writing, but we will include it once it becomes available.

The origin of the video has not been disclosed by Elijah Wood or his team. Some Twitter users claim that it was made using Cameo, a site where you can buy a video message recorded by Elijah Wood for $340. We have not been able to verify this suggestion.

Furthermore, in February 2022, just after Russia invaded Ukraine, the actor publicly expressed his support for Kyiv, and he has never publicly criticised the Ukrainian president.

A covert operation promoting a fake series?

Another clue that the video is fake is the QR code on “Elijah Wood’s” fake Instagram account, which sends users from the photos and stories there to a so-called “Olympics has fallen” series.

The poster for this alleged series also features American actor Tom Cruise, president of the International Olympic Committee Thomas Bach, sports director of the same committee Kit McConnell, and former president of the International Amateur Boxing Association Wu Ching-Kuo.

Screenshot of the Telegram account “Olympics has fallen” on July 25, 2023. © Telegram

The QR code sends users to a Telegram account that was created at the beginning of July and shows four episodes of the series. The first episode even includes the Netflix logo. In it, Tom Cruise appears onscreen, greets viewers, and then explains, off camera, what the aim of the series is: to reveal to the world the behind-the-scenes workings of the International Olympic Committee, described as a “venal committee that has been destroying the Olympic spirit of sport for years”.
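
Checking where such a QR code leads does not require scanning it with a phone: it can be decoded directly from a screenshot. Here is a minimal sketch using OpenCV’s built-in QR detector; the filename is a placeholder.

```python
# Minimal sketch: decode a QR code from a saved screenshot to see
# where it points, without opening the link on a phone.
import cv2

image = cv2.imread("story_screenshot.png")  # placeholder filename
if image is None:
    raise SystemExit("Could not read the screenshot.")

detector = cv2.QRCodeDetector()
decoded_text, points, _ = detector.detectAndDecode(image)

if decoded_text:
    print("QR code resolves to:", decoded_text)
else:
    print("No QR code found or decoding failed.")
```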

However, the series does not exist, and Tom Cruise never made it. There has been no reference to the series anywhere in the news in recent weeks. The actor has not promoted it on his verified networks, and the only references to it can be found on Telegram accounts that voice support for Russia.

The Telegram account “Olympics has fallen” has also posted fake messages supposedly written by several American celebrities, including Jared Leto, Miley Cyrus and Mike Tyson, who are said to have congratulated Tom Cruise on his series. 

Three fake stories from American stars were shared to make it look like they were congratulating Tom Cruise on this fake series.
Three fake stories from American stars were shared to make it look like they were congratulating Tom Cruise on this fake series. © Observers

The Telegram channel, which is followed by around 2,500 people, hasn’t had a huge impact, with the most viewed episode garnering a maximum of 90,000 views.




No, this video does not show the Wagner Group ‘surrendering’ in Sudan

Since the start of the war in Sudan between government troops and the paramilitary Rapid Support Forces (RSF), the role of Russia’s Wagner mercenary group, said to have links to the paramilitary forces, has remained unclear. Against this backdrop, social media users shared a video which they claimed shows Wagner soldiers surrendering to the Sudanese army. But the video was actually filmed during the evacuation of Russian embassy staff from Sudan by regular Russian troops at the start of the conflict in the spring of 2023.

Issued on:

5 min

If you only have a minute:

  • A video of a military convoy is being shared with captions suggesting the Wagner militia in Sudan has “surrendered” to the Sudanese armed forces.
  • We were able to geolocate the video to Khartoum, Sudan.
  • Our team compared it with images taken during the evacuation of Russian diplomats from Sudan in May. We determined that one of the vehicles in the viral video is the same as one of the vehicles used during the evacuation, which allows us to conclude that the video was filmed during the evacuation of diplomatic staff from the Russian embassy on May 2, 2023.
  • The Wagner Group is said to have links to General Mohamed Hamdane Daglo, whose RSF forces have been fighting the government since April.

The fact-check in detail:

A video was posted on Twitter on July 16 (archive here) with a caption claiming it shows soldiers employed by the Wagner Group, the paramilitary organisation that defends Russia’s interests abroad, talking with government forces in Sudan.

A man in uniform can be seen taking a video of himself next to a man speaking Russian on the phone.

Behind them is a white car, followed by at least four military vehicles marked with a “Z”, a symbol painted on Russian military equipment involved in the war in Ukraine. The video garnered more than 86,000 views.

A July 16 tweet purporting to show Wagner mercenaries in Sudan. © Observers / Twitter screenshot @khalidalbaih

Posts shared the same day in English and Arabic on Facebook claimed that the video showed Wagner mercenaries surrendering to Sudanese troops. They garnered more than 11,000 views.

Video posted on Facebook on July 16, allegedly showing the Wagner group surrendering to Sudanese armed forces. © Observers / Twitter screenshot

Wagner is said to have links with the RSF of General Mohamed Hamdane Daglo, known as Hemedti. But since the RSF began fighting the regular Sudanese army, Wagner’s role and position in Sudan have been unclear. Wagner founder Yevgeny Prigozhin, who moved the group’s base to Belarus after a short-lived rebellion against Russian President Vladimir Putin in June, has said there are no Wagner personnel on the ground in Sudan.

A video taken in Khartoum

Some elements of the video raise doubts. The letter “Z” is painted on the vehicles – a symbol that has often been used by the regular Russian army since the start of the war in Ukraine, but rarely by Wagner’s mercenaries in Africa.

Sam Doak, a journalist at the British fact-checking outlet Logically Facts, was able to identify the location where the video was taken.


In the footage, a blue petrol station can be seen in the background. If you search for petrol stations in Sudan, you can see that the brand Oil Libya matches this colour. 

By searching for the petrol station in Khartoum, the city where the scene could have been filmed according to comments from Twitter users, we can find the exact location where the video was taken.

This is a petrol station in Khartoum North. Although there is no street-level imagery of the area on Google Maps, the aerial view shows the petrol station, the billboard and the large white building behind the station.

The images at the left are stills from the viral video. The image on the right shows the Google Maps aerial view of the site. You can see the petrol station (outlined in purple), the billboard (outlined in green), the small brown building (outlined in orange), and the large white building (outlined in pink) behind the station. © Observers / Google Maps

Video shows Russian embassy staff being evacuated in May 

Was this video taken recently? In the comments, people suggested that the video could have been filmed when the Russian army evacuated civilians from Sudan at the start of the fighting.

Our team searched for images on the Telegram channel of the Russian embassy in Sudan and found a May 2 image of the white van that appears in the viral video. It is exactly the same white Toyota HiAce, with the same green and brown luggage mounted on the roof and wrapped in netting, alongside a Russian flag.

This photo was taken during the evacuation of part of the Russian embassy staff on the morning of May 2, 2023, according to the Russian authorities.

A search for other images of the evacuation turns up a photo of a military vehicle belonging to the Russian convoy. It was shared by a pro-Russian account at the time of the evacuation.

This vehicle looks very similar to another car seen in the viral video. The number plate is similar, albeit with a one-digit difference, and both vehicles have luggage and a Russian flag on the roof.

The image at the left is a still taken from the viral video. The image on the right was published on the Telegram channel of the Russian embassy in Sudan. You can see the same white Toyota HiAce van, with the same luggage on the roof and the Russian flag. © Observers

A press release issued by the Russian Ministry of Foreign Affairs explained that the embassy staff were evacuated on May 2 in a convoy that took them to the Al-Shahid Mukhtar air base in the town of Omdurman, near Khartoum, before flying to Moscow.

Amateur videos appear to show the convoy en route.

More than 200 people, including Russian embassy staff, representatives of the Ministry of Defence, Russian citizens and citizens of other countries allied with Russia were reportedly evacuated on the same day by the Russian armed forces.

Wagner’s role in Sudan still unclear

The Wagner Group forged a partnership in 2018 with then-President Omar al-Bashir to illegally exploit the country’s gold resources, as an investigation by an international consortium of journalists has documented.

At the same time, the Russian militia developed relations with General Mohamed Hamdane Daglo, also known as Hemedti, and his paramilitary group, the RSF. Allies of al-Bashir before his ouster in 2019, Daglo and his RSF joined the movement that toppled him, and in April 2023 took up arms against government forces.

An investigation by the open-source investigative organisation All Eyes on Wagner and CNN in April 2023 suggests that Wagner supplied missiles to the RSF to support their fight against the Sudanese army. 

All Eyes on Wagner has also developed an interactive map listing the Russian militia’s activities and human rights abuses around the world.


