news oceon

The underestimated dangers of deepfake geography

Deepfake satellite imagery poses an increasingly serious threat to national security, according to researchers. But what exactly are we talking about? And what are the dangers?

Deepfakes are synthetic media created with artificial intelligence (AI) by superimposing or altering existing images, audio and video, producing entirely fabricated content. Until now, concerns about deepfakes have centered on machine-manipulated videos of celebrities and world leaders appearing to say or do things they never actually said or did.

Faced with these threats, large tech companies such as Amazon, Facebook and Microsoft have jointly launched a challenge to detect such fakes.

However, while fake political speeches and celebrity pornography spreading on social media have received widespread public attention in recent years, another threat should not be underestimated: doctored images of the Earth itself.

Deepfake geography

The growing convergence of AI and Geographic Information Systems (GIS) has enabled dramatic advances in the field of geospatial artificial intelligence. However, in recent years, researchers have also witnessed unintended and problematic consequences of this convergence, such as fabricated GPS signals and fake images of geographic environments.

These fakes have not yet proliferated, but some scientists are increasingly concerned about the spread of such AI-generated data. And for good reason: this kind of information could be misleading in various ways. Humans have been "lying" with maps for about as long as maps have existed. But the consequences might not be the same today.

This false information could notably be used to discredit stories based on real satellite imagery. Writing for The Verge, James Vincent takes the example of the Uyghur detention camps in China, whose existence gained credibility thanks to satellite evidence. As deepfake geography becomes more widespread, he notes, the Chinese government could claim that those images are fake, too.

Satellite images of what looks like a Uyghur detention camp in China. Credit: ASPI

A threat to national security

This type of technique could also become a national security issue for countries whose geopolitical adversaries rely on such fakes to deceive them.

Todd Myers, automation manager for the CIO Technology Directorate at the National Geospatial-Intelligence Agency, warned the U.S. military about this prospect as early as 2019, following China's progress in the area. Using an emerging technique called generative adversarial networks (GANs), computers can indeed be made to "see" objects in landscapes or satellite images that do not exist.
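The adversarial idea behind GANs can be sketched in miniature. The toy below is purely illustrative (real image-synthesis GANs use deep convolutional networks, and every name and parameter here is invented for this example): a one-parameter-pair "generator" learns to produce numbers that a logistic "discriminator" cannot tell apart from samples of a real distribution, which is the same two-player training loop that powers image forgery.

```python
import numpy as np

# Minimal 1-D GAN sketch. The generator is a linear map a*z + b and the
# discriminator a single logistic unit sigmoid(w*x + c). All parameter
# names and hyperparameters are illustrative, not from the article.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_toy_gan(real_mean=4.0, steps=3000, lr=0.05, batch=64, seed=0):
    rng = np.random.default_rng(seed)
    a, b = 1.0, 0.0   # generator parameters
    w, c = 0.0, 0.0   # discriminator parameters
    for _ in range(steps):
        xr = rng.normal(real_mean, 1.0, batch)   # "real" samples
        z = rng.normal(0.0, 1.0, batch)          # generator noise input
        xf = a * z + b                           # "fake" samples
        dr, df = sigmoid(w * xr + c), sigmoid(w * xf + c)
        # Discriminator ascends log D(real) + log(1 - D(fake))
        w += lr * (np.mean((1 - dr) * xr) - np.mean(df * xf))
        c += lr * (np.mean(1 - dr) - np.mean(df))
        # Generator ascends log D(fake) (non-saturating loss)
        df = sigmoid(w * (a * z + b) + c)
        grad_xf = (1 - df) * w
        a += lr * np.mean(grad_xf * z)
        b += lr * np.mean(grad_xf)
    return a, b

a, b = train_toy_gan()
# After training, generated samples a*z + b cluster near the real mean:
# the discriminator can no longer tell real from fake.
```

The same arms race plays out at scale: as the generator improves, the discriminator's job gets harder, which is also why fake-detection tools need constant retraining.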

The analyst imagined a scenario in which military planning software is tricked with false data showing a bridge in an incorrect location. "From a tactical point of view, you could then train your forces to follow a certain route toward that bridge, when in reality there is no bridge," Myers explained at the time. "And then a big surprise awaits you."

Raise awareness and counter the problem

For Bo Zhao of the University of Washington, the first step in tackling this problem is to recognize the threat. In a recent paper, the researcher details how he and his team created their own AI-generated satellite images. As he explains to The Verge, the aim was to "demystify the idea that satellite images are absolutely reliable" and to "raise awareness of the potential influence of deepfake geography." According to him, his paper is probably the first to address these fakes in this field.

As part of their study, Zhao and his colleagues also built detection software capable of spotting satellite counterfeits based on characteristics such as texture, contrast and color. They point out, however, that such a tool would need constant updates to keep pace with improvements in deepfake techniques.
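The article does not publish the team's detector, but the feature-based idea it describes (texture, contrast, color) can be loosely illustrated. In this sketch the feature definitions, the nearest-centroid classifier and the synthetic "images" are all invented for the example; a real detector would use far richer features and learned models.

```python
import numpy as np

def image_features(img):
    """Summarize an image (H x W x 3 floats in [0, 1]) by simple
    color, contrast and texture statistics."""
    mean_color = img.mean(axis=(0, 1))      # average R, G, B
    contrast = img.std()                    # global contrast
    gy, gx = np.gradient(img.mean(axis=2))  # luminance gradients
    texture = np.hypot(gx, gy).mean()       # mean edge strength
    return np.concatenate([mean_color, [contrast, texture]])

class CentroidDetector:
    """Nearest-centroid classifier over the feature vectors."""
    def fit(self, feats, labels):
        feats, labels = np.asarray(feats), np.asarray(labels)
        self.centroids = {c: feats[labels == c].mean(axis=0)
                          for c in np.unique(labels)}
        return self

    def predict(self, feat):
        return min(self.centroids,
                   key=lambda c: np.linalg.norm(feat - self.centroids[c]))

# Toy demonstration: smooth, uniformly colored tiles stand in for
# "real" imagery, pure pixel noise for "fake" imagery.
rng = np.random.default_rng(1)
real_imgs = [np.clip(np.tile(rng.uniform(0.2, 0.8, 3), (32, 32, 1))
                     + rng.normal(0, 0.01, (32, 32, 3)), 0, 1)
             for _ in range(5)]
fake_imgs = [rng.uniform(0, 1, (32, 32, 3)) for _ in range(5)]
detector = CentroidDetector().fit(
    [image_features(i) for i in real_imgs + fake_imgs],
    ["real"] * 5 + ["fake"] * 5)
verdict = detector.predict(image_features(rng.uniform(0, 1, (32, 32, 3))))
```

The caveat the researchers raise applies directly here: fixed statistics like these are easy for the next generation of generators to mimic, so any such detector needs continual retraining.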

