The danger of deepfake technology

What is a deepfake?

For those who don’t know what a deepfake is yet, Wikipedia has a good explanation:
Deepfakes (a portmanteau of "deep learning" and "fake") are synthetic media in which a person in an existing image or video is replaced with someone else's likeness.

Developments

Manipulating audio, video, or images is nothing new. There have been many cases of celebrities whose faces were placed on the bodies of porn actresses, but if you paid a bit of attention to the video, you could see it was fake.
By using powerful techniques from machine learning and artificial intelligence, deepfakes are getting better every day, and thereby more difficult to distinguish from reality.
Another consequence of the rapid development of these techniques is that they are becoming more accessible. A couple of years ago you needed expensive software, but nowadays you can easily create a deepfake with your smartphone. The funny Snapchat filters we’re using are actually fakes too.

Check out this video:

So, with deepfakes you can make people say things they never said. We’ve probably all seen videos of Trump or other presidents saying funny things, but deepfakes can also be a serious threat.

Threats

These techniques are not only used for good; cybercriminals have also embraced deepfake technology.
It gives them a powerful tool to deceive their victims.
A week ago I wrote a blog post about WhatsApp fraud, where criminals pretend to be a friend of the victim and ask them to send money because of supposed financial problems.
https://leofinance.io/@friendlymoose/huge-increase-of-whatsapp-fraud
But what if you received a voicemail or even a video call from a friend asking you for money? That would be far more convincing than just some text messages.

And if you think this isn’t going to happen anytime soon: wrong! It already has!
In 2019 there was a case where criminals used deepfake audio to impersonate a chief executive’s voice and demand a fraudulent transfer of €220,000. The subordinate recognized the voice of the CEO and made the transfer.

Source: https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402

And now what?

We’ve arrived at a point where we can no longer simply trust everything we see, and there are few reliable ways to distinguish fake from real.
What is the solution? I don’t know. For now, we just have to be very careful.

Posted Using LeoFinance Beta



Even worse, a deepfake video of an innocent person committing a murder. These things can also be used for good, such as having a celebrity appear to comfort a dying fan. Technology can go either way.

It's also used in cases of child loss. Parents are able to see their child who passed away and even talk with them. It is a bit strange, but it can help with the bereavement.
