Audit expert: Liu Xiangquan, Director of the Robotics Engineering Department, Beijing Information Science and Technology University

AI face-changing: seeing is no longer believing! In an age when AI can make fake things look real, how can we ensure authenticity?

Recently, the Hong Kong police disclosed an AI "multi-person face-changing" fraud case involving 200 million Hong Kong dollars. The victim, an employee of a multinational company's Hong Kong branch, was invited to a "multi-person video conference" initiated by the chief financial officer at headquarters and, as instructed, transferred HK$200 million in several installments to five local bank accounts.

Source | CCTV.com

Only when the employee later checked with the head office did he discover that he had been cheated. After investigation, the police found that in that video conference only one participant was a real person; every other participant had been forged by scammers using AI face-changing technology. It is understood that the scammers copied the appearance and voices of the company's senior executives from the company's YouTube videos and other publicly available material, then used deepfake technology to create fraudulent video feeds that made the victim believe he was communicating with real colleagues in a multi-person video conference.

What is deepfake technology?

"Deepfake" is a portmanteau of "deep learning" and "fake". Originally, this AI technique for altering videos or pictures was simply an interesting breakthrough: it could map one person's facial expressions, speech and movements onto another person's picture or video and make the result look remarkably real.

Source | CCTV.com

Because the technology is convenient, low-cost and realistic, it has opened up new possibilities in art, social interaction and medicine. But because it also makes fake videos much easier to produce, criminals can use it to deceive people, spread false information or stir up social conflict, which poses a serious risk to our safety. In the face of AI, the old rules of "a picture is proof" and "seeing is believing" have been overturned.

What does AI fraud generally look like?

In everyday life, when a relative or friend asks to borrow money online, cautious people will usually make a voice or video call to confirm the other party's identity before transferring any money. But what if the "relative" or "friend" who answers the call has also been forged by scammers using AI?

Source | Xinhua News Agency

Scams that use AI technology usually involve the following steps:

1. Steal the target's social media account and look for potential victims;
2. Use AI robocalls to place harassing phone calls, collect voice samples of the target's relatives and friends, and clone their voices;
3. Collect photos, videos and other material about the target's relatives and friends through various channels, and impersonate them with AI face-changing;
4. Use AI face-changing technology to make video calls with the victim, gain their trust, and carry out the fraud.

After these steps, the old precaution of asking for a voice message or a video call is no longer effective against AI-assisted fraud.

Source | The Paper

What is even more worrying is that the number of scammers who have mastered AI technology is growing.
Public data shows that since 2021, cases of using AI for fraud and illicit profit have occurred in many provinces. The Internet Society of China has also pointed out that illegal activities such as fraud and defamation carried out with "AI face-changing" and "AI voice-changing" technologies have become frequent.

Learn a few tricks to identify AI forgeries in time

How can we guard against fraudulent methods that look so real?

1. Ask the other person to wave a hand in front of their face, or to pinch their nose or press their cheeks

During a video call, waving a hand in front of the face interferes with the facial data, so a fake face may jitter, flicker or show other anomalies; asking the other party to wave in front of their face is therefore a useful check (a code sketch of this "jitter" idea appears at the end of this article). We can also ask them to press their nose or cheek and watch how the face changes: a real nose deforms when pressed, while an AI-generated one does not. However, as AI models become better trained, pinching the nose is no longer the most reliable test on its own.

Source | CCTV.com

2. Ask questions that only the other person could answer

Although a person's appearance can be faked, personal memories and shared experiences cannot be copied. During the video call, we can therefore ask questions that only the other party would know, such as a home address, a birthday or a pet's name. If the other party hesitates or cannot answer accurately, it is very likely a scam.

Source | Wenxin Yige (AI-generated image)

"When news, images, audio and video can all be faked, then in that world nothing is real," said Farid, associate dean of the School of Information at the University of California, Berkeley. As we enter an era in which AI can make fake things look real, the challenge of ensuring authenticity seems more daunting than ever. But we are not helpless in the face of this technological shift. Through better technological literacy, global cooperation and public education, we can meet the challenge and preserve the authenticity and credibility of information.
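For readers who want to see what the "jitter" check from trick 1 could look like in software, here is a minimal sketch, not a method described in the article itself: it measures frame-to-frame change inside the detected face region of a recorded call and flags unusually large jumps. It assumes OpenCV and NumPy are installed; the file name call_recording.mp4 and the three-sigma spike rule are illustrative assumptions only.

```python
# Minimal sketch (illustrative assumptions): look for abnormal frame-to-frame
# jumps in the face region of a recorded video call, the kind of "jitter" that
# waving a hand in front of the face can provoke in a crude face-swap feed.
import cv2
import numpy as np

# Standard Haar face detector that ships with OpenCV
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("call_recording.mp4")  # hypothetical recording of the call
prev_face = None
diffs = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        continue
    x, y, w, h = faces[0]                                  # track the first face found
    face = cv2.resize(gray[y:y + h, x:x + w], (128, 128))  # normalise crop size
    if prev_face is not None:
        # Mean absolute pixel change between consecutive face crops
        diffs.append(float(np.mean(cv2.absdiff(face, prev_face))))
    prev_face = face

cap.release()

if diffs:
    mean, std = np.mean(diffs), np.std(diffs)
    spikes = [d for d in diffs if d > mean + 3 * std]  # crude anomaly rule
    print(f"frames analysed: {len(diffs)}, suspicious jumps: {len(spikes)}")
```

The idea is simply that a hand passing in front of the face briefly occludes it, and a poorly built face-swap pipeline often fails to render that occlusion consistently from frame to frame, which is exactly the kind of spike this sketch counts.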
>>: "I am your wall breaker!" Air & Water: ...
A few days ago, I saw a news report that a Japane...
In real life, for ordinary people like us, is the...
The progress of Chinese brands in recent years is...
I wonder if you have noticed that both Eastern an...
Many people find that their total cholesterol lev...
Generative AI technology is very popular this yea...
Although Android still has many problems, it is u...
Through this article, you can learn how to formul...
"HiFi mobile phone" - when hearing this ...
Pancreatic neuroendocrine cell tumor Amyotrophic ...
Huawei App Market Brand Resource Bidding Promotio...
Doushen Chinese Language Peking University Master...
Generally speaking, besides the basic display fun...
The 2016 China Show Entertainment Market Research...
In recent years, Kuaishou has combined its own po...