AI becoming a handy tool for US fraudsters

By BELINDA ROBINSON in New York | China Daily Global | Updated: 2023-07-28 07:02

Technology being employed to clone people's voices for ransom, govt warns

People in the United States are being warned to stay vigilant against a growing number of scams in which artificial intelligence is used to mimic a person's voice on a phone call to a worried relative or friend, who is then asked to send ransom money.

The Federal Trade Commission, or FTC, issued a consumer alert this year after a rise in reports from people who had been asked to send money during a frantic phone call from someone they believed was a loved one but whose voice had in fact been cloned with AI.

Jennifer DeStefano from Scottsdale, Arizona, experienced the crime firsthand. She told a US Senate judiciary hearing last month that she got a call from an unlisted number in April, and when she picked up, she could hear her daughter, Briana, crying.

"Mom! I messed up," her daughter said sobbing on the phone call.

DeStefano asked her daughter, "OK, what happened?"

She then heard a man's voice on the phone telling her daughter to "lay down and put your head back".

He then told the worried mother: "Listen here, I have your daughter. You tell anyone, you call the cops, I am going to pump her stomach so full of drugs."

DeStefano was at her other daughter Aubrey's dance rehearsal when she picked up the phone. She put the phone on mute and asked nearby parents to call 911.

The scammer first asked her to send $1 million, but when she said she did not have access to that much money, he asked for $50,000 in cash and arranged a meet-up spot.

The terrified mother said the man on the phone told her that "if I didn't have all the money, then we were both going to be dead".

However, she contacted her husband and daughter and found out Briana was safe, and it was a hoax.

Cybercrimes on rise

Last year, frauds and scams rose 30 percent compared with the previous year, the FTC said. Cybercrime is also on the rise, with losses of $10.2 billion last year, the FBI said.

Scammers use AI to mimic a person's voice by obtaining "a short audio clip of your family member's voice from content posted online and a voice-cloning program", the consumer protection watchdog said. When they call, they will sound just like the person's loved one.

In another scam, a Canadian couple was duped out of C$21,000 ($15,940) after listening to an AI-generated voice they believed was their son's, The Washington Post reported in March.

According to a recent poll by McAfee, an antivirus software company based in San Jose, California, at least 77 percent of AI scam victims have sent money to fraudsters.

Of those who reported losing money, 36 percent said they had lost between $500 and $3,000, while 7 percent got taken for anywhere between $5,000 and $15,000, McAfee said.

About 45 percent of the 7,000 people polled across nine countries (Australia, Brazil, France, Germany, India, Japan, Mexico, the United Kingdom and the US) said they would reply and send money to a friend or loved one who had asked for financial help via a voicemail or note.

Forty-eight percent said they would respond quickly if they heard that a friend was in a car accident or had trouble with their vehicle.

Although phone scams are nothing new worldwide, in this AI-driven version fraudsters are collecting payment in a variety of ways, including wire transfers, gift cards and cryptocurrency.

Consumers are being encouraged to contact the person they believe is calling, to check that they are OK, before ever sending money.

FTC Chair Lina Khan warned House lawmakers in April that fraud and scams were being "turbocharged" by AI and were of "serious concern".

Avi Greengart, president and lead analyst at Techsponential, a technology analysis and market research company in the US, told China Daily: "I think that it is hard for us to estimate exactly how pervasive (AI) is likely to be because this is still relatively new technology. Laws should regulate AI."

The software to clone voices is becoming cheaper and more widely available, experts say.

AI speech software ElevenLabs allows users to convert text into voice-overs meant for social media and videos, but many users have already shown how it can be misused to mimic the voices of celebrities, such as actress Emma Watson, podcast host Joe Rogan and columnist and author Ben Shapiro.

Other videos mimicking the voices of US President Joe Biden and former president Donald Trump have also appeared on platforms such as Instagram.
