The 4 best smart jump ropes (for Android and iOS) in 2024
How to turn off the camera sound on an iPhone
How to get Wi-Fi access anywhere at any time
How to track an Android phone (or other device)
How to downgrade from Windows 11 to Windows 10
The rise of AI-generated voices mimicking celebrities and politicians could make it even harder for the Federal Communications Commission (FCC) to fight robocalls and keep people from being spammed and scammed. That’s why FCC Chairwoman Jessica Rosenworcel wants the commission to officially recognize calls that use AI-generated voices as “artificial,” which would make the use of voice-cloning technology in robocalls illegal. Under the Telephone Consumer Protection Act (TCPA), which the FCC enforces, solicitations to residences that use an artificial voice or a recording are already against the law. As TechCrunch notes, the FCC’s proposal would make it easier to go after and charge bad actors.
“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” FCC Chairwoman Jessica Rosenworcel said in a statement. “No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls.” If the FCC recognizes AI-generated voice calls as illegal under existing law, the agency can give State Attorneys General offices across the country “new tools they can use to crack down on… scams and protect consumers.”
The FCC’s proposal comes shortly after some New Hampshire residents received a call impersonating President Joe Biden, telling them not to vote in their state’s primary. A security firm analyzed the call and determined that it was created with AI tools from a startup called ElevenLabs. The company has reportedly banned the account responsible for the message mimicking the president, but the incident could end up being just one of many attempts to disrupt the upcoming US elections with AI-generated content.
This article originally appeared on Engadget at https://www.engadget.com/the-fcc-wants-to-make-robocalls-that-use-ai-generated-voices-illegal-105628839.html?src=rss
Apple TV Plus clears Hijack season 2 for takeoff as hit Idris Elba thriller show gets renewed
Hideo Kojima is working on a new action-espionage game with Sony which he hopes will ‘transcend the barriers between film and video games’
Someone just found a funny surprise hidden inside the Apple Vision Pro headset
Next on Netflix 2024: start date and time, where to watch, movies, TV shows, and more