As if we needed more reasons to be freaked out by increasingly powerful digital assistants, there's a new nightmare scenario: the music you listen to or the conversations you hear on TV could hijack your digital assistant with commands undetectable to human ears.
This is known as a "Dolphin Attack" (because dolphins can hear what humans can't), and researchers have been aware of the possibility for years. The basic idea is that commands could be hidden in high-frequency sounds that our assistant-enabled gadgets can detect, but we are unable to hear.
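To give a rough sense of the mechanics (this is an illustrative sketch, not the researchers' actual code): a voice command can be amplitude-modulated onto an ultrasonic carrier, say 25 kHz, which sits above the limit of human hearing but which nonlinearities in a microphone's hardware can inadvertently demodulate back into the audible range. All file names and parameters below are hypothetical.

```python
# Illustrative sketch of the "Dolphin Attack" idea: amplitude-modulate a
# recorded voice command onto an ultrasonic carrier. Humans can't hear
# ~25 kHz, but a microphone's nonlinear hardware can demodulate the
# envelope back into the audible band, where the assistant "hears" it.
# Parameters and file names here are hypothetical, for illustration only.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 192_000   # high sample rate needed to represent ultrasound
CARRIER_HZ = 25_000     # above the ~20 kHz ceiling of human hearing

# Load a recorded voice command (assumed mono), normalized to [-1, 1].
rate, command = wavfile.read("okay_google_command.wav")  # hypothetical file
command = command.astype(np.float64) / np.abs(command).max()

# Crude upsampling by repetition to the ultrasonic sample rate.
upsampled = np.repeat(command, SAMPLE_RATE // rate)

# Amplitude modulation: shift the command's spectrum up around the carrier.
t = np.arange(len(upsampled)) / SAMPLE_RATE
ultrasonic = (1.0 + upsampled) * np.cos(2 * np.pi * CARRIER_HZ * t)
ultrasonic /= np.abs(ultrasonic).max()  # keep samples within [-1, 1]

wavfile.write("inaudible_command.wav", SAMPLE_RATE, ultrasonic.astype(np.float32))
```

Actually playing a file like this back would require speaker hardware capable of reproducing ultrasound, which is part of why the attack was long considered more theoretical than practical.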
Researchers proved in 2016 that they could use the technique to trigger basic commands, like making phone calls and launching websites. At the time, they hypothesized that it might be possible to embed these audio cues into music and other recordings, which would significantly amp up the creepy factor.
Now, that day has come. In a paper first reported on by The New York Times, researchers proved it is in fact possible to hide audio inside of other recordings in a way that's nearly undetectable to human ears.
The researchers were able to do this using recordings of music and speech; in both cases, the changes were almost completely undetectable. Notably, the researchers tested this with speech recognition software, not digital assistants, but the implications of the experiment are huge.
A 4-second clip of music came out as “okay google browse to evil dot com”
In one example, they took a 4-second clip of music which, when fed to the speech recognition software, came out as "okay google browse to evil dot com." They were able to do the same with speech, hiding "okay google browse to evil dot com" inside a recording of the phrase "without the dataset the article is useless."
In both cases, it's nearly impossible for humans to detect any differences between the two clips. The paper's authors note there is some "slight distortion" in the adulterated clips, but it's extremely difficult to discern. (You can listen to them for yourself here.)
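The underlying approach is what's known as an adversarial example attack: starting from the original clip, the attacker makes many tiny, iterative tweaks to the waveform, each one nudging the speech recognizer's output toward the target phrase while keeping the audible change as small as possible. Here's a minimal sketch of that loop, assuming access to a differentiable speech-to-text model; `model` and `transcription_loss` are placeholders, not the researchers' actual code.

```python
# Minimal sketch of a gradient-based adversarial audio attack, in the
# spirit of the paper: iteratively perturb a waveform until a speech
# recognizer transcribes a chosen target phrase, while clamping the
# perturbation so the change stays nearly inaudible. `model` and
# `transcription_loss` are hypothetical stand-ins for a differentiable
# speech-to-text model and its loss (e.g., a CTC-style loss).
import torch

def make_adversarial(audio, target_text, model, transcription_loss,
                     max_delta=0.01, steps=1000, lr=1e-3):
    # The perturbation is the only thing we optimize; the audio is fixed.
    delta = torch.zeros_like(audio, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        # Loss is low when the model transcribes the target phrase.
        logits = model(audio + delta)
        loss = transcription_loss(logits, target_text)
        loss.backward()
        optimizer.step()
        # Clamp the perturbation so the clip still sounds unchanged.
        with torch.no_grad():
            delta.clamp_(-max_delta, max_delta)

    return audio + delta.detach()
```

The key design point is that the attacker optimizes only the tiny perturbation, never the original audio, which is why the doctored clip remains almost indistinguishable from the original to a human listener.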
This research could have troubling implications for tech companies and the people who buy their assistant-enabled gadgets. In a world in which television commercials are already routinely triggering our smart speakers, it's not difficult to imagine pranksters or hackers using the technique to gain access to our assistants.
This is made all the more troubling by the growing trend of connecting these always-listening assistants to our home appliances and smart home gadgets. As The New York Times points out, pranksters and bad actors alike could use the technique to unlock our doors or siphon money from our bank accounts.
It's not difficult to imagine hackers using the technique to gain access to our assistants.
Tech companies, for their part, are aware of all this, and features like voice recognition are meant to combat some of the threat. Apple, Google, and Amazon told the Times their tech has built-in security features, but none of the companies provided specifics. (It's also worth pointing out that Apple's HomePod, Amazon's Echo, and the Google Home all have mute switches that prevent the speakers from listening for their "wake words," which would likely be a hacker's way in.)
It doesn't help that the latest research comes at a moment when many experts are raising questions about digital assistants. Earlier this week at Google's I/O developer conference, the company showed off a new tool, Duplex, which is able to make phone calls that sound just like an actual human.
Since the demo, many have questioned whether it's ethical for an AI to make such calls without disclosing that it's an AI. (Google says it's working on it.)
Now, we might have even more to worry about.