to user data when it is compelled by a judge’s order. “People are very scared,” Learned-Miller says. But he believes the fears are misplaced. “If a company like Facebook

really oversteps the bounds of what is ruled as acceptable by society … they could go out of business. If they break laws, then they can be shut down and people can be arrested.” He says that the suspicion stems

THE PRIVACY ARMS RACE

When your voice betrays you By David Shultz

“My voice is my password.” You may soon find yourself saying that—or perhaps you already do—when you call your bank or credit card company. Like a fingerprint or an iris scan, every voice is unique, and security companies have embraced voice recognition as a convenient new layer of authentication. But experts worry that voiceprints could be used to identify speakers without their consent, infringing on their privacy and freedom of speech.

Voiceprints are created by recording a segment of speech and analyzing the frequencies at which the sound is concentrated. Physical traits like the length of a speaker’s vocal tract or a missing tooth leave their mark on a voice, creating a unique spectral signature. Unlike a fingerprint, a voiceprint incorporates behavioral elements as well; traits like cadence, dialect, and accent easily distinguish, say, Christopher Walken from Morgan Freeman. Speech recognition systems, which aim to understand what is being said, minimize these differences, normalizing pitch and overlooking pauses and accents. But for identifying a unique individual, the disparities are crucial.

Because voiceprint systems typically have the user repeat a standard phrase, identity thieves could theoretically record such phrases and play them back to fool the technology. The systems are designed to detect recordings or synthesized speech, however. An even safer alternative is to ask the customer to repeat a randomly chosen bit of text. “The system will
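The idea of a spectral signature can be illustrated with a toy example. The sketch below is not how any commercial system works; it simply averages the magnitude spectrum of short audio frames into a crude "voiceprint" vector and compares two such vectors with cosine similarity. The function names (`voiceprint`, `similarity`) and the synthetic "voices" are invented for illustration; real systems use far richer features and statistical speaker models.

```python
import numpy as np

def voiceprint(samples: np.ndarray, frame_len: int = 512) -> np.ndarray:
    """Crude spectral signature: average magnitude spectrum over fixed frames."""
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    spectra = np.abs(np.fft.rfft(frames, axis=1))  # per-frame magnitude spectrum
    sig = spectra.mean(axis=0)                     # average across frames
    return sig / (np.linalg.norm(sig) + 1e-12)     # unit-normalize for comparison

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two unit-normalized voiceprints."""
    return float(np.dot(a, b))

# Toy "voices": sums of sinusoids at different frequencies, standing in for
# the resonances a vocal tract imposes, plus a little noise.
rng = np.random.default_rng(0)
t = np.arange(16000) / 16000.0
voice_a  = np.sin(2*np.pi*120*t) + 0.5*np.sin(2*np.pi*700*t)  + 0.01*rng.standard_normal(t.size)
voice_a2 = np.sin(2*np.pi*120*t) + 0.5*np.sin(2*np.pi*700*t)  + 0.01*rng.standard_normal(t.size)
voice_b  = np.sin(2*np.pi*200*t) + 0.5*np.sin(2*np.pi*1200*t) + 0.01*rng.standard_normal(t.size)

pa, pa2, pb = voiceprint(voice_a), voiceprint(voice_a2), voiceprint(voice_b)
# Two recordings of the "same speaker" score higher than different speakers.
assert similarity(pa, pa2) > similarity(pa, pb)
```

Note that the comparison works even though the two "recordings" of voice A differ sample by sample: the spectral signature captures which frequencies dominate, not the exact waveform.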


prompt the user, ‘Now say this phrase,’” says Vlad Sejnoha, the chief technology officer at Nuance Communications Inc. in Burlington, Massachusetts, an industry leader in voice recognition technology. “It’s hard to come prepared with all possible recordings.” Some systems require no pass phrase at all but rather analyze a person’s voice by listening in the background—for instance, as they talk to a sales representative—and compare it with a stored voiceprint.

Demand for voiceprint authentication is skyrocketing. Nuance Director Brett Beranek says the company has logged more than 35 million unique voiceprints in the past 24 months, compared with only 10 million over the previous 10 years.

But massive voiceprint databases could make anonymity a scarcer commodity. “Like other biometrics, voiceprint technology does raise privacy issues, because it gives companies and the government the ability to identify people even without their knowledge,” says Jennifer Lynch, an attorney at the Electronic Frontier Foundation in San Francisco, California, specializing in biometrics. “That does create a challenge to anonymous speech protection” as enshrined in the United States’ First Amendment, she says.

How and when voiceprints can be captured legally is murky at best. Many countries have legislation regulating wiretapping, but voice recognition adds a major new dimension that most lawmakers have yet to consider, Lynch says. If the past is any guide, companies have massive financial incentives to track consumers’ movements and habits. Recognizing someone as soon as they pick up the phone or approach a cashier will open up marketing opportunities—as well as ease transactions for the consumer. As with many new authentication technologies, the balance between convenience and privacy has yet to be struck. ■
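The text-prompted challenge Sejnoha describes can be sketched in a few lines. This is a hypothetical flow, not Nuance’s implementation: `make_challenge`, `verify`, the word list, and the `match_score` parameter (standing in for a separately computed voiceprint score) are all invented for illustration. The point is that a fresh random phrase defeats a replayed recording of any fixed pass phrase, because the attacker cannot have the right words on tape.

```python
import random

WORDS = ["river", "sixteen", "orange", "window", "basalt", "ninety"]

def make_challenge(n_words: int = 4, seed=None) -> str:
    """Pick a fresh random phrase so recordings can't be prepared in advance."""
    rng = random.Random(seed)
    return " ".join(rng.sample(WORDS, n_words))

def verify(spoken_text: str, challenge: str, match_score: float,
           threshold: float = 0.8) -> bool:
    """Accept only if the caller said the prompted phrase AND the
    voiceprint score (computed elsewhere) clears the threshold."""
    said_right_words = spoken_text.strip().lower() == challenge.lower()
    return said_right_words and match_score >= threshold

challenge = make_challenge(seed=42)
# A replayed recording of an old phrase fails the transcript check
# even if it carries a perfect voiceprint score.
assert not verify("my voice is my password", challenge, match_score=0.99)
```

In practice both checks run on the same audio: a speech recognizer confirms the prompted words were spoken, while the speaker-recognition model scores who spoke them.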

from the lack of transparency. Whereas academic researchers must obtain explicit consent from people to use private data for research, those who click “agree” on Facebook’s end-user license agreement (EULA) grant the company permission to use their data with few strings attached. Such online contracts “are the antithesis of transparency,” Learned-Miller says. “No one really knows what they’re getting into.” Last year, the company introduced a friendly looking dinosaur cartoon that pops up on the screen and occasionally reminds users of their privacy settings, and it boiled down the EULA language from 9000 words to 2700.

There is already a bustling trade in private data—some of it legal, some not—and facial identity will become another hot commodity, Iowa State’s Mennecke predicts. For example, facial IDs could allow advertisers to follow and profile you wherever there’s a camera—enabling them to cater to your preferences or even offer different prices depending on what they know about your shopping habits or demographics. But what “freaks people out,” Mennecke says, “is the idea that some stranger on the street can pick you out of a crowd. … [You] can’t realistically evade facial recognition.”

FacialNetwork, a U.S. company, is using its own deep learning system to develop an app called NameTag that identifies faces with a smart phone or a wearable device like Google Glass. NameTag reveals not only a person’s name, but also whatever else can be discovered from social media, dating websites, and criminal databases. Facebook moved fast to contain the scandal; it sent FacialNetwork a cease-and-desist letter to stop it from harvesting user information. “We don’t provide this kind of information to other companies, and we don’t have any plans to do so in the future,” a Facebook representative told Science by e-mail.
The potential commercial applications of better facial recognition are “troublesome,” Learned-Miller says, but he worries more about how governments could abuse the technology. “I’m 100% pro–Edward Snowden,” Learned-Miller says, referring to the former National Security Agency contractor who in 2013 divulged the U.S. government’s massive surveillance of e-mail and phone records of U.S. citizens (see p. 495). “We have to be vigilant,” he says. Learned-Miller’s sentiment is striking, considering that he is funded in part by the U.S. Intelligence Advanced Research Projects Activity to develop a classified facial recognition project called Janus. Perhaps that’s all the more reason to take his warning seriously. ■

sciencemag.org SCIENCE

30 JANUARY 2015 • VOL 347 ISSUE 6221

Published by AAAS


THE END OF PRIVACY

IMAGE: WILLIAM DUKE

SPECIAL SECTION
