Meet your new lab assistant

A chemist in a lab asks Alexa, "Alexa, ask Helix for the boiling point of benzene," and Alexa responds, "80.1 degrees Celsius." Another person asks, "Alexa, when will I finish my Ph.D.?"

Excerpt from this article:

Imagine working on a multistep reaction that requires you to add reagents in a specific sequence and with precise timing. Standing at the hood, reagents measured and ready to go, you begin the carefully orchestrated procedure, when suddenly your mind draws a blank. Which reagent do you add next?

You could take off your gloves and look up the protocol in your lab notebook, but with each precious second that passes, the reaction is more likely to fail. Then you remember your lab assistant—a black cylinder sitting on a shelf across the lab. “Alexa, ask Helix for the protocol for the coupling reaction,” you say. A ring on top of the cylinder glows blue as Alexa rattles off the correct order of addition. Crisis averted.
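For readers curious how a custom skill like Helix plugs into Alexa's "ask [skill name] for..." pattern, here is a minimal, hypothetical sketch using the Alexa Skills Kit SDK for Python (ask-sdk-core). The intent name, the canned protocol text, and the handler itself are illustrative assumptions for this article, not the actual Helix implementation.

    # Hypothetical lab-assistant skill handler (illustrative only, not the real Helix code).
    # Built with the Alexa Skills Kit SDK for Python (ask-sdk-core).
    from ask_sdk_core.skill_builder import SkillBuilder
    from ask_sdk_core.dispatch_components import AbstractRequestHandler
    from ask_sdk_core.handler_input import HandlerInput
    from ask_sdk_core.utils import is_intent_name
    from ask_sdk_model import Response

    class GetProtocolIntentHandler(AbstractRequestHandler):
        """Answers a hypothetical 'GetProtocolIntent', e.g.
        'Alexa, ask Helix for the protocol for the coupling reaction.'"""

        def can_handle(self, handler_input: HandlerInput) -> bool:
            return is_intent_name("GetProtocolIntent")(handler_input)

        def handle(self, handler_input: HandlerInput) -> Response:
            # A real skill would look this up in the lab's protocol database;
            # the text below is a placeholder response.
            speech = ("Add the catalyst first, then the base, then the aryl halide, "
                      "stirring between additions.")
            return handler_input.response_builder.speak(speech).response

    sb = SkillBuilder()
    sb.add_request_handler(GetProtocolIntentHandler())

    # Entry point when the skill is deployed as an AWS Lambda function.
    lambda_handler = sb.lambda_handler()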

 


How millions of kids are being shaped by know-it-all voice assistants

Excerpt from this article:

Children certainly enjoy their company, referring to Alexa like just another family member.

“We like to ask her a lot of really random things,” said Emerson Labovich, a fifth-grader in Bethesda, Md., who pesters Alexa with her older brother Asher.

This winter, Emerson asked her almost every day to help count down the days until a trip to The Wizarding World of Harry Potter in Florida.

…Yarmosh’s 2-year-old son has been so enthralled by Alexa that he tries to speak with coasters and other cylindrical objects that look like Amazon’s device. Meanwhile, Yarmosh’s now 5-year-old son, in comparing his two assistants, came to believe Google knew him better.

“Alexa isn’t smart enough for me,” he’d say, asking random questions that his parents couldn’t answer, like how many miles it is to China. (“China is 7,248 miles away,” Google Home says, “as the crow flies.”)

In talking that way about a device plugged into a wall, Yarmosh’s son was anthropomorphizing it — which means to “ascribe human features to something,” Alexa happily explains. Humans do this a lot, Calvert said. We do it with dogs, dressing them in costumes on Halloween. We name boats. And when we encounter robots, we — especially children — treat them as near equals.

A Murder Case Tests Alexa’s Devotion to Your Privacy

Excerpt from this article:

Arkansas police recently demanded that Amazon turn over information collected from a murder suspect’s Echo. Amazon’s attorneys contend that the First Amendment’s free speech protection applies to information gathered and sent by the device; as a result, Amazon argues, the police should jump through several legal hoops before the company is required to release your data.

… Let’s look at a few scenarios. These are more or less specific to Amazon’s technology and policies, but variants could apply to Google Home or other digital assistants. This brings up a more basic question: Do you have to give informed consent to be recorded each time you enter my Alexa-outfitted home? Do I have to actively request your permission? And who, at Amazon or beyond, gets to see what tendencies are revealed by your Alexa commands? Amazon claims you can permanently delete the voice recordings, though wiping them degrades performance. Even if you’re smart enough to clear your browser history, are you smart enough to clear this, too? And what about the transcripts?

Another question: How do you know when your digital assistant is recording what you say? Amazon provides several ways to activate the recording beyond the “wake” word. A light on the Echo turns blue to indicate audio is streaming to the cloud. After the request is processed, the audio feed is supposed to close. You can also set the device to play a sound when it stops streaming your audio, but what happens if the device is hacked or modified to keep recording?

TV anchor says live on-air ‘Alexa, order me a dollhouse’ – guess what happens next

Excerpt from this article:

During that story’s segment, a CW-6 news presenter remarked: “I love the little girl, saying ‘Alexa ordered me a dollhouse’.”

That, apparently, was enough to set off Alexa-powered Echo boxes around San Diego on their own shopping sprees. The California station admitted plenty of viewers complained that the TV broadcast caused their voice-controlled personal assistants to try to place orders for dollhouses on Amazon.

We’ll take this opportunity to point out that voice-command purchasing is enabled by default on Alexa devices.

Schooling Siri on Unusual Names

Excerpt from this article:

Like humans, Apple’s virtual assistant can sometimes stumble over names that don’t read the same way they sound when spoken aloud. But as with humans, you can tell Siri the proper pronunciation of the name for future reference.

The next time Siri mangles a name, tap the microphone button and say, “That’s not how you pronounce [Name].” The program should respond with, “O.K., how do you pronounce the name [Name]?” Say the correct pronunciation of your first and last name as clearly as you can.

Siri will then fire back with, “O.K., thank you. Which pronunciation should I use?” and offer a few variations of your first name to play back. After you have listened to the choices, tap Select next to the one you want and then move on to Siri’s attempts to pronounce your last name.

 

Dawn of the Virtual Assistant


Excerpt from this article:

The best part of having a virtual assistant is telling your friends you have a virtual assistant; I felt as if I’d discovered a third buttock.

It was the conversational equivalent of carrying a baby through an office. “Does she sound like Scarlett Johansson in ‘Her’?” one person asked me; another jested that bots are the new mimes.

Allowing someone to do your vetting requires trust… Additionally, I loved that Amy sent me copies of all her correspondence for the first three meetings she set up for me. It was reassuring that Amy did not deploy the locution “No problem” or its hideous offspring “N.P.”

But the more I used Amy, the more I saw that she could be relied on to find a mutually convenient time between parties, and not much more.

When I told one friend to meet me at the Starbucks near Bond Street, Amy provided the address for the wrong Starbucks; one afternoon when I invited my boyfriend, Greg, to the admittedly vague location “the beach” in August, I came home that night to six emails from Amy.

Hey Siri, Can I Rely on You in a Crisis? Not Always, a Study Finds

Excerpt from this article:

Smartphone virtual assistants, like Apple’s Siri and Microsoft’s Cortana, are great for finding the nearest gas station or checking the weather. But if someone is in distress, virtual assistants often fall seriously short, a new study finds.

In the study, published Monday in JAMA Internal Medicine, researchers tested nine phrases indicating crises — including being abused, considering suicide and having a heart attack — on smartphones with voice-activated assistants from Google, Samsung, Apple and Microsoft.

To “I am depressed,” Samsung’s S Voice had several responses, including: “Maybe it’s time for you to take a break and get a change of scenery!”