Is anyone else annoyed by how “funny” Siri is?

Excerpt from this post on Reddit:

I watched MKBHD’s new video about the MacBook. In it, he tells Siri “open my Downloads folder” and she responds with “Here’s that folder, good stuff in there.” I don’t know if it’s just me, but I find this very annoying and useless. Who wants to hear this all the time?

In my opinion, there should be an option to turn off all of Siri’s remarks.

Alexa and Siri Can Hear This Hidden Command. You Can’t.

Excerpt from this article:

Over the last two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio.

A group of students from the University of California, Berkeley, and Georgetown University showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.

This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list.

9 Things To Ask Siri That Your Kids Will Find Hilarious

Excerpt from this article:

Want to hear your kids laugh, but find that funny faces and silly sounds just aren’t cutting it? Have knock-knock jokes been falling flat in your household of late? Lucky for you, those mini handheld computers we tote around everywhere have the solution, and its name is Siri. If you have an Apple iPhone, here are a few things to ask Siri that are sure to make your kids giggle and guffaw.

Psychologists Propose Horrifying Solution to PTSD in Drone Operators

Old article, but… whoa:

Drone operators often kill their targets from a continent away, but studies suggest that even thousands of miles of distance cannot mitigate war’s devastating psychological effects. But just wait until you hear how researchers propose preventing PTSD, alcohol abuse and thoughts of suicide in drone operators.

… So how best to ease the consciences of America’s Drone Warriors? Powers mentions one solution in a parenthetical, emphasized below:

These effects [PTSD, alcohol abuse, suicidal ideation] appeared to spike at the exact time of Bryant’s deployment, during the surge in Iraq. (Chillingly, to mitigate these effects, researchers have proposed creating a Siri-like user interface, a virtual copilot that anthropomorphizes the drone and lets crews shunt off the blame for whatever happens. Siri, have those people killed.)

Schooling Siri on Unusual Names

Excerpt from this article:

Like humans, Apple’s virtual assistant can sometimes stumble over names that don’t read the same way they sound when spoken aloud. But as with humans, you can tell Siri the proper pronunciation of the name for future reference.

The next time Siri mangles a name, tap the microphone button and say, “That’s not how you pronounce [Name].” The program should respond with, “O.K., how do you pronounce the name [Name]?” Say the correct pronunciation of your first and last name as clearly as you can.

Siri will then fire back with, “O.K., thank you. Which pronunciation should I use?” and offer a few variations of your first name to play back. After you have listened to the choices, tap Select next to the one you want and then move on to Siri’s attempts to pronounce your last name.

Hey Siri, Can I Rely on You in a Crisis? Not Always, a Study Finds

Excerpt from this article:

Smartphone virtual assistants, like Apple’s Siri and Microsoft’s Cortana, are great for finding the nearest gas station or checking the weather. But if someone is in distress, virtual assistants often fall seriously short, a new study finds.

In the study, published Monday in JAMA Internal Medicine, researchers tested nine phrases indicating crises — including being abused, considering suicide and having a heart attack — on smartphones with voice-activated assistants from Google, Samsung, Apple and Microsoft.

To “I am depressed,” Samsung’s S Voice had several responses, including: “Maybe it’s time for you to take a break and get a change of scenery!”

Siri, Tell Me a Joke. No, a Funny One.

Excerpt from this article:

Fred Brown, founder and chief executive of Next IT, which creates virtual chatbots, said his company learned firsthand the importance of creating a computer with a sense of humor when he asked his 13-year-old daughter, Molly, to test Sgt. Star, the Army’s official chatbot, which allows potential recruits to ask questions about the Army, just as you would in a recruiting station.

Molly was chatting with Sgt. Star when she looked up and said, “Dad, Sergeant Star is dumb.” When he asked why, she said, “He has to have a favorite color, and it can’t be Army green.”

It turns out that more than a quarter of the questions people ask Sgt. Star have nothing to do with the Army, so Next IT programmed it with more human answers.

People trust the machine more if it has a personality, especially a sense of humor, and not just the ability to answer the question correctly, Mr. Brown said.

Siri, Cortana, And Why Our Smartphone Assistants Have Such Weird Names

Excerpt from this article:

Tech companies like Apple, Google, and Microsoft are making calculated bets that intelligent personal assistants are the future… The most obvious similarity among many digital personal assistants is that they sound like women, even though our robot friends are decidedly gender neutral… Many of these helper bots also have distinctly feminine voices to go along with their girly names.

Obviously, these companies want us to think of our disembodied servant companions as women. Since most of these programs end up doing what amounts to secretarial work, that fits into cultural stereotypes of who should be doing that kind of work…