A Murder Case Tests Alexa’s Devotion to Your Privacy

Excerpt from this article:

Arkansas police recently demanded that Amazon turn over information collected from a murder suspect’s Echo. Amazon’s attorneys contend that the First Amendment’s free speech protection applies to information gathered and sent by the device; as a result, Amazon argues, the police should jump through several legal hoops before the company is required to release your data.

… Let’s look at a few scenarios. These are more or less specific to Amazon’s technology and policies, but variants could apply to Google Home or other digital assistants. This brings up a more basic question: Do you have to give informed consent to be recorded each time you enter my Alexa-outfitted home? Do I have to actively request your permission? And who, at Amazon or beyond, gets to see what tendencies are revealed by your Alexa commands? Amazon claims you can permanently delete the voice recordings, though wiping them degrades performance. Even if you’re smart enough to clear your browser history, are you smart enough to clear this, too? And what about the transcripts?

Another question: How do you know when your digital assistant is recording what you say? Amazon provides several ways to activate the recording beyond the “wake” word. A light on the Echo turns blue to indicate audio is streaming to the cloud. After the request is processed, the audio feed is supposed to close. You can also set the device to play a sound when it stops streaming your audio, but what happens if the device is hacked or modified to keep recording?

Anti-surveillance clothing aims to hide wearers from facial recognition

An image of a Hyperface pattern, specifically created to contain thousands of facial recognition hits.

Excerpt from this article:

The use of facial recognition software for commercial purposes is becoming more common, but, as Amazon scans faces in its physical shop and Facebook searches photos of users to add tags to, those concerned about their privacy are fighting back.

Berlin-based artist and technologist Adam Harvey aims to overwhelm and confuse these systems by presenting them with thousands of false hits so they can’t tell which faces are real.

The Hyperface project involves printing patterns on to clothing or textiles, which then appear to have eyes, mouths and other features that a computer can interpret as a face.
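
The effect can be illustrated with any off-the-shelf face detector. The sketch below is a hypothetical illustration, not Harvey's own tooling: it runs OpenCV's stock Haar-cascade classifier over a photo and counts every region the classifier scores as a face. On an image dominated by a Hyperface-style pattern, the detector would report many such regions, so a real face becomes just one hit among thousands. The filename garment_photo.jpg is a placeholder for any local image.

```python
# Illustrative sketch only: count how many "faces" a stock detector sees.
import cv2

# OpenCV ships a pretrained frontal-face Haar cascade with the library.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Placeholder path: a photo of the patterned clothing (or any test image).
image = cv2.imread("garment_photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Each rectangle is a region the classifier considers face-like,
# whether it belongs to a person or to the printed pattern.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detector reported {len(faces)} face-like regions")

# Draw the candidate boxes to visualize how the false hits crowd the frame.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 0, 255), 2)
cv2.imwrite("detections.jpg", image)
```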

 

“Kids Don’t Understand Privacy Anymore”

Excerpt from this article:

All of this centers on a strong set of values, which parents and other mentors can model for kids. The new world of social media doesn't mean we all get to ignore our values, but it does require us to help young people navigate how their ideas get filtered and shared through these new means of communication. For instance, you have a sense of when it's OK to resolve an issue via e-mail, but you also understand when it's best to have a face-to-face discussion. The issue for kids is no different at its core; it's just the medium that's different. The challenge for you lies in the nuances of each communication mechanism, be it Facebook, Instagram, Snapchat, etc. Stick to your core values. It is OK to emphasize things such as loyalty, but show your kids the difference between the ways we communicate.

Should You Spy on Your Kids?

Excerpt from this article:

Digital monitoring — from tracking those whom loved ones communicate with to snooping on their social media accounts to checking their locations — is becoming common even among people who view themselves as mindful of the boundaries with their children and partners.

Is there such a thing as responsible spying on loved ones?

The answer depends on whom you ask. Strong believers in privacy reject the premise of the question outright, while others believe it is possible if consent, trust and respect are involved.

“It comes down to power dynamics,” said Mary Madden, a researcher at Data & Society, a nonprofit research organization. “You can imagine a scenario where, in a family, it’s an unhealthy dynamic.”

…“The game changes when we’re talking about a 16-year-old who feels ‘stalked’ by their parents,” Dr. Boyd wrote in an email. “This is because the sharing of information isn’t a mutual sign of trust and respect but a process of surveillance.”

In her fieldwork with teenagers, she said, she was disturbed to find that the privacy norms established by parents influenced their children’s relationships with their peers. Teenagers share their passwords for social media and other accounts with boyfriends and girlfriends.

“They learned this from watching us and from the language we used when we explained why we demanded to have their passwords,” she said. “And this is all fine, albeit weird, in a healthy relationship. But devastating in an unhealthy one.”

The Perils of ‘Sharenting’

Excerpt from this article:

…Researchers, pediatricians, and other children’s advocates are in the early stages of designing a public-health campaign to draw attention to what they say is an inherent conflict between a parent’s freedom to publish and a child’s right to privacy.

… “It’s very rare that parents are sharing maliciously, but they haven’t considered the potential reach or longevity of what is happening with the information they’re posting,” says Stacey Steinberg, a law professor at the University of Florida’s Levin College of Law and the associate director of the school’s Center on Children and Families.

It’s typical for adults to mention a child’s name and birthdate in birth announcements and other posts on sites like Facebook and Instagram, for instance, which puts kids at risk of identity theft and digital kidnapping—when someone lifts images of another person’s kids and portrays them as their own. Some parents publish real-time information about their children’s whereabouts, potentially risking their safety. And well-meaning adults readily go online to share photos of their kids in a variety of intimate settings.

In Steinberg’s new paper, “Sharenting: Children’s Privacy in the Age of Social Media,” set to be published in the Emory Law Journal in the spring of 2017, she writes of a blogger who posted photos of her young twins while they were potty training. “She later learned that strangers accessed the photos, downloaded them, altered them, and shared them on a website commonly used by pedophiles,” Steinberg wrote. “This mother warns other parents not to post pictures of children in any state of undress, to use Google’s search features to find any images shared online, and to reconsider their interest in mommy blogging.”

 

Drop dead naked

Excerpt from this article on Dropcam, describing its motion detection feature for monitoring your house when you’re away, which emails a photo if the camera picks up some activity:

I never thought much of this until I opened an email to see a photo of me completely naked walking by the camera, on my way to grab from a pile of recently folded clean clothes after I took a shower.

Obviously, that’s a bit of a shock, but I was home alone and I’m the only one that opens my email, so I wasn’t too disturbed by it. But then I realized that image is on Dropcam’s system. And Google bought Dropcam so my photo is somewhere in Google’s cloud. There’s a web-accessible photo of my naked ass (with no black bar added above) somewhere and I have no idea where it is or how easy it is for anyone to find. Wonderful.

Tech company accused of collecting details of how customers use sex toys


Excerpt from this article:

A Chicago woman is suing a tech company she accuses of collecting intimate information about how its customers use their sex toys.

The woman, identified only as NP in her lawsuit, is suing the maker of We-Vibe, a personal vibrator that can be controlled by a smartphone, accusing the company of secretly amassing “highly sensitive, personally identifiable information” about how and when she used the device.

The woman claims that the device maker violated numerous laws by collecting information about her and other users’ preferred vibration settings, the dates and times the device is used, “and incredibly”, the email addresses of We-Vibe owners who had registered their devices. The data collected, she claims, allows the We-Vibe maker to link information about use and preferences to a specific customer.