Facebook’s ’10 Year Challenge’ Is Just a Harmless Meme—Right?

Excerpt from this article:

Imagine that you wanted to train a facial recognition algorithm on age-related characteristics and, more specifically, on age progression (e.g., how people are likely to look as they get older). Ideally, you’d want a broad and rigorous dataset with lots of people’s pictures. It would help if you knew they were taken a fixed number of years apart—say, 10 years.

Sure, you could mine Facebook for profile pictures and look at posting dates or EXIF data. But that whole set of profile pictures could end up generating a lot of useless noise. People don’t reliably upload pictures in chronological order, and it’s not uncommon for users to post pictures of something other than themselves as a profile picture. A quick glance through my Facebook friends’ profile pictures shows a friend’s dog who just died, several cartoons, word images, abstract patterns, and more.

In other words, it would help if you had a clean, simple, helpfully labeled set of then-and-now photos.
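
To make the thought experiment concrete, here is a minimal sketch of the kind of data-preparation step the excerpt describes: reading capture dates from image EXIF metadata and pairing photos of the same person taken roughly ten years apart. The folder layout, filenames, and one-year tolerance are illustrative assumptions only, not anything Facebook is known to do.

```python
# Sketch: pair "then" and "now" photos roughly 10 years apart using EXIF dates.
# Assumes a hypothetical directory per person containing JPEGs with EXIF dates set.
from datetime import datetime
from pathlib import Path
from PIL import Image, ExifTags

# EXIF tag 306 ("DateTime") is returned by Image.getexif() at the top level.
DATE_TAG = next(k for k, v in ExifTags.TAGS.items() if v == "DateTime")

def photo_date(path: Path):
    """Return the capture date from EXIF, or None if missing or malformed."""
    raw = Image.open(path).getexif().get(DATE_TAG)
    if raw is None:
        return None
    try:
        return datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")
    except ValueError:
        return None

def then_and_now_pairs(person_dir: Path, gap_years: float = 10, tolerance: float = 1.0):
    """Yield (older, newer) photo pairs whose capture dates are ~gap_years apart."""
    dated = sorted(
        (d, p) for p in person_dir.glob("*.jpg") if (d := photo_date(p)) is not None
    )
    for i, (d1, p1) in enumerate(dated):
        for d2, p2 in dated[i + 1:]:
            gap = (d2 - d1).days / 365.25
            if abs(gap - gap_years) <= tolerance:
                yield p1, p2

# Example: list candidate pairs for one (hypothetical) person's folder.
for old, new in then_and_now_pairs(Path("photos/person_001")):
    print(old.name, "->", new.name)
```

As the excerpt notes, EXIF dates and posting order are noisy; a clean, labeled then-and-now meme removes exactly this filtering work.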

American Airlines is offering biometric boarding at LAX Terminal 4

Excerpt from this article:

The program will let American test whether passengers like and are willing to use facial recognition to expedite the boarding process and ensure that the technology meets Customs and Border Protection requirements.

Gemalto’s tech can replace boring old paper boarding passes by instantly matching passenger faces against the pre-populated Department of Homeland Security database.
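
The article does not describe Gemalto's actual pipeline, but one-to-many face matching against a pre-enrolled gallery is commonly sketched as a nearest-neighbour search over face embeddings. The embeddings, passenger IDs, and similarity threshold below are hypothetical stand-ins, not details from the deployed system.

```python
# Illustrative sketch of one-to-many face matching against a pre-enrolled gallery.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """Return the best-matching enrolled identity, or None below the threshold."""
    best_id, best_score = None, -1.0
    for passenger_id, enrolled in gallery.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = passenger_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Toy usage with random vectors standing in for face embeddings.
rng = np.random.default_rng(0)
gallery = {f"passenger_{i}": rng.normal(size=128) for i in range(3)}
probe = gallery["passenger_1"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(identify(probe, gallery))
```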

Computer programs recognise white men better than black women

Excerpt from this article:

Joy Buolamwini of the Massachusetts Institute of Technology will present work which suggests it is true.

Ms Buolamwini and her colleague Timnit Gebru looked at three sex-recognition systems, those of IBM, Microsoft and Face++. They tested these on a set of 1,270 photographs of parliamentarians from around the world and found that all three classified lighter faces more accurately than darker ones. All also classified males more accurately than females. IBM’s algorithm, for example, got light male faces wrong just 0.3% of the time. That compared with 34.7% of the time for dark female faces. The other two systems had similar gulfs in their performances. Probably, this bias arises from the sets of data the firms concerned used to train their software. Ms Buolamwini and Ms Gebru could not, however, test this because those data sets are closely guarded.

IBM has responded quickly. It said it had retrained its system on a new data set for the past year, and that this had greatly improved its accuracy.
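
For readers curious how such disaggregated figures are produced, here is a small sketch of computing error rates per subgroup from labelled predictions. The records below are made up for illustration and bear no relation to the study's actual data.

```python
# Sketch: disaggregate classifier error rate by subgroup, in the spirit of the
# study's light-male vs dark-female comparison. The records below are made up.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        errors[group] += int(truth != pred)
    return {g: errors[g] / totals[g] for g in totals}

records = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "female", "male"),   # misclassification
    ("darker_female", "female", "female"),
]
for group, rate in error_rates_by_group(records).items():
    print(f"{group}: {rate:.1%} error")
```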

Anti-surveillance clothing aims to hide wearers from facial recognition

An image of a Hyperface pattern, specifically created to contain thousands of facial recognition hits.

Excerpt from this article:

The use of facial recognition software for commercial purposes is becoming more common, but, as Amazon scans faces in its physical shop and Facebook searches photos of users to add tags to, those concerned about their privacy are fighting back.

Berlin-based artist and technologist Adam Harvey aims to overwhelm and confuse these systems by presenting them with thousands of false hits so they can’t tell which faces are real.

The Hyperface project involves printing patterns on to clothing or textiles, which then appear to have eyes, mouths and other features that a computer can interpret as a face.
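
A rough way to see why this might work is to count how many "faces" an off-the-shelf detector reports in an image. The sketch below uses OpenCV's stock Haar-cascade frontal-face detector as a stand-in for whatever detector a real system uses, and the image path is a placeholder; a Hyperface-style pattern aims to drive this count up so the wearer's real face is buried among false hits.

```python
# Sketch: count how many "faces" a stock detector finds in an image.
# Detector choice and image path are assumptions for illustration only.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def count_face_hits(image_path: str) -> int:
    """Run the Haar-cascade face detector and return the number of detections."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(detections)

print(count_face_hits("hyperface_pattern.jpg"))  # hypothetical sample image
```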