March 5, 2024 at 11:45 am

Vending Machine Error Reveals Facial Recognition Watching Customers While They Buy

by Trisha Leigh

Listen, way too many stories in the news these days seem like they would be more at home in a science fiction novel (not the fun kind) than in real life.

I would say a creepy vending machine error fits right into the way things are going.

It was a vending machine on the University of Waterloo campus in Ontario, Canada, that displayed a message revealing a hidden feature, one that put students on edge about their privacy.

Vending machines, like other appliances, have gotten “smarter” and smarter over the decades, so maybe it shouldn’t be a surprise that they’re essentially watching us.

This one was an M&M-branded machine that – apparently – employs facial recognition.

Source: SquidKid47 via Reddit

“We wouldn’t have known if it weren’t for the application error. There’s no warning here,” said one of the students, River Stanley.

Students reported their concerns and, in the meantime, began to cover the tiny camera hole with gum and sticky notes.

The machine was manufactured by Invenda and branded for MARS (which makes M&M’s, among other things). Invenda’s “intelligent” vending machines come installed with a “demographic sensor”.

This sensor is meant to count customers and estimate their age and gender, so the company can profile which demographics are most likely to use the machine.


Another student agreed that this was not an issue they ever expected to confront.

“I’m kind of shocked just because it’s a vending machine, and I don’t really think they need to be taking facial recognition.”

Invenda, though, says they’re not breaking any data privacy laws.

“The demographic detection software integrated into the smart vending machine operates entirely locally. It does not engage in storage, communication, or transmission of any imagery or personally identifiable information.”


For the students, though, the question is whether the company should be including this tech at all, not whether it’s technically allowed to.

The University is responding to their concerns with decisive action.

“The university has asked that these machines be removed from campus as soon as possible. In the meantime, we’ve asked that the software be disabled.”

I mean, ok, but you only have their word for it.

Personally, I’d still be bringing gum or post-it notes to cover the camera until it was gone for good.

If you enjoyed that story, check out what happened when a guy gave ChatGPT $100 to make as much money as possible, and it turned out exactly how you would expect.