More technology will be opened, installed, and started up for the first time today than on almost any other day of the year (second, perhaps, only to yesterday). The day after Christmas, millions of people are trying out their Roombas, Amazon Echos, Google Nests, smart appliances, and more — introducing a new convenience and a new tech relationship into their households.
But how many of us actually consider the data collection we’re inviting into our homes?
In the fall of 2020, gig workers in Venezuela posted a series of images to online forums where they gathered to talk shop. The photos were mundane, if sometimes intimate, household scenes captured from low angles—including some you really wouldn’t want shared on the Internet.
In one particularly revealing shot, a young woman in a lavender T-shirt sits on the toilet, her shorts pulled down to mid-thigh.
The images were not taken by a person, but by development versions of iRobot’s Roomba J7 series robot vacuum. They were then sent to Scale AI, a startup that contracts workers around the world to label audio, photo, and video data used to train artificial intelligence.
They were the sorts of scenes that internet-connected devices regularly capture and send back to the cloud—though usually with stricter storage and access controls. Yet earlier this year, MIT Technology Review obtained 15 screenshots of these private photos, which had been posted to closed social media groups.
The photos vary in type and in sensitivity. The most intimate image we saw was the series of video stills featuring the young woman on the toilet, her face blocked in the lead image but unobscured in the grainy scroll of shots below. In another image, a boy who appears to be eight or nine years old, and whose face is clearly visible, is sprawled on his stomach across a hallway floor. A triangular flop of hair spills across his forehead as he stares, with apparent amusement, at the object recording him from just below eye level.
iRobot—the world’s largest vendor of robotic vacuums, which Amazon recently acquired for $1.7 billion in a pending deal—confirmed that these images were captured by its Roombas in 2020. All of them came from “special development robots with hardware and software modifications that are not and never were present on iRobot consumer products for purchase,” the company said in a statement. They were given to “paid collectors and employees” who signed written agreements acknowledging that they were sending data streams, including video, back to the company for training purposes. According to iRobot, the devices were labeled with a bright green sticker that read “video recording in progress,” and it was up to those paid data collectors to “remove anything they deem sensitive from any space the robot operates in, including children.” – MIT Tech Review
The article goes on to describe the sprawling data supply chain and the growing practice of sharing potentially sensitive data to train algorithms – a process that relies on human data annotation, and the one that ultimately produced the leaked images.
Although these were developmental products and not from the main line of Roombas, the episode is telling of the give-and-take relationship we have with our devices. We often ignore, forget, or simply don't know the data permissions we're granting developers. And we certainly aren't aware of how far that data can spread through commercial sales and partnerships.
Tech Free Zone at Home
Over the last decade, we've invited more and more smart devices – from washing machines to smart speakers – into our homes, all of which collect personal data to varying degrees. At a minimum, we each have a smartphone with us at all times collecting data, unless you're ahead of the game and use a dumbphone.
Our voices, our faces, our movements and behaviors are all out in the open for collection.
Often, we opt in simply by using the product, as noted in privacy policies with vague language that gives companies broad discretion in how they disseminate and analyze consumer information.
“It’s much easier for me to accept a cute little vacuum, you know, moving around my space [than] somebody walking around my house with a camera.” – MIT Tech Review
Now, more than ever, escaping invasive data collection is a real need.
I think it's wise to designate a Tech-Free Zone somewhere in your living space: a room or area that lets you completely escape technology (phone included). If that isn't reasonable or feasible, then you should at least have a way to completely shut down and disconnect from all tech.
Not only do Tech-Free Zones limit your exposure to data capture, but they also benefit mental health. Data capture and mental health are the two primary points of contention against the consumer tech movement.
Nothing (in tech) is truly free because you pay with your data. In this world, you have to pay extra for privacy. If we look far enough into the future, I believe public Tech-Free Zones will become a true luxury experience.