Click, click, click, America loves its Amazon Prime delivery, right? With the press of a button, subscribers have little knickknacks and home goods at their doorsteps, sometimes the same day. Is it all worth it, though? The delivery job is reportedly tough, drivers are often independent contractors with little protection or job security, and now Amazon has integrated what is arguably a biometric surveillance system into its delivery process. After months of use, some drivers are reportedly unhappy with what they see as unfair and inaccurate technology.

In February, Vice reported that Amazon planned to install AI-powered, machine-learning cameras in all of its Amazon-branded delivery vans. These cameras monitor performance and can creepily watch drivers’ faces, all in the name of safety and security. If drivers didn’t sign the “biometric consent” form, they reportedly would lose their jobs. Now, months later, Vice has checked in on Amazon’s new policy, and workers predictably say it doesn’t even work right.

“‘Maintain safe distance,’ the camera installed above [the driver’s] seat would say when a car cut him off,” the Vice report detailed. “That data would be sent to Amazon and would be used to evaluate his performance that week and determine whether he got a bonus.”

Oh joy, worker performance is now directly tied to the camera’s bad data. According to the workers interviewed for the story, the AI monitoring system often flags glances away from the camera, and events outside a driver’s control, as violations. These infractions damage drivers’ company safety ratings and can lock them out of bonuses and prizes they might have received before the new AI camera system.

Amazon claims accidents are way down in trucks that have the cameras installed, but at what cost? Amazon also claims to have a team that can manually review camera events so as not to penalize drivers unfairly, but workers are doubtful that Amazon has the manpower or the desire to investigate and mitigate every erroneous claim. One delivery company that acts as a go-between for Amazon and the drivers told Vice it has reached out about drivers’ disputes and concerns, but Amazon never responded to the inquiries.

As CarBibles’ resident ex-rideshare driver, I have a bit of experience with these automated monitoring systems. Both of the most popular rideshare apps track GPS location and approximate driving behavior via your phone. Initially, I thought most of that data was relatively benign. Before I stopped driving, though, both companies instituted facial-recognition-style driver verification, all in the name of safety.

Did it work? I’m not sure. Any passenger complaint could, in theory, be correlated with whatever data either rideshare company had been collecting on you. As these frustrated Amazon drivers have realized, that data may lack context. Was I speeding, or simply keeping up with traffic? Did I slam on the brakes, or did someone cut me off? In my more than 9,000 rides as a rideshare driver, I received a few “unsafe driving” complaints, after which the company threatened to deactivate (fire) me. Yet I was given no context for what theoretical violation I had committed. I am a safe driver, and I follow all traffic laws, usually.

Like rideshare drivers, most Amazon delivery drivers are independent contractors with little protection from being fired without just cause. A crappy AI shouldn’t be the reason someone loses the ability to put food on the table or collect the money they’re rightfully owed. Read more about why Amazon is implementing these changes and how its workers have reacted on Vice.
