- cross-posted to:
- technology@beehaw.org
Jackson soon discovered that Amazon suspended his account because a Black delivery driver who’d come to his house the previous day had reported hearing racist remarks from his video doorbell. In a brief email sent to Jackson at 3 a.m., the company explained how it unilaterally placed all of his linked devices and services on hold as it commenced an internal investigation.
The accusations baffled Jackson. He and his family are Black. When he reviewed the doorbell’s footage, he saw that nobody was home at the time of the delivery. At a loss for what could have prompted the accusation of racism, he suspected the driver had misinterpreted the doorbell’s automated response: “Excuse me, can I help you?”
As it turned out, he wasn’t a bigot. But when they stopped servicing him, they had every reason to believe that he was.
Do you continue to service a customer whose behavior is otherwise unacceptable until you’re absolutely sure he’s a bigot? Or do you abide by your legal obligation to protect your workers from such behavior?
I don’t know if Amazon did the worst thing here, but I don’t know that the best thing is far off from what they did.
Who at Amazon would be hurt by a bigot using their Echo or doorbell? Stopping deliveries, sure, but this goes a couple of steps further.
That’s a great question. I don’t know what kind of exposure Amazon employees have to audio logs from those devices, but there’s sure to be some due to required troubleshooting and debugging.
I also don’t know how integrated the various aspects of a user’s account are and whether it would even be possible for Amazon to have taken a smaller step.