Add this to their list of troubles: PimEyes, the controversial face-search engine that crawls the web to ostensibly help individuals monitor their online presence.
The facial scanner was found populating its database with images of the dead, per Wired, scraping digital memorials from Ancestry.com without permission.
Creepy? Yes. Ethics and privacy fiasco? Also yes.
For its part, PimEyes says its raising of the dead was unintentional, and that it has blocked its crawlers from Ancestry and deleted the images in question.
For everyone else’s part, it’s another major “yikes” added to the tally of AI’s ethical repercussions. PimEyes has already been on the radar of privacy advocates, who worry the face scans may be used to gather sensitive personal info without consent.
Watchdogs fear the tech — good enough to ID subjects wearing sunglasses and masks, and accessible to anyone with $21 — could also be used for nefarious purposes, from stalking to identity theft.
The Wild West of post-mortem privacy
This bizarre PimEyes story highlights how few hard-drawn limits exist for facial-recognition engines — and on the privacy of the deceased in general.
More regulations seem likely. Post-mortem privacy has increasingly factored into high-profile cases, and EU policies already bar pictures of dead people when they carry a privacy interest for the living.