
Photo: Pressmaster

This App Claims It Can Detect 'Trustworthiness.' It Can't


A Tokyo-based company claims to have invented software that can tell when people are lying, for use by financial institutions targeting unbanked customers.

DeepScore is marketing its facial- and voice-recognition system to banks and insurance providers in countries where most people lack formal identification or a credit history. A potential customer answers questions on video through an app, while the AI grades their answers based on changes in their voice and facial muscle twitches.

Privacy and human rights advocates are alarmed by DeepScore’s premise—that the minute signals captured by facial and vocal recognition algorithms reliably correspond to something as subjective and varied as a person’s honesty.

They argue that facial expressions cannot reliably indicate someone’s mental state and could discriminate against people with anxiety or who are neurodivergent, while facial recognition is demonstrably less accurate when used on women and people with darker skin.

Ogino said “any algorithm is not 100 percent accurate,” but that the goal is to bring loan opportunities and insurance to people who would otherwise be excluded because businesses have no way to determine whether they are low- or high-risk. Its possible uses involve life-or-death decisions.
