[Demo] Deploying AI doppelgangers to de-identify user research recordings

Under biometric privacy laws like BIPA and the CCPA, user research recordings that capture users’ faces or voices can put your company at risk of lawsuits and fines. Legal departments are increasingly requiring more stringent redaction, and in some cases banning recording outright. This comes at a high cost for UX teams already being asked to do more with less: losing access to recordings leads to duplicated research effort and less accurate results.

AI offers new solutions for UX teams who want to keep research recordings longer without violating biometric privacy laws. In this demo, we’ll show how we used off-the-shelf tools to intelligently redact users’ voices, faces, and bodies in research videos. By removing biometric identifiers, you can compliantly archive research recordings indefinitely, enabling your team to mine them for insights for years to come.