Deepfakes Are Evolving. This Company Wants to Catch Them All

Some Fortune 500 companies have begun testing software that can spot a deepfake of a real person in a live video call, following a spate of scams involving fraudulent job seekers who take a signing bonus and run.

The detection technology comes courtesy of Get Real Labs, a new company founded by Hany Farid, a UC Berkeley professor and renowned authority on deepfakes and image and video manipulation.

Get Real Labs has developed a suite of tools for spotting images, audio, and video that have been generated or manipulated, whether with artificial intelligence or by manual methods. The company’s software can analyze the face in a video call and spot clues that may indicate it has been artificially generated and swapped onto the body of a real person.

“These aren’t hypothetical attacks, we’ve been hearing about it more and more,” Farid says. “In some cases, it seems they’re trying to get intellectual property, infiltrating the company. In other cases, it seems purely financial, they just take the signing bonus.”

The FBI issued a warning in 2022 about deepfake job hunters who assume a real person’s identity during video calls. UK-based design and engineering firm Arup lost $25 million to a deepfake scammer posing as the company’s CFO. Romance scammers have also adopted the technology, swindling unsuspecting victims out of their savings.

Impersonating a real person on a live video feed is just one example of the kind of reality-melting trickery now possible thanks to AI. Large language models can convincingly mimic a real person in online chat, while short videos can be generated by tools like OpenAI’s Sora. Rapid AI advances in recent years have made deepfakes more convincing and easier to produce: free software makes it simple to hone face-swapping skills, and widely available tools can turn text prompts into realistic-looking photographs and videos.

But impersonating a person in a live video is a relatively new frontier. Creating this type of deepfake typically involves using a mix of machine learning and face-tracking algorithms to seamlessly stitch a fake face onto a real one, allowing an interloper to control what an illicit likeness appears to say and do on screen.
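
To make the mechanics concrete, here is a minimal, hypothetical sketch of the face-tracking step that such pipelines (and many detectors) rely on, written with MediaPipe and OpenCV. The tool choices and code are illustrative assumptions, not anything described by Get Real Labs or the attackers in this article:

```python
# Rough sketch of real-time face landmark tracking, the kind of
# component live face-swap pipelines (and some detectors) build on.
# MediaPipe and OpenCV are illustrative choices, not tools named
# in the article.
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            for landmarks in results.multi_face_landmarks:
                # Hundreds of normalized (x, y, z) points track the face
                # each frame; a swap pipeline warps a synthetic face onto
                # this mesh, while a detector hunts for geometric and
                # temporal inconsistencies in it.
                mp_drawing.draw_landmarks(
                    frame, landmarks, mp_face_mesh.FACEMESH_TESSELATION)
        cv2.imshow("face tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```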

Farid gave WIRED a demo of Get Real Labs’ technology. When shown a photograph of a corporate boardroom, the software analyzes the metadata associated with the image for signs that it has been modified. Several major AI companies, including OpenAI, Google, and Meta, now add digital signatures to AI-generated images, providing a reliable way to confirm that an image is machine-made. However, not all tools provide such stamps, and open source image generators can be configured not to. Metadata can also be easily manipulated.
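
As a rough illustration of what a metadata check can and cannot tell you, the sketch below scans an image’s EXIF tags and embedded text chunks for strings associated with generative-AI tools or C2PA-style content credentials. It is a simplistic stand-in written for this article, not Get Real Labs’ method, and it does not verify cryptographic signatures; the marker list and file name are assumptions:

```python
# Minimal metadata-inspection sketch (not Get Real Labs' software):
# read EXIF tags and embedded text chunks from an image and flag
# strings commonly associated with generative-AI tools or content
# credentials. Real provenance checks validate cryptographic
# signatures; this only scans for telltale markers.
from PIL import Image, ExifTags

AI_MARKERS = ("c2pa", "openai", "dall-e", "midjourney",
              "firefly", "stable diffusion")

def scan_image_metadata(path):
    img = Image.open(path)
    findings = []

    # EXIF tags (e.g., Software, Make) sometimes name the generator.
    for tag_id, value in img.getexif().items():
        tag = ExifTags.TAGS.get(tag_id, str(tag_id))
        if any(marker in str(value).lower() for marker in AI_MARKERS):
            findings.append(f"EXIF {tag}: {value}")

    # Format-specific text chunks (PNG tEXt, JPEG comments, etc.).
    for key, value in img.info.items():
        if any(marker in f"{key} {value}".lower() for marker in AI_MARKERS):
            findings.append(f"info[{key!r}]")

    # An empty list means no obvious marker was found; because metadata
    # can be stripped or forged, that proves nothing on its own.
    return findings

if __name__ == "__main__":
    print(scan_image_metadata("boardroom.jpg"))  # hypothetical file
```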

[GIF of real-time facial recognition. GIF: Will Knight]
