
Hollywood Agency CAA Aims to Help Stars Manage Their Own AI Likenesses


Creative Artists Agency (CAA), the entertainment and sports talent agency, has taken a significant step to protect its clients from unauthorized use of their digital likenesses. The move comes as a growing number of stars have fallen victim to AI-generated deepfakes made without their consent.

The Rise of AI Deepfakes in Hollywood

In recent years, the misuse of celebrities’ names, images, and voices has become a pressing concern in the entertainment industry. Advances in artificial intelligence (AI) have made it increasingly easy to create digital clones of individuals, raising serious privacy and rights concerns. Tom Hanks, a renowned actor on CAA’s roster, was recently targeted by an AI scam that used a generated video of him to promote a dental plan without his permission.

CAA’s Solution: theCAAvault

To address this issue, CAA has developed a virtual media storage system called "theCAAvault," which gives A-list talent a secure platform to store their digital assets, including their names, images, digital scans, voice recordings, and more. The system was built in partnership with AI technology company Veritone, which provides the underlying digital asset management platform.

The Importance of Consent-Based Use

According to Alexandra Shannon, CAA’s head of strategic development, the misuse of celebrities’ likenesses has become a significant problem. "Over the last couple of years or so, there has been a vast misuse of our clients’ names, images, likenesses, and voices without consent, without credit, without proper compensation," she noted.

To combat this issue, CAA has taken a consent-based approach to AI applications. Clients can now store their AI digital doubles and other assets within a secure personal hub in theCAAvault, which can only be accessed by authorized users. This allows them to share and monetize their content as they see fit.

Setting Precedents for Consent-Based Use

Shannon emphasized that CAA’s goal is to set precedents for what consent-based use of AI looks like. "Frankly, our view has been that the law will be catching up with this technology, but we want to make sure that we’re ahead of it," she said.

By making this solution available to its clients, CAA aims to protect them from unauthorized use of their digital likenesses. This move is not only a step forward for the agency but also sets a precedent for the industry as a whole.

Challenges and Future Plans

While CAA’s solution is an innovative step toward protecting celebrities’ rights, it comes with a price tag. Shannon acknowledged that the cost of participating in the vault can be high, but said she expects those costs to come down over time.

As for which clients are currently using the service, Shannon remained tight-lipped, saying only that a select few are on board at the moment.

Conclusion

CAA’s introduction of theCAAvault is a significant development in the fight against AI-generated deepfakes. By taking a consent-based approach to AI applications and giving clients a secure platform to store their digital assets, the agency has positioned itself as a leader in protecting celebrities’ rights.

As the entertainment industry continues to grapple with AI-generated content, CAA’s solution is positioned to play a meaningful role in safeguarding its clients from unauthorized use. With its consent-based approach, the agency is setting a precedent for the industry to follow.
