AI can now place a digital Coca-Cola next to any meal
By Andrew Tarantola | Published July 29, 2024
AI is infiltrating the world of advertising, and Coca-Cola is the latest to find a use for it. The Coca-Cola Company announced Monday, ahead of SIGGRAPH, that it has partnered with the ad agency WPP to incorporate AI from Nvidia into its global ad campaigns.
“With Nvidia, we can personalize and customize Coke and meals imagery across 100-plus markets, delivering on hyperlocal relevance with speed and at global scale,” Samir Bhutada, global VP of StudioX Digital Transformation at Coca-Cola, said in a press statement released Monday.
Coke has been working with WPP to develop Prod X, a custom production studio and a set of digital twin tools that the beverage company can use in its ads. A digital twin is simply a virtual copy of a real-life object that can be manipulated in a 3D environment. You can probably see why that would be helpful for a company like Coca-Cola.
WPP also announced Monday that Coca-Cola will be among the first adopters of Nvidia NIM microservices for Universal Scene Description (OpenUSD), a “3D framework that enables interoperability between software tools and data types for building virtual worlds” invented by Pixar Animation Studios. With NIM and OpenUSD, WPP can draw on a large catalog of branded images and digital models and assemble them into localized, culturally relevant scenes, allowing Coca-Cola to better target local markets.
This content engine is based on Nvidia’s Omniverse Cloud, an API and SDK platform that connects a variety of 3D tools.
WPP uses that platform to connect product-design data from software such as Adobe’s Substance 3D with generative AI systems from Adobe and Getty Images, so that its designers can create photorealistic product models (in this case, bottles of Coca-Cola) using natural-language prompts.
Ad makers can generate enormous libraries of visual assets, as well as the Python code needed to build the 3D scenes around those assets.
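To give a sense of what that generated scene-assembly code might look like, here is a minimal sketch using the open-source OpenUSD Python bindings (the pxr module). The file paths, prim names, and positions are hypothetical placeholders for illustration, not WPP's or Nvidia's actual pipeline.

```python
from pxr import Usd, UsdGeom, Gf

# Create a new USD stage: the container for the composed ad scene.
stage = Usd.Stage.CreateNew("coke_and_meal.usda")

# Root transform for everything in the scene.
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# Reference a pre-approved branded product model into the scene.
bottle = UsdGeom.Xform.Define(stage, "/World/CokeBottle")
bottle.GetPrim().GetReferences().AddReference("assets/coke_bottle.usd")  # hypothetical asset path

# Reference a locally relevant meal asset alongside it.
meal = UsdGeom.Xform.Define(stage, "/World/Meal")
meal.GetPrim().GetReferences().AddReference("assets/local_meal.usd")  # hypothetical asset path

# Place the bottle next to the meal.
UsdGeom.XformCommonAPI(bottle.GetPrim()).SetTranslate(Gf.Vec3d(0.15, 0.0, 0.0))
UsdGeom.XformCommonAPI(meal.GetPrim()).SetTranslate(Gf.Vec3d(-0.15, 0.0, 0.0))

# Save the composed scene so downstream tools (renderers, Omniverse, etc.) can open it.
stage.GetRootLayer().Save()
```

Because OpenUSD composes scenes by referencing external assets rather than copying them, the same branded bottle model can be dropped next to a different meal asset for each regional market by swapping a single reference path.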
“The beauty of the solution is that it compresses multiple phases of the production process into a single interface and process,” Perry Nightingale, senior vice president of creative AI at WPP, said of the new NIM microservices. “It empowers artists to get more out of the technology and create better work.”