
Q&A: AI in Advertising With Rembrand’s Cory Treffiletti

AI is reshaping the advertising space in rapid and innovative ways. This has prompted discussion about whether traditional product placement is becoming a dead concept, as generative AI can replace physical product placements and adapt flexibly to different settings and platforms.

One company at the forefront of AI advertising innovation is Rembrand. Headquartered in Palo Alto, California, with additional offices in two countries, Rembrand focuses on AI activation for video advertising using its Enhanced In-Scene Advertising technology.

I talked with Cory Treffiletti, CMO of Rembrand, about his long career in digital marketing and how he developed his approach to “data-driven storytelling,” along with how rapidly AI is developing within the advertising industry and upending the traditional media buying process. We also touched on AI’s unique advantages for content creators, how Rembrand’s AI is being trained for dynamic use in various settings, and what needs to happen to evolve the public perception of AI and ease fears about it replacing jobs or being misused.

Tyler Nesler: You have been involved in digital marketing since 1994 and pioneered the “data-driven storyteller” approach that leverages data and insights around an audience. What does data-driven storytelling mean to you in the context of marketing, and how is new technology enhancing the core benefits of storytelling?

Cory Treffiletti: Data-driven storytelling blends my two favorite elements of being a marketer: the ability to be creative and the ability to be analytical. For me, marketing is the study of people and culture, and the data we have at our fingertips to better understand those elements is almost limitless. My process is quite simple: data in, create value, data out. I start by taking data in to tell me about the audience and the target we are going after. You review the data and insights, and you create hypotheses about what motivates them and what gets them excited or gets them to act. You then take that hypothesis to a creative place where you create value. At this stage, you are looking to create an idea, a story, or a hook that nobody else has come up with. Maybe you connect some dots that others haven’t noticed yet, and you craft an idea. Then, you test that idea in a logical manner, orchestrated to produce analytics that give you a signal that is either positive or negative and either supports or disproves your hypothesis. Next, you choose a direction based on those signals.

In this model, your story is organic. It evolves, and it never gets stale or stagnates. That is being a data-driven storyteller. As I have continued to use that model, the technology makes it easier to gain access to data, and AI allows me the chance to process ideas, create hypotheses, and test ideas creatively much faster. For example, I can create imagery that meets a hypothetical approach, and I can test it quickly without having to rely on someone to digest my ideas and develop the artwork. I can do it all fast and by myself.
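Treffiletti’s “data in, create value, data out” loop can be summarized as a simple iterate-and-test cycle. The sketch below is purely illustrative, assuming hypothetical callables for building a hypothesis, producing a creative asset, and running a test; it is not code from any real marketing tool or from Rembrand.

```python
# Illustrative only: the function names and the idea of a numeric "signal"
# are assumptions made for this sketch, not part of any real product.
def data_driven_story_loop(audience_data, build_hypothesis, create_asset, run_test, rounds=3):
    """Each round: form a hypothesis from audience data, turn it into a
    creative asset, test it, and keep it only if the signal is positive."""
    story = None
    for _ in range(rounds):
        hypothesis = build_hypothesis(audience_data)  # data in: what motivates this audience?
        candidate = create_asset(hypothesis)          # create value: an idea, story, or hook
        signal = run_test(candidate)                  # data out: a positive or negative signal
        if signal > 0:                                # the signal supports the hypothesis
            story = candidate                         # the story evolves rather than stagnating
    return story
```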

TN: In what ways do you see AI specifically evolving within the advertising space and fundamentally changing how media buying is conducted? What are some key advantages to this evolution, and what are some present challenges (e.g., the overall industry not quite catching up yet to this rapid evolution)?

CT: I think the challenge with AI is the single point of understanding that most people in the industry have. To date, most people see it as a replacement for roles and responsibilities, thereby displacing people. I don’t see it that way. The biggest companies out there are creating text-to-image, text-to-video, and even text-to-music generative models, or they are using AI to increase decisioning speed inside media optimization and MarTech tools. Those are great use cases, but people must also look at AI as a tool, like Excel or PowerPoint. Excel did not kill accountants. It made it easier for them to do what they do, leading to other technologies and platforms.

AI is a tool we create to do things we want it to do. Specifically – and I love this line, which I will paraphrase – AI allows me as a marketer to tweak a finished product endlessly and at every stage of the process. If I am pitching an idea, I can almost effortlessly pitch a finished product using the right combination of tools and prompts, and I can do so in a fraction of the time. I am still being creative, but I am being creative more quickly. That means my inspiration doesn’t get lost, and my passion for a project doesn’t burn out. I am not stuck in tiring meetings and long journeys, waiting for someone else to do their part. We can all be in a room, tweaking the project together and seeing the development at each stage. That could be a script, a homepage design, positioning, or any content we need developed. The process is accelerated, and that is exciting to me.

TN: Rembrand's In-Scene Media utilizes cutting-edge AI to integrate branded media and virtual objects into videos seamlessly. How is new technology making traditional product placement irrelevant in today's marketplace?

CT: It’s a great distinction you make. Product placement takes too long: the product either needs to be in the original scene or has to be added by visual effects artists in post-production, which takes weeks to do properly. It is also easy to overlook because its nature is to blend into the scene. Maybe you see it, maybe you don’t. For us, the concept of AI-generated In-Scene Media is to ensure that the media is unskippable (which is very important given how people consume video online and through VOD, OTT, and CTV) and natural, but also calls attention to itself and delivers on the exposure by creating awareness and recall and by driving actions like purchase intent or consideration for B2B products.

With AI, we create these media opportunities in hours, enabling brands to be there seamlessly and easily on a CPM model. We see these natural insertions – with a little animation or 3D activation – call attention without being too intrusive, and they create a positive association with the creators and content that people love, which transfers to the brands. When a brand is seen as supporting a creator and content the audience is passionate about, viewers comment positively and start to enjoy that brand as well. That’s the goal of all advertising, and we’ve been taken by surprise by just how positive the response has been. We see comments where viewers call out a similar media insertion in someone else’s video for that same brand. The viewers are pretty smart!
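For readers less familiar with the pricing model Treffiletti mentions, CPM simply means cost per thousand impressions. The figures below are hypothetical and only show the arithmetic:

```python
# Hypothetical figures; CPM pricing is just cost per 1,000 impressions.
cpm_rate = 25.00          # dollars per 1,000 impressions
impressions = 2_000_000   # impressions delivered by the in-scene placement

campaign_cost = impressions / 1000 * cpm_rate
print(f"Campaign cost: ${campaign_cost:,.2f}")  # Campaign cost: $50,000.00
```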

TN: Now, Rembrand is offering "non-interruptive In-Scene Media." Can you outline some details about this new product and discuss its unique advantages for content creators?

CT: Essentially, we enable your brand to be woven into high-quality video content, mostly on YouTube and TikTok, but it is also coming to other platforms soon. Your brand is inserted as a product, as art on the wall, or on a table. We generate these insertions with the brand, using our AI and a library of animations that enable the brand to be inserted into a video in post-production. That insertion is what we refer to as “physics-informed,” meaning it has to obey the laws of physics: it is the right size, and it has the right lighting and shadowing. If something passes in front of or behind it, it doesn’t have the ghostly, hallucinated blur that most AI produces. It looks completely normal except for the animation that calls attention to it. It is subtle enough not to be annoying but is always noticed, and since it is in the video itself, it can’t be skipped.

It also reaches audiences who have previously been considered unreachable, such as people who pay for YouTube Premium. They see these insertions. On the flip side, the creators are very happy because they can maintain their vision for their content and are not forced to do any sponsored reads or mention the brand in any way. Many of those mentions feel forced or less than authentic. Creators love it because they get their content supported by big brands with little to no work. This happens in a couple of hours, and we take all the negotiation between a brand and a creator and squish it down to a single insertion order.

Later this year, we plan to launch a fully programmatic version for inclusion in trading platforms and media buying solutions. We want to make this a standardized, simple, and easy extension that any brand can keep always on to support this high-quality creative channel.
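To make the “physics-informed” criteria Treffiletti describes above more concrete, the sketch below shows the kinds of checks such a pipeline might run before an insertion ships: a projected size that fits the scene’s perspective, lighting and shadowing that match, and occlusion masks so passing objects hide the insertion cleanly. The data structures, thresholds, and pinhole-projection check are assumptions made for illustration, not Rembrand’s actual implementation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical structures for illustration only.
@dataclass
class SceneEstimate:
    depth_m: float                                # estimated camera-to-surface distance
    light_direction: Tuple[float, float, float]   # dominant scene light direction
    occluder_masks: List[object] = field(default_factory=list)  # masks for objects that may pass in front

@dataclass
class Insertion:
    real_width_m: float   # physical width of the inserted product
    casts_shadow: bool    # whether a matching shadow is rendered

def passes_physics_checks(scene: SceneEstimate, ins: Insertion, focal_px: float) -> bool:
    """Return True only if the insertion would look plausible in the scene."""
    # 1. Size: projected width should follow simple pinhole perspective.
    projected_px = focal_px * ins.real_width_m / scene.depth_m
    plausible_size = 10 <= projected_px <= 2000   # arbitrary sanity bounds

    # 2. Lighting: the object must cast a shadow consistent with the scene light.
    plausible_lighting = ins.casts_shadow

    # 3. Occlusion: every potential occluder needs a mask so the object is hidden
    #    behind it cleanly instead of producing a ghostly blur.
    plausible_occlusion = all(m is not None for m in scene.occluder_masks)

    return plausible_size and plausible_lighting and plausible_occlusion
```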

TN: It seems that virtual product placements are best optimized for social media content creators at the moment due to content created around stable/fixed-shot indoor settings. How do you see the technology evolving in the near future to integrate better into more dynamic/less predictable settings?

CT: It comes back to the difference between In-Scene Media and virtual product placement. Virtual product placement actually doesn’t work for social media because the turnaround time for content creators is too short. It can work for traditional TV and film but not really for social media. We started in social media because nobody else was able to do it. It was a green field. Our AI speeds up the process so much that we can create the connection between the brand, audience, and creators in the timeline that works for creators. 

We plan to expand into TV and film content because it will be just as easy for us to do that, even if it is a more cluttered environment with more competition. You are correct that the fixed placement and indoor elements make it easy, but there is a lot of that in traditional video as well. Our AI is being trained on multiple shot clusters, moving cameras, outdoor lighting, and more as we speak. So it is really only a short time before we can offer the solution to everyone in any setting. The constraint was less about the placement environment and more about the timeline for creators.

TN: Overall, AI in its many forms is still regarded with hesitation and concern by some in the media and the public, especially around possible unethical uses of generative AI. How do you address these concerns to help lessen any lingering stigma around AI?

CT: There is concern about the use of AI for deepfakes and more, but those are broad uses that have no constraints placed on them. Our business is constrained on purpose. We are providing a service that uses AI to fuse together a brand and content with the express approval of the creator and the brand, and the implicit approval of the audience. The audience knows the content is supported by advertising, so they get it for free or at a low cost. They get to experience that content uninterrupted because we blend into it rather than intruding and pausing the content.

Meanwhile, the brands and the creators get to work together, and both sides approve the insertion and the finished product. We are 100% transparent about where the brand runs, with brand safety checks in place that are also automated with AI, and we can halt a media insertion before it goes live. That means both sides have control, and I think the fundamental concern people have with AI is the loss of control. For us, as I mentioned above, AI is a solution to a problem rather than an unconstrained tool to be used any which way. That means we use it in a controlled environment. The AI industry as a whole could benefit from placing constraints on usage to give some element of control back to the people who are using it.
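Treffiletti’s point about control boils down to a simple gate: nothing goes live unless the brand and the creator both sign off and an automated brand-safety check passes. The function below is a minimal sketch under those stated assumptions; the names are invented and this is not Rembrand’s actual workflow code.

```python
# Minimal sketch of the approval gate described above; names are hypothetical.
def may_go_live(brand_approved: bool, creator_approved: bool, safety_check) -> bool:
    """An insertion ships only if both sides approve and the automated
    brand-safety check passes; otherwise it is halted before launch."""
    if not (brand_approved and creator_approved):
        return False                     # either side can withhold approval
    return bool(safety_check())          # the AI check can still halt it pre-launch

# Example: halted because the automated safety check flags the placement.
print(may_go_live(True, True, lambda: False))  # False
```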
