Avid’s fail-fast, succeed-fast approach to embracing AI and other new technologies



Michael sat down with Shailendra Mathur, VP of Technology and Architecture at Avid, to discuss how they study and implement the new opportunities AI brings to the industry. From integrations with Azure analytics services to the RAD Lab, Shailendra explains how Avid investigates and decides when and how to utilize the latest technologies.

Michael: Hi, Michael Kammes here from Shift Media, and today we’re sitting down with Shailendra Mathur, VP of Technology and Architecture at Avid. Thank you so much for being here today.

Shailendra: Thanks for having me, Michael.

Michael: I am thrilled. We talked to a lot of people this week about technology, and a lot of it’s about what the announcements are. And there’s a lot of press on Avid’s announcements, but we’re gonna talk about AI because, obviously, that’s the hot thing right now, and we’re really excited to see what Avid is doing with AI. So let’s start at the top level. Avid’s done a lot of research into AI. There’s been a lot of transparency and publishing of papers. Can you go over how AI is handled internally, the lab that Avid has, and how that documentation is getting out to the world?

Shailendra: Yeah, absolutely. So, Avid has had AI integrations, for example, with our media asset management system. We have integrations with the Azure analytics service. So that’s how we enrich metadata, and we can search using expression analysis and other facets. So we’ve been kind of utilizing some of the AI functionality on that side, but we also started something called the RAD Lab, which is the research and advanced development. It’s RAD!

Michael: I like that.

Shailendra: And, frankly, it was also a way of bringing in researchers, the young folks who are out there right now. Some of these are internship programs, but these are fail-fast, succeed-fast: investigate and figure out what we want to do with some of the technologies, because there are so many ideas of how we should be doing AI for editorial, for asset management. There are so many. Which ones do we pick first? So, using the RAD Lab, we did quite a bit of research, and part of the mission that we had was not just to keep it private to ourselves but, as you said, to publish, and it’s also because of the collaborations, right? We are publishing. We’ve published at the SMPTE conference. We actually had HPA presentations last year and this year.

So those have also just brought out other collaborators. And you know, when we are picking some technologies to investigate, other people have been contributing and saying, “Hey, did you think of this?” So that’s been sort of our mission. In terms of what we have done so far in that research, there are things like AI-based codecs. That’s something that we started looking at, especially when we looked at storage efficiency. You know, HEVC, AV1, these are all proceeding anyway, but AI adds another aspect to the codecs, so we started investigating that. That’s part of what’s published in the SMPTE journal as well. Some of the results we brought out are looking at things like semantic search technologies. Of course, ChatGPT is everywhere.

But it’s more the open AI models that actually help semantic indexing and semantic search. So that’s been another one. Related to that have been things like saliency maps and figuring out contextual information from images that can be actually used for different purposes. So that’s another paper that we published, which basically allows for better compression and color correction, extracting regions of interest. This is some of the work that we are doing and publishing, and you’ll probably see more coming out as a result of this work. So this is just research, but yes, there will be productization as well.

Michael: What would you say the ethos is for Avid in terms of how they view AI and AI’s role?

Shailendra: The ethos is that it’s all to help the creatives. Creatives are the lifeblood of this industry. Whatever we do, we want to make sure that it’s an assistive technology versus something that’s replacing anybody. This is not about replacing. It’s all about assisting. It’s about recommending, right? Even when you look at ChatGPT, we think of these as recommendation engines, right? It’s recommending how to do things better, right? That’s really the ethos that we are following.

Michael: To get a little bit more specific on where AI fits. Now, I’m sure by NAB 2024, we’ll be sitting down, and the conversation will be skewed a little bit. But which tasks for creatives today would you say belong to AI, and which tasks would still be in the creative realm?

Shailendra: Like I said, it’s a lot to do with recommendations, right? So just think of what we do with search today. Today a lot of folks have to log metadata, right? Right up front. If you don’t have the metadata, you can’t search for content appropriately. So it’s a pretty established field that you can use ML-based models for metadata augmentation, right? That’s well understood. But then, as a creative, you may also be missing other related content. That’s where contextual search comes in, or semantic search comes in. It may not be exactly the person’s name, it could be another language. It could be some other information, or the person changed names. So that semantic information is now giving you a richer set of information back to work with as a creative.

Shailendra: And the same thing with a journalist. You might be writing a story, right? You’re writing a story, but something else is happening, and you want to make sure that you can capture what’s happening out there. Or you could have some content for it to be used as B-roll, or it could be content in your archive that you weren’t even aware of. But as you’re writing, this is all assisting you in writing the news story. But it could also be scriptwriting. And in fact, it’s interesting that at HPA this year, Rob Gonsalves, who is part of our team, gave a presentation where he literally started showing how you could generate a script and start putting some animatics together, all using this technology. This is not replacing the creative; he was acting as the creative, and this was just speeding up the work. Right? So I think that that’s the way this is going to proceed.

Michael: That brings me to my next question because everyone in the industry is concerned about this – “What’s my future as a creative, as an editor, as somebody who does VFX or motion graphics? Do I have to worry about machine learning and AI taking my job?” And what would be your response to that?

Shailendra: No, I think this is one of the fears that everybody has. The way I think about this is that it’s AI, you know. You can say it’s taking over the world, but no. I mean, even our brains, we ourselves, I mean, I come from a research background in computer vision, and we’ve studied neurology. And as part of what we learned, we’ve barely mapped out 10% of our brain. How can we say that AI will replace our brains when we don’t ourselves know how our brains work? What it is doing is a lot of mimicking and basically has a lot of horsepower to do things. So will it get there? Maybe? I don’t know. But at this point, I’m a glass-half-full guy, you know. I’d rather focus on the positives of where it can assist us and where it can help us. I don’t think it’ll take over the jobs. It is going to be about assisting. There will be job changes. Sure. But those job changes will be very positive in my mind.

Michael: And well, that’s also been the job of a creative since the beginning of motion pictures, right? Your job has always evolved, whether it’s cutting celluloid or cutting video, or, you know, not using a bin button but instead logging stuff into a computer. It all has constantly evolved.

Shailendra: You’re just doing it faster now. Somebody still has the job of curating content. But now you’re being assisted in that. I don’t think it’s gonna take over jobs. It will change them for sure.

Michael: We sat down with Mark Turner from MovieLabs, who, as you probably know, put out the 2030 Vision paper. And there are ten principles outlined in that. I’m curious, has there been any work in the RAD Lab regarding AI and how it plays into MovieLabs’ 2030 Vision?

Shailendra: So, what’s very interesting is that MovieLabs, the EBU, and SMPTE actually just published the ontology primer, which we really believe in, because we believe that asset management, as it stands right now, will move to much more of a knowledge management model as you go forward. And that primer literally lays this principle out as well. It’s one of the core principles moving forward. So we are very, very much aligned with that. And yes, that is going to be one of the areas that we are very interested in, and we’re working together with MovieLabs and others to bring that out. What does that look like? This is all part of the RAD Lab projects too. There are graph databases coming up and implementations around that. So these are all areas we will continue focusing on together with the MovieLabs side, the rest of the MovieLabs 2030 Vision. Well, we’re already showcasing products that are starting to show the way forward. Things like bringing the application to the media assets…

Michael: Yeah. That media is sitting in the cloud.

Shailendra: Exactly. So there are three ways we are doing that. First, literally virtualized editing. That’s actually happening; our customers are leveraging that today in the cloud, on public cloud storage, working directly on that. Second, we have a web browser view that allows you to edit and asset manage. So even though the web browser view is remote, and you might be sitting remotely, it is close to the media because you’re not moving the whole content over. That’s another way of thinking of it. And third, we just introduced NEXIS | EDGE. NEXIS | EDGE as a product is the same thing, but in that case, it’s not a browser view. It’s a much richer editorial environment, like the full editing system, where you’re just accessing the media remotely in a streaming mode. So these are all aligned with MovieLabs’ principles, the cloud principles. We completely believe in where they’re going and will be right along for the journey.

Michael: Excellent. Shailendra, thank you so much for your time.

Shailendra: You’re welcome.

Michael: I’m Michael Kammes with Shift Media here at NAB 2023. And thanks for watching.

Shailendra: Thank you.

Miss our interview with Mark Turner, Project Director of Production Technology at MovieLabs? Watch it now to learn more about their 2030 Vision.


For tips on post-production, check out MediaSilo’s guide to Post Production Workflows.

MediaSilo allows for easy management of your media files, seamless collaboration for critical feedback, and out-of-the-box synchronization with your timeline for efficient changes. See how MediaSilo is powering modern post-production workflows with a 14-day free trial.