Moments Lab’s public API can be used to integrate its MXT-1.5 multimodal AI with Bitcentral’s Oasis media asset management (MAM) platform. MXT-1.5 combines what it sees and hears in videos to generate rich metadata, including human-like descriptions of what’s happening in every shot and sequence.
Bitcentral says its collaborative Oasis MAM software “is a popular media workflow solution for news production, with field-centric tools that seamlessly connect remote teams with TV studios to push stories quicker.”
Frederic Petitpont, Moments Lab co-founder and CTO, said: “Integrating our multimodal AI indexing technology with Bitcentral’s Oasis MAM puts enhanced content search and discovery at news teams’ fingertips. It means they’ll be able to instantly find the media assets they need to build and publish stories faster than ever.”
Sam Peterson, COO at Bitcentral, added: “We’re excited to offer Moments Lab’s cutting-edge AI indexing technology to our Oasis MAM customers and provide further efficiency to news production workflows.”
Moments Lab’s API works by analyzing media files (proxy or original) from on-prem or cloud storage. Media assets are processed and enriched by MXT-1.5, which detects faces, text, logos, landmarks, objects, actions, context, shot types and speech to generate a semantic, human-like description of the content for increased searchability. The output can be reviewed and validated in either Moments Lab or directly in the Oasis MAM.
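As a rough illustration of the flow described above, the sketch below shows how an integration might package an indexing request and flatten the enriched output into a single searchable string. The endpoint-free function names, payload fields, and detector labels here are illustrative assumptions, not Moments Lab’s actual API.

```python
# Hypothetical sketch of an MXT-style indexing flow. Field names and
# payload shapes are assumptions for illustration only.

def build_index_request(asset_url: str, proxy: bool = True) -> dict:
    """Assemble a hypothetical indexing request for one media asset."""
    return {
        "media_url": asset_url,
        "use_proxy": proxy,  # analyze the proxy or the original file
        "detectors": [       # feature types named in the article
            "faces", "text", "logos", "landmarks", "objects",
            "actions", "context", "shot_types", "speech",
        ],
    }

def summarize_enrichment(result: dict) -> str:
    """Flatten a hypothetical enrichment payload into a searchable string."""
    parts = [result.get("description", "")]
    for detector, hits in result.get("detections", {}).items():
        parts.extend(f"{detector}:{hit}" for hit in hits)
    return " ".join(p for p in parts if p)

# Example: a mocked response in the assumed shape.
mock_result = {
    "description": "A reporter interviews a firefighter outside a station.",
    "detections": {
        "objects": ["microphone", "fire truck"],
        "shot_types": ["medium shot"],
    },
}
print(summarize_enrichment(mock_result))
```

In practice the real API would return MXT-1.5’s generated description alongside the detected entities; indexing a flattened string like the one above into the MAM’s search backend is what makes assets findable by free-text query.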