Harmful narratives, such as misinformation and disinformation, are a significant and growing problem for companies, governments, and, ultimately, society. In recent years, and especially in 2021, we saw an accelerating trend of digital social-media narratives resulting in real-world actions. In the wake of recent events and trends like COVID-19, vaccine hesitancy, and election interference, awareness of, and urgency around, combating misinformation and harmful narratives is at an all-time high. And it is not just governments that are being targeted: awareness of threats, and also opportunities, from online communities is front and center for enterprises and corporations as well.
Once these narratives go viral, they propagate rapidly and are often perceived as truth. The ability to understand the emergence of an inaccurate narrative quickly, and at its source, is a tremendous benefit: it helps a company protect its assets, personnel, and brand, and, ultimately, make sure the facts are known. This is not a new area for companies; most of the Fortune 1000 have risk-intelligence teams and departments that tackle this issue and play important roles in how companies interact with society. The market for risk intelligence and analytics is $32B and growing at roughly 15-20% year over year as companies prioritize and grow their teams.
80% of the Fortune 1000 have these units, and they are growing significantly in importance as online narratives have begun to have physical-world consequences. But the tools available today for companies and government departments to track these narratives are extremely limited and can largely be classified as sentiment analysis.
A couple of years ago, while on sabbatical, Sam Clark was mission-driven to build an AI platform that helped dissect narratives on YouTube, particularly around election interference. Unbeknownst to him, future co-founder Mark Listes had spent several years in the federal government combating election interference. Mark found the tools available for tracking and combating the spread of harmful narratives incredibly lacking. He had started his career years earlier by founding a voting-rights non-profit and, toward the end of his time in government, had transitioned to the Department of Defense's innovation arm, which partners with private industry on cutting-edge technologies.
Pendulum’s platform applies AI and natural-language-processing (NLP) technologies to uncover threats and opportunities contained in narratives in the earliest days of their formation and to track them as they spread online. By dissecting and categorizing the narratives in text, video, and audio content on social media platforms, companies are better prepared and able to engage with communities as they choose.
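To make the idea of grouping posts into candidate narratives concrete, here is a minimal, illustrative sketch in Python. This is not Pendulum's actual method; it simply shows one naive way to cluster short texts by bag-of-words cosine similarity, so that posts repeating the same narrative land in the same group. All function names and the similarity threshold are assumptions for illustration only.

```python
# Illustrative only -- NOT Pendulum's actual pipeline. A toy approach to
# grouping short posts into candidate "narratives" using bag-of-words
# cosine similarity, built entirely from the Python standard library.
from collections import Counter
import math

def vectorize(text):
    """Lowercase bag-of-words term counts for a short post."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def cluster(posts, threshold=0.3):
    """Greedy single-pass clustering: attach each post to the first
    cluster whose seed post is similar enough, else start a new one."""
    clusters = []
    for post in posts:
        vec = vectorize(post)
        for c in clusters:
            if cosine(vec, c["seed"]) >= threshold:
                c["posts"].append(post)
                break
        else:
            clusters.append({"seed": vec, "posts": [post]})
    return clusters

posts = [
    "the vaccine causes side effects doctors hide",
    "doctors hide vaccine side effects from the public",
    "new phone released today with better camera",
]
groups = cluster(posts)
print(len(groups))  # the two vaccine posts fall into one group -> 2
```

A production system would of course go far beyond this sketch, using learned embeddings, multimodal inputs (video and audio transcripts), and temporal tracking of how each narrative spreads, but the core notion of measuring textual similarity to group related content is the same.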
With support for YouTube, BitChute, Rumble, and podcasts currently available, the platform will grow to encompass all social platforms of importance over the coming months.
The team is building in Seattle; check out their openings here. We are excited to jump in and help them grow over the coming years.