Leveraging AI for Conversational Engagement: Insights from Chartwell PowerUp
At the recent Chartwell PowerUp in Orlando, I had the opportunity to share our thoughts on the role of AI in conversational engagement, especially around severe weather, emergencies, and outages. See our presentation here >>
We have been a strong proponent of leveraging AI in both proactive, omni-lingual communications and two-way interactive conversations, and the rise of large language models (LLMs) over the past year has brought these technologies into sharper focus. Natural language processing (NLP) chat solutions built over the past several years have been upgraded or replaced by conversational AI solutions that leverage LLMs prompted with utility documents and website materials. This allows for a personalized, omni-lingual flow of information that includes integrated flows such as outage reporting and status updates. As an example, I showed how I could report an outage and ask unprompted questions about storm preparation and life support, and receive relevant, structured answers. I also texted in Spanish and German and was delivered grammatically (and contextually) correct responses without the need to manually translate each reply.
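The presentation did not go into implementation detail, but the general pattern can be sketched roughly as below. Everything in this sketch, including the call_llm stub, the report_outage helper, and the sample reference documents, is a hypothetical illustration of prompting an LLM with utility materials and routing structured flows such as outage reporting; it is not production code.

```python
# Hypothetical grounding material: snippets from utility documents and web pages.
UTILITY_DOCS = [
    "Storm preparation: keep flashlights, batteries, and a charged phone on hand.",
    "Life-support customers should register with the utility and keep a backup power plan.",
    "Outage reporting requires the customer's service address or account number.",
]

SYSTEM_PROMPT = (
    "You are a utility customer-assistance agent. Answer only from the reference "
    "material below, and reply in the same language the customer writes in.\n\n"
    "Reference material:\n" + "\n".join(f"- {doc}" for doc in UTILITY_DOCS)
)


def call_llm(messages: list[dict]) -> str:
    """Placeholder for whatever LLM chat endpoint is actually used."""
    return "[LLM reply grounded in the reference material, in the customer's language]"


def report_outage(address: str) -> str:
    """Placeholder for the integrated outage-reporting flow (e.g., a call into the OMS)."""
    return f"Outage reported for {address}; status updates will follow by text."


def handle_message(text: str, address: str) -> str:
    # Simple intent routing: structured flows such as outage reporting are handled
    # directly, while open-ended questions go to the LLM grounded in utility documents.
    if "outage" in text.lower() or "power is out" in text.lower():
        return report_outage(address)
    messages = [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": text},
    ]
    return call_llm(messages)


if __name__ == "__main__":
    print(handle_message("My power is out", "123 Main St"))
    print(handle_message("¿Cómo me preparo para una tormenta?", "123 Main St"))
```

Because the grounding material and the reply-in-kind instruction live in the prompt rather than in per-language templates, the same flow can answer an unprompted storm-preparation question in Spanish or German without a separate translation step.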
We also shared our roadmap for automatic communication template and program creation, which combines Generative AI with an author/publish model. This helps avoid the challenges of creating content manually, on demand, across numerous languages. But all of this comes with a cost that is currently being subsidized by the AI providers and must be considered. Personalized, relevant, on-demand smart content creation requires compute cycles, which could result in new deployment models that combine cloud providers, open-source and custom AI models, and dedicated infrastructure. It is no wonder chipmakers such as Nvidia and AMD are surging in popularity. It is an exciting time, and we're happy to share our experiences with the industry.
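The author/publish idea is only a roadmap item, so the following is just a rough sketch under assumed names (TemplateDraft, draft_template, review_and_publish are all invented for illustration): a Generative AI call drafts the template text per language, and a human author approves it before anything is published to customers.

```python
from dataclasses import dataclass


@dataclass
class TemplateDraft:
    """A machine-drafted communication template awaiting human review."""
    event_type: str
    language: str
    body: str
    status: str = "draft"  # draft -> approved -> published


def draft_template(event_type: str, language: str) -> TemplateDraft:
    """Placeholder for the Generative AI call that drafts the template text."""
    body = f"[AI-drafted {event_type} notification in {language}]"
    return TemplateDraft(event_type, language, body)


def review_and_publish(draft: TemplateDraft, approved_by: str) -> TemplateDraft:
    """Author/publish step: a human editor approves the draft before it goes live."""
    draft.status = "approved"
    draft.status = "published"
    print(f"{draft.language} {draft.event_type} template published (approved by {approved_by})")
    return draft


if __name__ == "__main__":
    # Draft the same outage notification in several languages, then publish after review.
    for lang in ("en", "es", "de"):
        review_and_publish(draft_template("outage", lang), approved_by="content-editor")
```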
Interested in continuing the conversation? Let’s schedule some time >>