Generative AI has become a sensation in the business world, with organizations worldwide eager to leverage its potential while managing the risks involved. These risks include concerns over hallucination, traceability, training data, IP rights, skills, and costs. Despite these challenges, the transformative power of AI in enhancing customer and employee experiences cannot be ignored, and the demand for implementation is relentless.
One area of focus in generative AI is Large Language Models (LLMs), which are revolutionizing how we access and interact with knowledge. Traditionally, enterprises have relied on keyword-based search engines to access corporate and customer knowledge. These search engines played a crucial role in the initial deployment of chatbots in enterprises, particularly for answering niche questions. IBM Watson Assistant has successfully utilized this approach for nearly four years. Now, with large language models and generative AI, we are taking this approach even further.
We are thrilled to announce the beta release of Conversational Search for Watson Assistant. Powered by the IBM Granite large language model and the Watson Discovery enterprise search engine, Conversational Search enables AI assistants to provide scalable, business-focused answers, delivering faster and more accurate responses to customers and employees. Integrated into our augmented conversation builder, Conversational Search lets organizations automate answers and actions for customers and employees, from helping customers understand credit card rewards and apply for a card to giving employees information on time-off policies and seamless vacation booking.
IBM recently announced the General Availability of Granite, our latest foundation model series designed to accelerate the adoption of generative AI into business applications with trust and transparency. With this beta release, users can apply a pre-trained Granite LLM to enterprise-specific datasets to quickly power question-answering assistants in Watson Assistant. Conversational Search expands the range of user queries your AI assistant can handle, reducing training time and broadening the knowledge it can deliver.
Users of the Plus or Enterprise plans of Watson Assistant can now request early access to Conversational Search. Reach out to your IBM Representative to gain exclusive access to the Conversational Search Beta or schedule a demo with our experts.
So, how does Conversational Search work? When a user asks a question, Watson Assistant determines the best way to assist them, whether through a prebuilt conversation, conversational search, or escalation to a human agent. This routing is handled by our new transformer model, which reaches higher accuracy with minimal training. Conversational search relies on two crucial steps: retrieval and generation. For retrieval, Watson Assistant uses Watson Discovery to pull relevant content from business documents, drawing on search that understands context and meaning rather than just keywords. For generation, Watson Assistant applies the Retrieval-Augmented Generation (RAG) framework, reducing the need for extensive training: businesses can upload their latest documentation or policies, and the model retrieves that information and generates up-to-date responses.
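To make the two-step flow concrete, here is a minimal retrieve-then-generate sketch. It is illustrative only: the keyword-overlap retriever stands in for an enterprise search engine such as Watson Discovery, the generate function is a placeholder for a call to a hosted LLM such as Granite, and all names and data are hypothetical.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# The retriever and generator below are stand-ins, not Watson APIs.

from dataclasses import dataclass


@dataclass
class Document:
    title: str
    text: str


def retrieve(query: str, docs: list[Document], k: int = 2) -> list[Document]:
    """Toy retriever: rank documents by word overlap with the query.
    A production system would use semantic search instead."""
    query_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_terms & set(d.text.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query: str, passages: list[Document]) -> str:
    """Ground the generator: the model answers from retrieved passages,
    not from whatever it memorized during pre-training."""
    context = "\n\n".join(f"[{d.title}]\n{d.text}" for d in passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )


def generate(prompt: str) -> str:
    """Placeholder for an LLM call (e.g., a hosted Granite model)."""
    return "<model-generated answer grounded in the retrieved passages>"


if __name__ == "__main__":
    knowledge_base = [
        Document("Rewards", "The travel card earns 3 points per dollar on airfare."),
        Document("Time off", "Employees accrue 1.5 vacation days per month."),
    ]
    question = "How many points do I earn on airfare?"
    passages = retrieve(question, knowledge_base)
    print(generate(build_prompt(question, passages)))
```

The key design point is that the answer is grounded in retrieved passages, so keeping responses current only requires refreshing the document collection, not retraining the model.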
At IBM, we prioritize responsible AI use. Conversational Search can be scoped to recognized topics only, and organizations can adjust when search is used based on their corporate policies. We also support “trigger words” that automatically escalate the conversation to a human agent when necessary.
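As a rough illustration of how such a guardrail can sit in front of search, the sketch below checks an utterance against a list of restricted terms before deciding whether to answer or hand off. The trigger words and routing labels are hypothetical, not actual Watson Assistant configuration.

```python
# Illustrative pre-check before conversational search runs: escalate to a
# human agent when the utterance mentions a restricted topic.
# Trigger words and route names are made up for this example.

TRIGGER_WORDS = {"credit score", "fraud", "legal claim"}


def route(utterance: str) -> str:
    """Return a routing decision for a single user utterance."""
    text = utterance.lower()
    if any(term in text for term in TRIGGER_WORDS):
        return "escalate_to_agent"
    return "conversational_search"


print(route("Can you check why my credit score dropped?"))  # escalate_to_agent
print(route("What rewards does the travel card offer?"))    # conversational_search
```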
To demonstrate Conversational Search in action, let’s consider a scenario where a customer wants to apply for a credit card. Watson Assistant guides the customer through the application process, extracting necessary details and providing welcome offer information. When the customer asks about the rewards offered by the card, Watson Assistant utilizes Conversational Search to retrieve the answer from the bank’s knowledge documents. If the customer asks a question about their credit score, Watson Assistant recognizes it as a special topic and escalates to a human agent. The conversation is summarized and sent to the agent for resolution. Finally, the customer applies for the credit card, satisfied with the assistance provided.
IBM is committed to open innovation, offering deployment options that suit enterprise needs. Watson Assistant Conversational Search is available on IBM Cloud and Cloud Pak for Data. In the future, semantic search will be configurable for Conversational Search deployments as software and SaaS options. Organizations can also bring their proprietary data to IBM LLM models and customize them using watsonx.ai or leverage third-party models for conversational search and other use cases.
If you’re just starting your generative AI journey for customer service, sign up for a client briefing session with IBM Consulting to transform your customer service experience.