AI in Customer Support: Enhancing Efficiency, Collaboration, and Service Quality
By introducing AI into our support processes, we are raising the bar on both speed and quality of service. Our automated system handles routine inquiries instantly, so our experts are available to help you with more specific needs. At the same time, AI fosters seamless international collaboration by bridging language and cultural barriers, making expertise and solutions globally accessible.
Our Methodical Approach
The introduction of our AI solution began with a thorough evaluation of various public providers. To test the effectiveness of the potential AI models, we conducted ten product-specific tests. The two providers that delivered the highest rates of correct and helpful responses were Google Gemini and Azure OpenAI.
The technical integration was carried out via the REST API of our TOPdesk customer support ticketing system. A decisive success factor during implementation was prompt engineering. The quality of the answers depends heavily on the precision of the prompts given to the AI. By carefully formulating these instructions, we ensured that the AI delivered not only fast but also high-quality and relevant responses.
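The sketch below shows roughly how such an integration could look in Python, using the Azure OpenAI SDK as one of the two selected providers: a ticket is fetched via the TOPdesk REST API and answered with a carefully formulated system prompt. The endpoint path, field names, environment variables, deployment name, and prompt wording are illustrative assumptions, not our actual configuration.

```python
import os

import requests
from openai import AzureOpenAI

# Placeholder configuration: URL, credentials, and field names are assumptions.
TOPDESK_URL = "https://example.topdesk.net/tas/api/incidents"
TOPDESK_AUTH = (os.environ["TOPDESK_USER"], os.environ["TOPDESK_APP_PASSWORD"])

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

# The carefully formulated instruction: its precision largely determines
# whether the generated answer is not only fast but also relevant.
SYSTEM_PROMPT = (
    "You are a first-level support assistant for software protection and "
    "licensing products. Answer concisely, name the affected component, and "
    "ask one clarifying question if the ticket lacks essential details."
)

def draft_reply(incident_id: str) -> str:
    """Fetch a ticket via the TOPdesk REST API and draft an AI answer for it."""
    ticket = requests.get(
        f"{TOPDESK_URL}/{incident_id}", auth=TOPDESK_AUTH, timeout=30
    ).json()
    request_text = ticket.get("request", "")  # field name is an assumption

    completion = client.chat.completions.create(
        model="gpt-4o",  # Azure deployment name (assumption)
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": request_text},
        ],
    )
    return completion.choices[0].message.content
```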
Tool Calling: Expanded Knowledge for Better Answers
One of the most important features integrated into the AI solution is the so-called tool calling. This function allows the Large Language Model (LLM) to use external tools, such as Google Search. Through this connection, the AI can access current and expanded data sources. This improves response accuracy for time-sensitive or complex queries and ensures that answers are not only based on static training data, but also take into account dynamic, up-to-date information.
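The following sketch illustrates the general tool-calling pattern with the Azure OpenAI chat API (Gemini offers a comparable mechanism with built-in Google Search grounding): the model is offered a search tool, may request a call, and the search results are fed back to produce the final answer. The `google_search` wrapper and the deployment name are hypothetical placeholders.

```python
import json
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

def google_search(query: str) -> str:
    """Hypothetical wrapper around a web search API; implementation omitted."""
    raise NotImplementedError

# Tool declaration offered to the model; whether to call it is the model's decision.
SEARCH_TOOL = {
    "type": "function",
    "function": {
        "name": "google_search",
        "description": "Search the web for current information.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}

def answer_with_tools(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    first = client.chat.completions.create(
        model="gpt-4o", messages=messages, tools=[SEARCH_TOOL]
    )
    msg = first.choices[0].message
    if not msg.tool_calls:  # the model answered directly, no tool needed
        return msg.content

    # Execute the requested search and hand the results back to the model.
    call = msg.tool_calls[0]
    result = google_search(**json.loads(call.function.arguments))
    messages += [msg, {"role": "tool", "tool_call_id": call.id, "content": result}]
    final = client.chat.completions.create(
        model="gpt-4o", messages=messages, tools=[SEARCH_TOOL]
    )
    return final.choices[0].message.content
```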
RAG and Internal Knowledge Databases
Further development of the AI solution is already planned. A central element of the next phase is the integration of Retrieval-Augmented Generation (RAG). RAG will connect the AI models from Google Gemini and Azure OpenAI with another key data source: Wibu-Systems’ internal documents. These include manuals, guides, and other exclusive resources. This strategy ensures that the AI can access the company’s consolidated internal knowledge. By combining the broad knowledge of LLMs with specialized in-house expertise, the quality of answers will be improved further. To ensure optimal performance, parameters such as chunking (splitting documents into smaller sections) and document indexing will be carefully fine-tuned.
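A minimal sketch of such a RAG flow is shown below: internal documents are chunked, embedded, and indexed once, and at question time the most similar chunks are retrieved and handed to the LLM as context. The chunk sizes, model names, and document loading are assumptions for illustration only, not the final fine-tuned parameters.

```python
import os

import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

def chunk(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Chunking: split a document into overlapping sections."""
    return [text[i:i + size] for i in range(0, len(text), size - overlap)]

def embed(texts: list[str]) -> np.ndarray:
    """Embed texts with an Azure OpenAI embedding deployment (name is an assumption)."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def build_index(documents: list[str]) -> tuple[list[str], np.ndarray]:
    """Indexing: chunk and embed the internal manuals and guides once."""
    chunks = [c for doc in documents for c in chunk(doc)]
    return chunks, embed(chunks)

def answer_from_docs(question: str, chunks: list[str], index: np.ndarray, k: int = 4) -> str:
    """Retrieve the k most similar chunks and let the LLM answer from them."""
    q = embed([question])[0]
    scores = index @ q / (np.linalg.norm(index, axis=1) * np.linalg.norm(q))
    context = "\n\n".join(chunks[i] for i in np.argsort(scores)[-k:])
    completion = client.chat.completions.create(
        model="gpt-4o",  # deployment name (assumption)
        messages=[
            {"role": "system",
             "content": "Answer using only the provided internal documentation."},
            {"role": "user",
             "content": f"Documentation:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content
```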
Field Report: First Deployment of AI in Support
Our first AI-powered support deployment exceeded expectations and delivered faster, smarter assistance for our customers from day one. Particularly striking was that the costs for implementation and operation were significantly lower than expected. This greatly lowered the barrier to entry and showed us that AI solutions can deliver great value even with a manageable budget.
Another success factor was that public providers such as Google Vertex AI or Microsoft Azure OpenAI Service came with an impressive baseline understanding of our product information. This allowed us to achieve results very quickly without time-consuming, extensive training.
Our Areas of Application in Everyday Work
■ Ticket Summarization: AI generates precise and understandable summaries, speeding up processing (a prompt sketch follows this list).
■ Translations for International Subsidiaries: Language barriers are almost completely overcome through fast and high-quality translations.
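As a small illustration of both everyday uses, the sketch below asks the model for a concise summary of a ticket together with a translation of that summary for an international subsidiary. The prompt wording, deployment name, and default target language are examples, not our production setup.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

def summarize_and_translate(ticket_text: str, target_language: str = "German") -> str:
    """Summarize a support ticket and translate the summary for a subsidiary."""
    completion = client.chat.completions.create(
        model="gpt-4o",  # deployment name (assumption)
        messages=[
            {"role": "system",
             "content": ("Summarize the following support ticket in at most five "
                         "bullet points, then repeat the same summary translated "
                         f"into {target_language}.")},
            {"role": "user", "content": ticket_text},
        ],
    )
    return completion.choices[0].message.content
```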