

New service offering “Trusted AI Chatbots & Assistants”

For diconium, trust in chatbots and assistants is also the key to AI success


Stuttgart, June 12, 2024

diconium, the partner for digital growth along the entire value chain, is launching its new “Trusted AI Chatbots & Assistants” service offering. It supports companies in developing trustworthy AI chatbots and assistants, from the initial analysis through consulting to implementation. The aim of the new offering is to create tailor-made AI solutions that are protected against hallucinations, the disclosure of internal information, and other risks. To this end, diconium relies on a holistic development approach in which risks are handled systematically on the basis of a new quality and risk taxonomy spanning the four categories of quality, safety, security, and law.

David Blumenthal-Barby, Principal Specialist AI at diconium: "Building an AI chatbot or assistant in 20 minutes is now entirely possible, but definitely not a good idea. Off-the-shelf AI chatbots can entail risks in terms of quality, safety, security, and law, as a number of recent practical examples have shown once again. Creating trust in AI bots and assistants therefore requires a systematic approach to minimizing these risks. In other words, trust is also the key to AI success with chatbots and assistants. With 'Trusted AI Chatbots & Assistants', we offer precisely that and further expand our position as a trustworthy partner in the field of artificial intelligence."

  

Developing trustworthy AI chatbots requires systematic handling of risks

AI chatbots and assistants offer companies a wide range of internal and external applications for more efficient processes and greater customer and employee satisfaction. Poorly built AI chatbots and assistants, however, can have the exact opposite effect: they may provide incorrect information, disclose internal information, behave inappropriately, or expose their operator to legal risks. To prevent this, diconium supports companies with a holistic development approach focused on four key risk areas:

  

  • Quality: Successfully introduced AI chatbots and assistants deliver practical added value to their users. This is the only way to ensure a positive return on investment for companies. 
  • Safety: Users and operators are protected against misconduct by the AI chatbot or assistant (e.g. incorrect information, damage to reputation). 
  • Security: High security standards protect the AI chatbot or assistant against attacks - especially against new attack patterns that specifically target AI systems. 
  • Legal: Legal risks (e.g. data protection) of the company- and industry-specific use of AI chatbots and assistants are taken into account from the outset. 

 

AI Chatbots

More than just hype

How companies can benefit from systematically managing risks on the way to trustworthy AI chatbots and assistants, plus further information on Trusted AI Chatbots & Assistants:

More about AI Chatbots


About diconium

diconium, a subsidiary of the Volkswagen Group, is the partner for digital growth along the entire value chain. We build strategies, develop commerce platforms, and give you access to data-driven intelligence.  

Rooted in commerce and grown in digital transformation, diconium is just as integrated into STIHL's digital sales as it is in the Volkswagen ID.4's software, Bechtle's brand platform, and Coop's and Trumpf's online shops. This is backed by more than 2,000 employees in interdisciplinary teams of consultants, data experts, and software developers across 15 locations in Europe, North America, and Asia.


Press contact

Barbara Skrodzki
Senior Director Marketing
