Bots

Connecting to external AI services to act as Virtual Users in Nimbus

Bots in Nimbus leverage external AI services to act as Virtual Users. A bot can fulfill specific tasks, such as:

  • Handling specific customer requests
  • Looking up specific data

Configuring Bots

The bot configuration in Nimbus is built to be as flexible as possible, so common concepts can be re-used. Each Bot configuration consists of the following properties:

Field Description
Name

Name of the Bot, as it appears in Nimbus lists and selections.

Organization Unit

Determines where the Bot will be available for selection within Nimbus UIs.

Note: Shared bot resource utilization

💡 Sharing your bot: You can design a bot as “general purpose”, making its configuration widely available on a high-level OU. This allows multiple services and their Virtual Users to re-use this bot configuration and underlying APIs.


☝ Before sharing:

  • Note that each service may access the API key associated with this bot, incurring costs¹ and potentially being impacted by any (later) configuration changes to the bot.
  • Keep in mind that wide-purpose bot access to resources and capabilities may also result in too much data being disclosed to the customer (e.g. by changed capabilities, instructions, topic covered, attached sources, permissions, etc.).
 
Type¹

The type of your Bot. Directly affects the underlying API, LLM and capabilities. Refer to the following table for details:


Comparison: Types of Bots for Nimbus Virtual Users

M365 Copilot – Direct Line 3.0

Primary Use Case: Enterprise voice interactions with Microsoft 365 Copilot, enabling transcription of spoken conversations grounded in M365 work data and supporting natural interruption.

Limitations:

  • Intended for generalized Microsoft 365 and Copilot Studio scenarios, rather than custom external voice agents.
  • Subject to Microsoft licensing and service capacity limits.

Integration Effort: Medium – Native to the Microsoft 365 ecosystem. Requires configuration within Nimbus, plus integration with the Azure Copilot Studio API to handle topics.

🔎 See: Use Case - Setting up a Nimbus Virtual User using Copilot

Nimbus AI Services – Audio Intent Analyzer

Primary Use Case: Low-latency, real-time audio stream analysis. Contact-center audio analysis focused on speech-to-text, intent, language, and sentiment detection for intelligent routing and IVR replacement.

Limitations:

  • Primarily for intent classification rather than full speech-to-speech conversation.

Integration Effort: Low – Configuration stays within Nimbus; parametrization is done directly in workflows as part of the “Virtual User” activity.

🔎 See: Use Case - Setting up a Nimbus Virtual User for Intent Analysis

Azure OpenAI – Audio GPT Realtime¹

Primary Use Case: Low-latency, conversational AI over a real-time audio stream. Used as speech-in / speech-out conversational AI for building custom voice agents that can cover a wide range of roles.

Limitations:

  • Not end-user ready out of the box.
  • Needs custom client and backend integration, with regional availability determined by Microsoft¹.
  • Subject to Microsoft licensing and service capacity limits.

Integration Effort: Medium – Requires configuration within Nimbus, plus integration with the Azure OpenAI GPT Realtime API. Allows customers to bring their own LLM and 3rd-party integrations.

🔎 See: Use Case - Setting up a Nimbus Virtual User using OpenAI GPT Realtime

Notes

¹ Due to Microsoft Azure AI Foundry GPT-Realtime availability in limited regions, data processed by the Nimbus Virtual User GPT-Realtime integration will temporarily leave your regional boundary:

  • For the DE01, DE02, CH01, CH02, UK01, EU01, and AU01 clusters, computation will be performed in the Sweden Central region. No persistent storage of your data occurs as part of this process.
  • US01 cluster customers' data will not leave the regional boundary.
 

 

✅ The following fields are shown only when your selected Bot Type requires an API Key and Endpoint. 

Field Description
API Key

Key for your external API access. 

💡 The bot uses this key to authenticate and access API features. Specific permissions are granted in the 3rd-party UI.

Endpoint

The Endpoint of the API your bot should use. 

💡 This is the API address where the bot sends requests and receives responses. It depends on what you want the bot to do (e.g., send a message, get weather data, etc.).
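To illustrate how the API Key and Endpoint work together, the sketch below builds (but does not send) an authenticated HTTP request using Python's standard library. The endpoint URL, key value, and bearer-token header are hypothetical placeholders, not Nimbus specifics; consult your API provider's documentation for the exact authentication scheme it expects.

```python
import json
from urllib import request

# Hypothetical placeholders -- substitute your bot's configured values.
ENDPOINT = "https://api.example.com/v1/messages"
API_KEY = "replace-with-your-api-key"

def build_bot_request(payload: dict) -> request.Request:
    """Build an authenticated request to the bot endpoint without sending it.

    Many bot APIs expect the key as a bearer token in the Authorization
    header; others use a custom header -- check the 3rd-party documentation.
    """
    return request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_bot_request({"text": "What is the weather today?"})
print(req.get_full_url())               # the configured Endpoint
print(req.get_header("Authorization"))  # key-based authentication header
```

Sending the request (e.g. with `urllib.request.urlopen`) would only succeed against a real endpoint with a valid key.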


Currently available:

M365 Copilot Direct Line 3.0 (Default)² | MSFT Key Concepts Reference

Copilot – Direct Line API Endpoints

General¹ https://directline.botframework.com/v3/directline/conversations
Europe https://europe.directline.botframework.com/v3/directline/conversations/
Microsoft Copilot Direct Line API authentication endpoints

¹ ☝ A request might fail if you use the global base URI for a regional bot, as some requests could go beyond geographical boundaries. The URLs are maintained by Microsoft and retrieved from the Learn | Direct Line documentation. We advise testing performance and stability.
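To keep a regional bot's traffic on the regional base URI, a small helper can select the endpoint explicitly instead of defaulting to the global one. This is an illustrative sketch: the function name and region labels are hypothetical, while the URLs are the Microsoft-maintained ones listed above.

```python
# Direct Line conversation endpoints from the table above (maintained by Microsoft).
DIRECT_LINE_ENDPOINTS = {
    "general": "https://directline.botframework.com/v3/directline/conversations",
    "europe": "https://europe.directline.botframework.com/v3/directline/conversations/",
}

def directline_endpoint(region: str = "general") -> str:
    """Return the Direct Line conversations endpoint for the given region.

    Prefer the regional endpoint for a regional bot: the global base URI
    may route requests beyond geographical boundaries and can fail.
    """
    key = region.lower()
    if key not in DIRECT_LINE_ENDPOINTS:
        raise ValueError(f"No known Direct Line endpoint for region {region!r}")
    return DIRECT_LINE_ENDPOINTS[key]

print(directline_endpoint("Europe"))
```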

🧠Don't forget: Note down the endpoint for later use, e.g. for the Bots configuration of your Nimbus Virtual Users.



Disclaimer: Support for Azure Billing & Cognitive Services 

☝ The usage of Microsoft Azure AI Services for Speech¹ and AI will cause additional costs outside your Nimbus subscription, which are determined exclusively by Microsoft.

Please note: 

  • Nimbus Support does not cover pricing discussions or make recommendations on Microsoft Azure Services. → Please inform yourself about pricing models via the references below.
  • Nimbus Support cannot provide any technical help, insights, or solutions for Azure Cognitive Services-related technical problems.

1 Azure Speech Services: https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/

2 Generative AI Services (Copilot Studio):

 

Known Limitations


General note on AI-driven interactions

AI-driven replies are not deterministic and depend highly on what the Customer says. For example, if the Customer says “I have trouble with my internet”, the Bot will not necessarily associate “Router, Modem” with your workflow routing exit unless this is specifically handled in your Virtual User integration. In this example, the AI instructions should also cover alternative wordings such as “Router, Internet”, so they are handled in topics accordingly.

🔎Refer to our Best Practices - Virtual Users in Nimbus for some considerations and risks when using Virtual Users instead of Human interaction.

 

Microsoft Copilot Limitations

  • Expect processing delays: Processing AI answers takes a few seconds for voice-to-text transcription, followed by AI processing and transcription back into a voiced response. Luware tries to minimize the delay within the Nimbus call infrastructure, but the dependency on external APIs will always incur a delay. The Customer will hear silence during this processing, with no audio feedback or silence detection.
  • Ensure there is no ambiguity in your topics. For instance, the word “Service” may be too generic if you want to transfer to different services. Instead, use identifiers such as healthcare|medical|emergency, or more complex Regular Expressions to identify replies.

Nimbus Audio Intent Analyzer

Azure OpenAI – GPT Realtime limitations

General Virtual User limitations

The current Nimbus implementation of AI-driven bots is subject to the following limitations:

  • Supported Modalities: Virtual Users (Bots) are currently available for Audio/Video modality tasks.
  • Virtual User Reporting: Sessions involving Virtual Users are not reflected as dedicated User Sessions. Virtual User session reporting is planned for later this year.
