In this Use Case, we're building a Copilot Agent that classifies Customer input into dimensions of data you define. This could be intent, sentiment, or categories; it is up to you!
This generic message-classification capability can be used in different scenarios. One such scenario is replacing a traditional, multi-step IVR with a single, natural-language-driven interaction. A traditional IVR can have many steps and could look like this:
- “Welcome, choose your language.”
→ (Customer selects French) - “Tapez 1 si vous êtes client, tapez 2 si vous n'êtes pas encore client chez BEST Insurance.” (“Press 1 if you are a client, press 2 if you are not yet a client of BEST Insurance.”)
→ (Customer presses 2)
⮑ Nimbus recognizes the Caller as the existing client “Sylvia” and retrieves information about her. - “Bienvenue Sylvia! Tapez 1 …..tapez 2” (“Welcome Sylvia! Press 1 ….. press 2”)
→ (Sylvia presses 2) - “Un Agent sera bientôt disponible pour s'occuper de votre déclaration d'accident.” (“An Agent will soon be available to handle your accident claim.”)
→ (Sylvia waits for the Agent to connect.) - Sylvia is connected to the Agent.
- Sylvia can finally express her intent.
Replacing this (rather long) IVR interaction with Copilot can look like this:
- “Welcome, what can I do for you?”
→ (Customer says:) “Bonjour, j'ai eu un accident de voiture et je voudrais savoir si les dommages sont couverts par mon assurance.” (“Hello, I had a car accident and would like to know whether the damage is covered by my insurance.”)
⮑ The Copilot Agent processes the input and extracts:
- Language: French (FR)
- Customer Type: Existing
- Topic: Insurance
- Subtopic: Coverage
- Sentiment: Neutral
- Based on this understanding, the Copilot routes the Customer directly to the appropriate service or provides an immediate, context-aware response – eliminating the need for multiple IVR steps.
- Sylvia is connected to the Agent.
- The Agent is already aware of the topic Sylvia wants to speak about.
As you can see, the Copilot approach is much more time-efficient and Customer-friendly while minimizing errors during the Interaction.
Preconditions
You require tenant administrator or service owner rights to create and update the configuration described below.
- Nimbus: Contact Center and Virtual Users license.
- Microsoft:
  - A “Copilot Studio” license → Also see the Azure Billing notes below.
  - Power Automate and Nimbus Power Automate Connector prerequisites are met.
  - A deployed AI model or a Power Automate Premium license to use AI Builder.
Disclaimer: Support for Azure Billing & Cognitive Services
☝ The usage of Microsoft Azure AI Services for Speech1 and AI2 will cause additional costs outside your Nimbus subscription, which are exclusively determined by Microsoft.
Please note:
- Nimbus Support does not cover pricing discussions or make recommendations on Microsoft Azure Services. → Please inform yourself via the references below on pricing models.
- Nimbus Support cannot provide technical help, insights, or solutions for technical problems related to Azure Cognitive Services.
1 Azure Speech Services: https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/
2 Generative AI Services (Copilot Studio):
Icon Legend
- 💡 = A hint to signal learnings, improvements or useful information in context.
- 🔍 = Info points out essential notes or a related page in context.
- ☝ = Notifies you about fallacies and tricky parts that help avoid problems.
- 🤔 = Asks and answers common questions and troubleshooting points.
- ❌ = Warns you of actions with irreversible / data-destructive consequences.
- ✅ = Instructs you to perform a certain (prerequired) action to complete a related step.
Overview of the Nimbus Workflow and Copilot Activities

Designing the Copilot
Let's start by applying the best practices to our creation process by defining a scenario:
Imagine visitors arriving at a nature reserve. At the entrance, a welcome guide listens to each of them, understanding multiple languages, why they are here, and what they want to see. The guide then calls a ranger to come to the entrance and pick up the visitor for a tour. Once the ranger arrives, the guide hands over a datasheet of the visitor's conversation and interests so that the ranger can personalize the tour.
Defining the Bot's role
Acting as the guide, the Copilot bot will gather all the necessary information and classify it along the relevant dimensions.
Defining the Tasks
- Greet
  - Explain clearly to arriving visitors why you are here and what you need to know to do your job.
- Listen & Understand
  - Classify the visitor's messages into urgency, topic, sentiment, and language.
- Handover
  - when language and topic have been detected (these are sufficient; urgency and sentiment are a plus)
  - when the visitor requests anything else (do not call the ranger, but point to the park's administration office ==> Agent handover)
  - as a fallback, in case of too many retries (instruct the ranger to do the default tour ==> fallback language and topic)
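The handover rules above can be expressed as simple decision logic. The following Python sketch is purely illustrative (it is not Nimbus or Copilot code); the retry limit and fallback values are assumptions:

```python
# Illustrative sketch of the handover rules; the classifier is assumed to
# return None for any dimension it could not detect.
MAX_RETRIES = 3
FALLBACK = {"language": "EN", "topic": "default tour"}

def decide_handover(language, topic, is_other_request, retries):
    """Return an (action, details) pair according to the task rules."""
    if is_other_request:
        # Do not call the ranger; point to the park's administration office.
        return ("agent_handover", "administration office")
    if language and topic:
        # Language and topic are sufficient; urgency/sentiment are a plus.
        return ("ranger_handover", {"language": language, "topic": topic})
    if retries >= MAX_RETRIES:
        # Fallback: instruct the ranger to do the default tour.
        return ("ranger_handover", FALLBACK)
    return ("retry", None)
```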
Access, checkpoints and thinking
The guide doesn't need access to the park, but he needs to know how to call the rangers and hand over the information. The guide does not need to recognize the visitors and doesn't need long-term memory. There is no need for supervisor approval before giving instructions to the rangers.
The message classification task requires analytical thinking, so we'll be using non-generative AI.
Building the Copilot
(Each of the following steps is accompanied by a screenshot in Copilot Studio.)

- Create a custom topic “On message received”. Here is an overview of the topic flow; we will get into the details in the following steps.
- Start with the trigger “Message is received”. 💡 This topic runs for every Customer message.
- 💡 The first Customer message will contain the Nimbus CallId. Within Copilot Studio, check whether this is the case.
- If yes, the bot replies with the welcome message (in our case “Hi, how can I help you?”).
- 💡 In the “other” branch, we know that we receive the second Customer message: the Customer's reaction to the welcome message. ✅ Parallel Power Automate Flow: as we want to classify the first Customer message, we built a Power Automate Flow. It uses a System Message to describe the classification task and returns the results as Output JSON.
- Now that we have classified the message and collected the values in our variables, we can use the Nimbus Connector to update the Nimbus task.
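The classification results come back from the Power Automate Flow as Output JSON and are then written to the Nimbus task. Here is a minimal Python sketch of that data shape; the field names are illustrative assumptions, not the actual flow schema or Nimbus connector contract:

```python
import json

# Hypothetical Output JSON from the classification flow; the real schema
# depends on how you define the flow's response action.
flow_output = json.dumps({
    "Language": "FR",
    "Topic": "Insurance",
    "Urgency": "Low",
    "Sentiment": "Neutral",
})

data = json.loads(flow_output)

# Values as they could be passed to the Nimbus "Update Task" action as
# Custom Context Parameters (illustrative names).
custom_context_parameters = {key: str(value) for key, value in data.items()}
print(custom_context_parameters)
```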
Adding the Copilot to Nimbus
Add the Bot
- In Copilot Studio, go to Settings > Security > Web channel security.
- 🧠 Copy the Secret 1 key for later.
- In Nimbus, go to Admin > Configuration > Bots. Add a new bot with the following configuration:
  - Endpoint: https://directline.botframework.com/v3/directline/conversations
  - API key: 🧠 Your copied secret key
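For reference, the endpoint configured above is the Direct Line 3.0 conversations endpoint. The Python sketch below shows how a Direct Line client would start a conversation using the copied secret; the secret value is a placeholder, and the request is only constructed, not sent:

```python
import urllib.request

# Direct Line 3.0: start a conversation by POSTing to the conversations
# endpoint with the web channel secret as a Bearer token.
DIRECT_LINE_ENDPOINT = "https://directline.botframework.com/v3/directline/conversations"
SECRET = "<your copied Secret 1 key>"  # placeholder, from Copilot Studio

request = urllib.request.Request(
    DIRECT_LINE_ENDPOINT,
    method="POST",
    headers={"Authorization": f"Bearer {SECRET}"},
)

# Sending the request requires a valid secret (and incurs Azure costs):
# with urllib.request.urlopen(request) as response:
#     print(response.read())
```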
Create a Response Template
- In Nimbus, go to Admin > Configuration > Bot Response Templates
- Add a new default response template for Copilot.
Create the Virtual User
- In Nimbus, go to Admin > Configuration > Virtual Users.
- Add a new Virtual User with a Description and choose your bot accordingly.
- On the Initial Message to Bot, the most important part is the System Parameter `CallId`. It corresponds to the Nimbus task ID.
💡 The `CallId` is used to start the Copilot “On message received” trigger mentioned above, which engages the Bot with a Customer. The next reply by the Customer is then handled exclusively by Copilot itself.
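As a rough illustration of how the initial message differs from normal Customer input, the sketch below checks a message for a GUID-like CallId. The GUID format and the message layout are assumptions for illustration only, not the actual Nimbus Initial Message format:

```python
import re

# Assumption: the Nimbus CallId is a GUID-like task ID embedded in the
# initial message text; ordinary Customer replies contain no such ID.
CALLID_PATTERN = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}", re.I
)

def is_initial_nimbus_message(text: str) -> bool:
    """True if the message looks like the Nimbus Initial Message."""
    return bool(CALLID_PATTERN.search(text))
```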

Using the Copilot in Nimbus
The final step is to add your newly defined Virtual User to a workflow.
Use the Virtual User in your Workflow
- Within Nimbus go to Configuration > Workflows.
- Create a new Audio/Video Workflow or add your new “Add Virtual User” activity to an existing workflow.
- Within the activity, configure the Exit with the Regular Expression `^Understood` to ensure that the Virtual User stops after processing the second input. 💡 The wording “Understood” is adjusted in Copilot.
- You can then add an “Announcement” activity that reads the Custom Context Parameters and works with their values according to your needs. Example Custom Context Parameters: `$(CustomContextParameters.<Language/Topic/Urgency/Sentiment>)`
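You can sanity-check the Exit condition locally before configuring it. The sketch below assumes Python-style regex semantics, which may differ in detail from the Nimbus matcher:

```python
import re

# The Exit condition is a Regular Expression matched against the Virtual
# User's reply; "^" anchors the match to the start of the reply.
EXIT_PATTERN = re.compile(r"^Understood")

assert EXIT_PATTERN.search("Understood. Transferring you now.")       # exit fires
assert not EXIT_PATTERN.search("I have not understood your request")  # keeps going
```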
Known Limitations
INC AI and Virtual User Limitations
AI-driven interactions
AI-driven replies are not deterministic and depend highly on what the Customer says. For example, if the Customer says “I have trouble with my Skateboard”, it is not guaranteed that the Bot will use the same word “Skateboard” your exit condition expects, unless specifically handled via Copilot.
🔎Refer to our Best Practices - Virtual Users in Nimbus for some considerations and risks when using Virtual Users instead of Human interaction.
Microsoft Copilot Bot Limitations
- Expect delays: Processing AI answers takes a few seconds: transcription of voice input, AI processing, and transcription back into a spoken reply. Luware is trying to minimize the delay on the Nimbus call infrastructure, but the dependency on external APIs will always incur a delay. The Customer will hear silence during this processing, with no audio feedback or silence detection.
- Ensure you have no ambiguity in your topics. For instance, the word “Service” may be too generic if you want to transfer to different services. Rather use `healthcare|medical|emergency` as identifiers, or use more complex Regular Expressions to identify replies.
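The ambiguity problem can be demonstrated with a quick local test: a generic word matches replies from unrelated services, while an alternation of specific identifiers does not. The sample replies are invented for illustration:

```python
import re

generic = re.compile(r"Service", re.IGNORECASE)
specific = re.compile(r"healthcare|medical|emergency", re.IGNORECASE)

replies = [
    "Transferring you to our medical service.",
    "Our customer service will call you back.",
]

# The generic pattern matches both replies, so it cannot distinguish the
# target services; the alternation only matches the first reply.
assert all(generic.search(r) for r in replies)
assert specific.search(replies[0]) and not specific.search(replies[1])
```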
Nimbus Technical Limitations
The current Nimbus implementation with AI-driven bots is subject to the following limitations:
- Supported Modalities: Virtual Users (Bots) are currently available for Audio/Video modality tasks.
- 3rd-Party API Support: Nimbus is built to flexibly support “BYO” (bring your own) customer bot configurations and APIs. The currently supported bot integration is the M365 Copilot Direct Line 3.0 API.
- Authentication: The Copilot secret needs to be copy-pasted into Nimbus. Further authentication types are planned for the near future.
- Generative AI: Copilot events (Reviewing, Thinking) during Orchestration are currently being ignored.