Use Case - Building a Classification Agent

Using Copilot to quickly identify Customer intent and route accordingly

In this Use Case, we're building a Copilot Agent that classifies Customer input into dimensions of data you define. This could be intent, sentiment, or categories. It is up to you!

This generic capability of classifying messages can be used in different scenarios. One such scenario is replacing a traditional, multi-step IVR with a single, natural language-driven interaction. A traditional IVR can involve many steps and could look like:

  1. Welcome, choose your language.
    → (Customer selects French)
  2. Tapez 1 si vous êtes client, tapez 2 si vous n'êtes pas encore client chez BEST Insurance. (Press 1 if you are a client, press 2 if you are not yet a client of BEST Insurance.)
    → (Customer presses 2) 
    ⮑ Nimbus recognizes the Caller as the existing client “Sylvia” and retrieves information about her.
  3. Bienvenue Sylvia! Tapez 1 ….. tapez 2 (Welcome, Sylvia! Press 1 ….. press 2)
    → (Sylvia presses 2)
  4. “Un Agent sera bientôt disponible pour s'occuper de votre déclaration d'accident.” (An Agent will soon be available to handle your accident claim.)
    → (Sylvia waits for the Agent to connect.)
  5. Sylvia is connected to the Agent.
  6. Sylvia can finally express her intent.

Replacing this (rather long) IVR interaction with Copilot can look like this:

  1. “Welcome, what can I do for you?”
    (Customer says:) “Bonjour, j'ai eu un accident de voiture et je voudrais savoir si les dommages sont couverts par mon assurance.” (Hello, I had a car accident and I would like to know whether the damage is covered by my insurance.)
    ⮑ The Copilot Agent processes the input and extracts:
    1. Language: French (FR)
    2. Customer Type: Existing
    3. Topic: Insurance
    4. Subtopic: Coverage
    5. Sentiment: Neutral
  2. Based on this understanding, the Copilot routes the Customer directly to the appropriate service or provides an immediate, context-aware response – eliminating the need for multiple IVR steps.
  3. Sylvia is connected to the Agent. 
  4. The Agent is already aware of the topic Sylvia wants to speak about.

As you can see, the Copilot approach is much more time-efficient and Customer-friendly, while minimizing errors during the interaction.

Preconditions

You require tenant administrator or service owner rights to create and update the configuration items below.

  • Nimbus: Contact Center and Virtual Users license.
  • Microsoft: 
    • A “Copilot Studio” License → Also see Azure Billing notes below.
    • Power Automate and Nimbus Power Automate Connector‍ prerequisites are met.
      • A deployed AI model or Power Automate Premium license to use AI Builder.
 


Disclaimer: Support for Azure Billing & Cognitive Services 

☝ The usage of Microsoft Azure AI Services for Speech1 and Generative AI2 will cause additional costs outside your Nimbus subscription, which are exclusively determined by Microsoft.

Please note: 

  • Nimbus Support does not cover pricing discussions or make recommendations on Microsoft Azure Services. → Please inform yourself about pricing models via the references below.
  • Nimbus Support cannot provide any technical help, insights or solutions on Azure Cognitive Service related technical problems.

1 Azure Speech Services: https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/

2 Generative AI Services (Copilot Studio):

 


Icon Legend

💡 = A hint to signal learnings, improvements or useful information in context.
🔍 = Info points out essential notes or related pages in context.
☝ = Notifies you about fallacies and tricky parts that help avoid problems.
🤔 = Asks and answers common questions and troubleshooting points.
❌ = Warns you of actions with irreversible / data-destructive consequences.
✅ = Instructs you to perform a certain (prerequired) action to complete a related step.
 
 

Overview of the Nimbus Workflow and Copilot Activities

 
 

Designing the Copilot

Let's start by applying the best practices to our creation process by defining a scenario:

Imagine visitors to a nature reserve. At the entrance, a welcome guide listens to each one of them, understanding multiple languages, to learn why they are here and what they want to see. The guide then calls a ranger to come to the entrance and pick up the visitor for a tour. Once the ranger arrives, the guide hands over a datasheet of the visitor's conversation and interests so that the ranger can personalize the tour.

Defining the Bot's role

Acting as the Guide, the Copilot bot will gather all the necessary information and classify the relevant parts of it.

Defining the Tasks

  • Greet
    • Explain clearly to arriving visitors why you are there and what you need to know to do your job.
  • Listen & Understand
    • Classify the visitor's messages into urgency, topic, sentiment, language.
  • Handover
    • when language and topic have been detected (these are sufficient; urgency and sentiment are a plus)
    • when the visitor asks for anything else (do not call the ranger, but point to the park's administration office ==> Agent handover)
    • as a fallback, in case of too many retries (instruct the ranger to do the default tour ==> fallback language and topic)
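These handover rules amount to a small decision procedure. Below is a minimal Python sketch of that logic; the function name, the result dictionary, and the retry limit are all illustrative assumptions, not Copilot Studio APIs:

```python
# Illustrative sketch of the guide's handover rules; all names are hypothetical.
MAX_RETRIES = 3
FALLBACK = {"language": "en", "topic": "default tour"}  # assumed fallback values

def decide_handover(result: dict, retries: int) -> dict:
    """Decide what to do with a classification result.
    result may contain: language, topic, urgency, sentiment."""
    # Fallback: too many retries -> instruct the ranger to do the default tour.
    if retries >= MAX_RETRIES:
        return {"action": "handover", **FALLBACK}
    # Out-of-scope request -> point to the administration office (Agent handover).
    if result.get("topic") == "other":
        return {"action": "agent_handover"}
    # Language and topic are sufficient; urgency and sentiment are a plus.
    if result.get("language") and result.get("topic"):
        return {"action": "handover",
                "language": result["language"],
                "topic": result["topic"],
                "urgency": result.get("urgency"),
                "sentiment": result.get("sentiment")}
    # Otherwise keep listening and retry.
    return {"action": "retry"}
```

The ordering matters: the retry fallback is checked first so that a stuck conversation always terminates.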

Access, checkpoints and thinking

The guide doesn't need access to the park, but he needs to know how to call the rangers and hand over the information. The guide does not need to recognize the visitors and doesn't need long-term memory. There is no need for supervisor approval before giving instructions to the rangers.

The message classification task requires analytical thinking, so we'll be using non-generative AI.

Building the Copilot

  1. Head to Copilot Studio and create a new (blank) Agent.
  2. Go to the system topics:
    1. Leave the “On Error” topic on.
    2. Toggle everything else off.
  3. Change the “On Error” topic: output a Message with the System.Error.Message in the text.

Create a custom topic “On message received”. Here is an overview of the topic flow. We will get into the details in the following steps.

 

 

Start with the trigger “Message is received”.

💡This topic runs for every Customer message.

💡The first Customer message will contain the Nimbus TaskID. By handing it over in the “Initial Message” of the Virtual User, we can store the TaskId in a variable so that we can update the task when needed.


Within Copilot Studio:

  1. Add a Condition
  2. Fill in the global variable Global.TaskId and check if it is Blank.

If yes,

  1. Set its value to the Activity.Text.
    💡Activity.Text holds the complete Customer message string.
  2. Follow up with your welcome message in the same branch.

💡In the “other” branch, we know that we have received the second Customer message. It will be the reaction to the welcome message, in our case “Hi, how can I help you?”
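The branching above boils down to a simple handshake: the first message carries the Nimbus TaskId, and every later message is a real Customer utterance. A minimal Python model of that logic (class and variable names are illustrative, not Copilot Studio identifiers):

```python
# Illustrative model of the "On message received" topic logic.
class Conversation:
    def __init__(self):
        self.task_id = None  # models Global.TaskId, blank at start

    def on_message_received(self, activity_text: str) -> str:
        if self.task_id is None:               # Condition: IsBlank(Global.TaskId)?
            self.task_id = activity_text       # first message = Nimbus TaskId
            return "Hi, how can I help you?"   # welcome message branch
        # "Other" branch: this is the Customer's actual request.
        return f"classify:{activity_text}"
```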


✅Parallel Power Automate Flow: As we want to classify the first Customer message, we build a Power Automate Flow:

  1. Add Tool > Flow. 
    💡Alternatively: You can build the Power Automate Flow directly in Copilot Studio, or prepare it in Power Automate beforehand; it will then be populated in Copilot Studio.
  2. In Copilot Studio, you need to map the outcomes to variables to collect them.
Overview of the flow

 

  • In the flow, we call an AI model using a system message and the customer input.  → See below.
  • The model returns JSON which we need to parse and then return to Copilot.

Output JSON: 

{
  "language": "en",
  "urgency": "high",
  "sentiment": "negative",
  "topic": "technical support"
}
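The parse step can be sketched in Python. The keys match the JSON schema above; the helper name and the fallback defaults are our own illustration, not part of the flow:

```python
import json

EXPECTED_KEYS = ("language", "urgency", "sentiment", "topic")

def parse_classification(raw: str) -> dict:
    """Parse the model's JSON reply; fall back to safe defaults
    when the reply is not valid JSON or a key is missing."""
    defaults = {"language": "en", "urgency": "low",
                "sentiment": "neutral", "topic": "general inquiry"}
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return defaults
    return {k: data.get(k, defaults[k]) for k in EXPECTED_KEYS}
```

Guarding against malformed model output like this keeps the flow from failing when the model deviates from the schema.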

System Message

Classify customer messages into ISO 639-1 language code, urgency, sentiment, and topic. Output the result strictly in JSON format according to the provided schema.

# Criteria for Classification

- **Language**: Classify the language of the input message using the ISO 639-1 language code (e.g., "en" for English, "es" for Spanish).
- **Urgency**: Assess how urgent the message is. Options include "low", "medium", "high".
- **Sentiment**: Analyze the sentiment of the message. Options include "positive", "neutral", "negative".
- **Topic**: Identify the main topic or category of the message (e.g., "billing", "technical support", "general inquiry"). Be concise but descriptive.

# Output Format

Respond in JSON format that adheres to the following schema:
{
  "language": "string (ISO 639-1 code)",
  "urgency": "string ('low', 'medium', 'high')",
  "sentiment": "string ('positive', 'neutral', 'negative')",
  "topic": "string"
}

# Additional Notes

- When processing input, replace any dynamic placeholders with the actual content if provided (e.g., if the placeholder represents customer message text).
- Ensure the text is fully interpreted, and the output adheres to the specified format.

# Example

**Input:**
Classify the following message:

I have a problem with my login.

Respond in JSON:
{
  "language": "...",
  "urgency": "...",
  "sentiment": "...",
  "topic": "..."
}

**Output Example:**
{
  "language": "en",
  "urgency": "high",
  "sentiment": "negative",
  "topic": "technical support"
}
 
 

Now that we have classified the message and collected the values in our variables, we can use the Nimbus Connector to update the Nimbus task.

  1. Add Tool > Connector.
  2. Choose Luware Nimbus Update Task.
    → Also see: Flow Actions > Update Task.
  3. Map the Nimbus TaskId to the Global.TaskId from earlier…
  4. …and the Custom Context Parameters (System Fields and Parameters).
  5. Use a Power FX formula to create the required Table object:
Table(
  { name: "Sentiment", value: Global.sentiment },
  { name: "Language", value: Global.Language },
  { name: "Topic", value: Global.Topic },
  { name: "Urgency", value: Global.Urgency },
  { name: "Message", value: System.Activity.Text }
)
  6. Now, add a message with the Exit criteria response, so the Nimbus workflow activity will exit once the message is classified.
  7. Deploy your Copilot Agent.
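For reference, the Table() formula above produces a list of name/value records. The same payload sketched in Python (field names mirror the Power FX formula; the helper function is our own illustration, not a Nimbus API):

```python
def build_context_parameters(sentiment, language, topic, urgency, message):
    """Build the Custom Context Parameters payload for Update Task:
    a list of {name, value} records, mirroring the Power FX Table()."""
    return [
        {"name": "Sentiment", "value": sentiment},
        {"name": "Language", "value": language},
        {"name": "Topic", "value": topic},
        {"name": "Urgency", "value": urgency},
        {"name": "Message", "value": message},
    ]
```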

Adding the Copilot to Nimbus

Add the Bot

  1. In Copilot Studio, go to Settings > Security > Web channel security 
  2. 🧠 Copy the Secret 1 key for later.
  3. In Nimbus, go to Admin > Configuration > Bots. Add a new bot with the following configuration: 
    1. Endpoint: https://directline.botframework.com/v3/directline/conversations
    2. API key: 🧠 Your copied secret key
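To sanity-check the endpoint and secret outside Nimbus, you can start a Direct Line conversation yourself: Direct Line 3.0 expects the secret as a Bearer token on that same endpoint. The sketch below only builds (and does not send) the request; MY_SECRET is a placeholder for your copied key:

```python
# Build the Direct Line 3.0 "start conversation" request (no network call).
DIRECTLINE_URL = "https://directline.botframework.com/v3/directline/conversations"

def build_start_conversation_request(secret: str) -> dict:
    """Assemble the HTTP request that opens a Direct Line conversation."""
    return {
        "method": "POST",
        "url": DIRECTLINE_URL,
        "headers": {"Authorization": f"Bearer {secret}"},
    }

req = build_start_conversation_request("MY_SECRET")
```

Sending this request (e.g. with curl or urllib) should return a JSON body with a conversationId and a token if the secret is valid.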

Create a Response Template

  1. In Nimbus, go to Admin > Configuration > Bot Response Templates 
  2. Add a new default response template for Copilot.

Create the Virtual User

  1. In Nimbus, go to Admin > Configuration > Virtual Users.
  2. Add a new Virtual User with a Description and choose your bot accordingly.
  3. In the Initial Message to Bot, the most important part is the System Parameter CallId. It corresponds to the Nimbus TaskID.
    💡The CallId is used to start the Copilot “On message received” Trigger mentioned above, which will engage the Bot with a Customer. The next reply by the Customer is then handled exclusively by Copilot itself.

Using the Copilot in Nimbus

The final step is to add your newly defined Virtual User to a workflow.

Use the Virtual User in your Workflow

  1. Within Nimbus go to Configuration > Workflows.
  2. Create a new Audio/Video Workflow or add a new “Add Virtual User” activity to an existing workflow.
    1. Within the activity, you need to configure the Exit with the Regular Expression ^Understood to ensure that the Virtual User stops after processing the second input. 
      💡The wording “Understood” is adjusted in Copilot.
  3. You can then add an “Announcement” activity that reads the Custom Context Parameters, so you can listen to their values and work with them according to your needs.
    1. Example Custom Parameters:
      $(CustomContextParameters.<Language/Topic/Urgency/Sentiment>)
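The Exit check mentioned in step 2 is a plain regular-expression match against the Bot's reply; ^Understood anchors the match at the start of the string. A quick illustration of that matching behavior:

```python
import re

# The same pattern configured as Exit criterion in the workflow activity.
EXIT_PATTERN = re.compile(r"^Understood")

def should_exit(bot_reply: str) -> bool:
    """True if the Virtual User activity should exit after this reply."""
    return bool(EXIT_PATTERN.search(bot_reply))
```

Because of the ^ anchor, a reply that merely contains the word later (e.g. “I have not understood…”) does not trigger the exit.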

Known Limitations


AI-driven interactions

AI-driven replies are not deterministic and depend highly on what the Customer is saying. For example, if the Customer says “I have trouble with my Skateboard”, it is not guaranteed that the Bot will use the same word “Skateboard” in your exit condition unless specifically handled via Copilot.

🔎Refer to our Best Practices - Virtual Users in Nimbus for some considerations and risks when using Virtual Users instead of Human interaction.

Microsoft Copilot Bot Limitations

  • Expect delays: Processing AI answers takes a few seconds for transcription of voice input, AI processing, and transcription back into a spoken reply. Luware tries to minimize the delay on the Nimbus call infrastructure, but the dependency on external APIs will always incur a delay. The Customer will hear silence during this processing, with no audio feedback / silence detection.
  • Ensure you have no ambiguity in your topics. For instance, the word “Service” may be too generic if you want to transfer to different services. Rather use healthcare|medical|emergency as identifiers or use more complex Regular Expressions to identify replies.
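For example, the alternation healthcare|medical|emergency matches exactly one of three unambiguous identifiers, while a bare “Service” would fire for every reply containing that word. A small illustration of the suggested pattern:

```python
import re

# Alternation of unambiguous identifiers, as suggested above.
pattern = re.compile(r"healthcare|medical|emergency", re.IGNORECASE)

def matched_identifier(reply: str):
    """Return the matched identifier (lowercased), or None if absent."""
    m = pattern.search(reply)
    return m.group(0).lower() if m else None
```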

Nimbus Technical Limitations

The current Nimbus implementation with AI-driven bots is subject to the following limitations:

  • Supported Modalities: Virtual Users (Bots) are currently available for Audio/Video modality tasks.
  • 3rd-Party API Support: Nimbus is built flexibly to support “BYO” (bring your own) customer bot configurations and APIs. The currently supported bot integration is the M365 Copilot Direct Line 3.0 API.
  • Authentication: The Copilot Secret needs to be copy-pasted into Nimbus. Further authentication types are planned to be supported in the near future.
  • Generative AI: Copilot events (Reviewing, Thinking) during Orchestration are currently being ignored.
