Use Case - Building a Callback Assistant in Copilot

Using Copilot to initiate, manage and schedule Callbacks in Nimbus

In this Use Case, we’re building a Copilot Agent to manage callback requests efficiently. The Agent automatically converts any callback request into a task. It supports outbound campaigns in Nimbus and can create or schedule a callback task whenever a customer asks to be contacted. Using natural language understanding, the Agent detects and extracts the preferred date and time from the customer's message.

Preconditions

To implement this Use Case you need:

  • For Nimbus: 
  • For Microsoft: 
    • A “Copilot Studio” Licence → Also see Azure Billing notes below.
  • Power Automate licensing requirements and all other Nimbus Power Automate Connector preconditions must be met. 
      • A deployed AI model or Power Automate Premium license to use AI Builder.
      • Tenant Admin rights are required to implement all actions and connect to data sources required for your Services.
 


Disclaimer: Support for Azure Billing & Cognitive Services 

☝ The usage of Microsoft Azure AI Services for Speech¹ and AI² will cause additional costs outside your Nimbus subscription, which are exclusively determined by Microsoft.

Please note: 

  • Nimbus Support does not cover pricing discussions or make recommendations on Microsoft Azure Services. → Please inform yourself via the references below on pricing models.
  • Nimbus Support cannot provide any technical help, insights, or solutions for Azure Cognitive Services-related technical problems.

¹ Azure Speech Services: https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/

² Generative AI Services (Copilot Studio):

 

Icon Legend

💡 = A hint to signal learnings, improvements or useful information in context.
🔍 = Points out essential notes or related pages in context.
☝ = Notifies you about fallacies and tricky parts that help avoid problems.
🤔 = Asks and answers common questions and troubleshooting points.
❌ = Warns you of actions with irreversible / data-destructive consequences.
✅ = Instructs you to perform a certain (prerequired) action to complete a related step.
 
 

Overview of the Callback Agent

 
 

Designing the Copilot

Let's start by applying the best practices to our creation process by defining a scenario:

This bot is designed to transform any callback-related request into a manageable and trackable task within Nimbus. It is especially useful when your customers call from various locations and time zones, and you need a reliable way to handle their requests without losing oversight.

The bot uses advanced natural language understanding to interpret callback requests, detect and validate date and time information, and automatically create or schedule callback tasks. Whether the request is part of an inbound inquiry or an outbound campaign, the bot ensures every callback is captured and processed accurately.

Operating within the Virtual User (VU), the bot oversees all planned and scheduled tasks, allowing it to plan, initiate, or reschedule callbacks as needed. This ensures seamless coordination across different teams and time zones.

Use this bot when you want to automate and control callback handling, eliminate manual tracking, and ensure no customer request falls through the cracks.

Role

The Callback Assistant is a Virtual User designed to manage and automate customer callback requests. Think of it as a reliable team member whose job is to ensure no callback request is missed, misunderstood, or delayed.

Its responsibilities include:

  • Understanding when a customer wants to be called back.
  • Extracting relevant information like phone number, name, and preferred time.
  • Checking if a callback is already scheduled.
  • Scheduling or initiating callbacks based on urgency.
  • Informing customers about past or upcoming callbacks.
Tasks

This Virtual User operates with limited autonomy: it can read and write to a specific SharePoint list, initiate calls via Nimbus, and interact with customers—but only within the scope of its defined tasks.


Each task is implemented as a tool in Copilot Studio, triggered by specific customer intents:

  • Check for Existing Callbacks:
    Trigger: “I’m waiting for a call back!” or “When will you call me?”
    Action: Search the SharePoint list for entries matching the customer’s phone number.
    • If a future callback exists → Inform the customer of the scheduled time (adjusted to UTC+2).
    • If a past callback failed → Notify the customer and offer to reschedule.
  • Schedule a Callback:
    Trigger: “Can you call me back?” or “Schedule a callback.”
    Action: Ask for the preferred time using natural language understanding (e.g., “tomorrow morning”).
    • Customers call in from different locations and time zones. Detect date and time using natural language understanding. 
    • Validate that the time is in the future.
    • Create a new entry in the callback list using the customer’s name and number.
  • Start an Immediate Callback: 
    Trigger: “Call me back as soon as you can!” or “I need help now.”
    Action: Start an immediate call via Nimbus using the stored phone number.
  • Handover: Exit the conversation after an immediate callback has been started or after a callback has been scheduled. 

Each tool is designed with clear entry conditions and limited permissions, ensuring the bot only acts when appropriate and never oversteps its role.

Access, checkpoints and thinking

The Callback Assistant only accesses:

  • The SharePoint list containing callback entries.
  • The customer’s phone number and name (passed from Nimbus).
  • The ability to initiate calls via Nimbus.

It does not access CRM systems, customer history, or sensitive data unless explicitly required and approved. This ensures data minimization and responsible automation.

To maintain control and oversight, callback scheduling could include a confirmation step before the callback is initiated. This could be handled by the tool that feeds callbacks from the SharePoint list into Nimbus: it would be sufficient to mark an entry as "created by Copilot" and check this column when feeding the callbacks in. We do not implement this here, but you may consider it when building such an agent.
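
💡 As a minimal sketch of such a checkpoint, the feeding flow could restrict its filter query to confirmed entries. The column names CreatedByCopilot and Confirmed are assumptions for illustration, not part of the list design described here:

// Hypothetical sketch: OData filter string (built like the Power Fx filter further
// below) that only picks up entries an agent has explicitly confirmed.
// "CreatedByCopilot" and "Confirmed" are assumed Yes/No columns.
"CreatedByCopilot eq 1 and Confirmed eq 1"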


🔎 All actions are logged for transparency and auditability using Azure Application Insights. 
Capture telemetry with Application Insights - Microsoft Copilot Studio | Microsoft Learn

Building the Copilot Agent

  1. Head to Copilot Studio
  2. Create a new (blank) Agent in Copilot Studio
  3. Enable Orchestration

Create a custom topic “Process JSON Input”. The goal is to hand over the customer's phone number and name.


JSON to process

{
"Phone": "+$(Caller.TelNumber)",
"Name": "$(Customer.DisplayName)"
}

You can follow the instructions in Luware Nimbus - Copilot - Initial Data Handover from Nimbus to Copilot to build the topic. 
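
💡 As a minimal sketch, assuming the handover JSON arrives in the system variable System.Activity.Text and that you store the values in the global variables Global.PhoneNumber and Global.CustomerName, the value formulas of two “Set a variable value” nodes could look like this:

// Hypothetical sketch: Power Fx value formula for Global.PhoneNumber.
// ParseJSON returns an untyped object; Text() converts the field to a string.
Text(ParseJSON(System.Activity.Text).Phone)

// Hypothetical sketch: Power Fx value formula for Global.CustomerName.
Text(ParseJSON(System.Activity.Text).Name)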

 

 

At the end of the “Process JSON Input” topic, redirect to a new topic called “Greeting”.

Customize the custom topic “Greeting”.

  1. Add a condition to check if the CustomerName is empty
    1. When a name is found (variable not blank), incorporate it into the message. 
    2. If no customer name is found, return a generic greeting message.

💡The goal is to greet a customer by name, whenever the Global.CustomerName variable has been identified and retrieved from Nimbus.

🔎 This variable maps to the System Fields > Call data parameter $(Customer.DisplayName), which is only filled when a customer name has been looked up via the Nimbus Power Automate Connector, e.g. by implementing one of the Power Automate Use Cases (CRM, Excel list, phone directory, etc.).
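
💡 A minimal sketch of this greeting, assuming the message text is produced by a single Power Fx formula (your topic may instead use a condition node with two separate messages):

// Hypothetical sketch: greet by name only when Nimbus delivered one.
If(
    IsBlank(Global.CustomerName),
    "Hello! How can I help you today?",
    "Hello " & Global.CustomerName & "! How can I help you today?"
)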

 
  1. Create a custom topic: “Return Current Phone Number Memory”
    💡This allows the customer to correct or specify the phone number they want to be called back on.
  2. Add a Message to return the stored phone number.
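
💡 A minimal sketch of such a message, built as a Power Fx formula (in practice you can also insert the variable directly into a plain message node):

// Hypothetical sketch: confirm the stored number so the customer can correct it.
"The number we currently have on file for your callback is " & Global.PhoneNumber & "."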

 

 

✅From this point on, the base functionality is done. You can adjust all other topics according to your needs.

Build the tools

✅As this Copilot has “Orchestration” enabled, the logic is built into the tools. Let's go ahead and create the tools.

Start an immediate call back

Description

Start a call back immediately. Only run this tool when there is urgency to start a call back right away rather than scheduling one for a certain time. Use the global phone number variable as input, or ask the user for input if the variable has no value.
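
💡 A minimal sketch of the input mapping, assuming the tool input is filled with a Power Fx formula and that a question node stores a freshly provided number in a hypothetical Topic.AskedPhoneNumber variable:

// Hypothetical sketch: use the handover number if present, otherwise the number
// the customer just provided in the conversation.
If(IsBlank(Global.PhoneNumber), Topic.AskedPhoneNumber, Global.PhoneNumber)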

Find call back entries for a given phone number

Description

Use this tool to retrieve the OutboundTaskState value and respond to the question:
“When will you call me back?”

This tool checks the callback status using the OutboundTaskState field. Interpret values as follows:

  • Scheduled: Callback is planned → respond with “You have a callback scheduled. We’ll contact you soon.”
  • In Progress: Callback is ongoing → respond with “We’re currently trying to reach you.”
  • Destination Accepted: Callback was successful → respond with “We’ve already called you back successfully.”
  • Max Retries Reached, Removed, or Failed: Callback was unsuccessful → respond with “We tried calling but couldn’t reach you.”
  • Unknown or missing state → respond with “Callback status is unclear. Please check again later.”
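
💡 A minimal sketch of that mapping as a single Power Fx formula, assuming the looked-up value is available in a hypothetical Topic.OutboundTaskState variable (with orchestration enabled you may instead let the agent phrase the reply from the description above):

// Hypothetical sketch: map OutboundTaskState to the reply wording listed above.
Switch(
    Topic.OutboundTaskState,
    "Scheduled", "You have a callback scheduled. We'll contact you soon.",
    "In Progress", "We're currently trying to reach you.",
    "Destination Accepted", "We've already called you back successfully.",
    "Max Retries Reached", "We tried calling but couldn't reach you.",
    "Removed", "We tried calling but couldn't reach you.",
    "Failed", "We tried calling but couldn't reach you.",
    "Callback status is unclear. Please check again later."
)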

Inputs

Select your SharePoint list and set the filter query to custom with the following Power Fx formula:

"Number eq '" & Global.PhoneNumber & "'"

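💡 The formula above evaluates to a plain OData filter string, e.g. Number eq '+41791234567'. If the stored number could ever contain a single quote, a slightly more defensive variant (same Number column) would escape it:

// Hypothetical sketch: escape single quotes for OData ('' is the literal-quote escape).
"Number eq '" & Substitute(Global.PhoneNumber, "'", "''") & "'"
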
Add a call back entry to the scheduled call back list

Description

Use this tool to schedule a new callback when the user says things like: “Call me back”, “Schedule a callback”, or “I didn’t get a callback.”
Parse natural language datetime (e.g. “in 2 hours”, “tomorrow at 6”) using the current time in the user’s timezone (UTC+2).
If the input is ambiguous (e.g. missing AM/PM or date), ask the user to clarify.
Store the callback time in UTC.
Do not use this tool for urgent or immediate callbacks.
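
💡 A minimal sketch of the validation and UTC conversion, assuming the parsed local time ends up in a hypothetical Topic.RequestedLocalTime variable and the customer timezone is UTC+2 as stated above:

// Hypothetical sketch: accept only future times and convert local (UTC+2) to UTC
// before writing the entry to the SharePoint list.
If(
    Topic.RequestedLocalTime > Now(),
    DateAdd(Topic.RequestedLocalTime, -2, TimeUnit.Hours),
    Blank()   // reject: the requested time is in the past, ask the user to clarify
)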

Inputs

Use the global variables as entry values. Adjust to your needs.

Finally, respond with a fixed message which will also act as the exit criteria in the Nimbus Virtual User workflow activity.

Adding the Copilot to Nimbus

Bot Authentication

Copilot - Authentication

✅The following steps explain how to define your authentication in Copilot Studio so Nimbus can access the API later.

  1. Log into Copilot Studio.
  2. Security: Go to Agents > Settings > Security. 
    1. Select No authentication → See Design and Limitation Notes below.
    2. Go to Web Channel Security > Secrets and Tokens.
    3. 🧠 Copy either Secret 1 or 2 for later use within your Nimbus Bots configuration.
    4. Ensure “Require secured access” is enabled.

Copilot API Endpoint

Copilot - Direct Line API Endpoints

General¹: https://directline.Botframework.com/v3/directline/conversations 
Europe: https://europe.directline.botframework.com/v3/directline/conversations/ 
Microsoft Copilot Direct Line API Authentication endpoints

¹ ☝ A request might fail if you use the global base URI for a regional bot, as some requests could cross geographical boundaries. The URLs are maintained by Microsoft and retrieved from the Learn | Direct Line documentation. We advise testing performance and stability.

🧠Don't forget: Note down the endpoint for later use, e.g. for the Bots configuration of your Nimbus Virtual Users.

 

Configuring the Bot

✅The Bot configuration contains the API key and endpoint for your Copilot agent:

  1. Head to Nimbus Administration > Configuration > Bots and create a new Bot.
  2. Give your Bot a descriptive name
    💡This name is just for Nimbus UI purposes and does not need to match the name of your Copilot agent. 
  3. Define the Organization Unit under which this Bot will be selectable. 
    ☝ Note that any service on this OU or higher can access it (which may result in additional Azure cost).
  4. Type is M365 Copilot Direct Line 3.0.
  5. API Key: 🧠 Paste the Secret from the Copilot setup above.
  6. Endpoint: 🧠 Paste the Endpoint URL from the Copilot setup above.

Create a Response Template

  1. In Nimbus, go to Admin > Configuration > Bot Response Templates 
  2. Add a new default response template for Copilot.

Create the Virtual User

  1. In Nimbus, go to Admin > Configuration > Virtual Users.
  2. Add a new Virtual User with a Description and choose your bot accordingly.
  3. In the Initial Message to Bot, we hand over the following data:
{
"Phone": "+$(Caller.TelNumber)",
"Name": "$(Customer.DisplayName)"
}

Using the Copilot in Nimbus

The final step is to add your newly defined Virtual User to a workflow.

Use the Virtual User in your Workflow

  1. Within Nimbus go to Configuration > Workflows.
  2. Create a new Audio/Video Workflow or add a new “Add Virtual User” activity to an existing workflow. 
    1. Within the activity, configure the Exit with the Regular Expression ^Goodbye$ so that the Virtual User stops once the Copilot returns its fixed handover message. 
      💡The wording “Goodbye” can be adjusted in Copilot.
       

 

Known Limitations


AI-driven interactions

AI-driven replies are not deterministic and depend highly on what the Customer says. For example, if the Customer says “I have trouble with my Skateboard”, it is not guaranteed that the Bot will use the same word “Skateboard” as your exit keyword unless this is specifically handled in Copilot. 

🔎Refer to our Best Practices - Virtual Users in Nimbus for some considerations and risks when using Virtual Users instead of Human interaction.

Microsoft Copilot Bot Limitations

  • Expect delays: Processing AI answers takes a few seconds for transcription of voice input, AI processing, and then transcription back into a spoken reply. Luware is trying to minimize the delay on the Nimbus call infrastructure, but the dependency on external APIs will always incur a delay. The Customer hears silence during this processing; there is no audio feedback or silence detection.
  • Ensure you have no ambiguity in your topics. For instance, the word “Service” may be too generic if you want to transfer to different services. Rather, use identifiers such as healthcare|medical|emergency or more complex Regular Expressions to identify replies.

Nimbus Technical Limitations

The current Nimbus implementation with AI-driven bots is subject to the following limitations: 

  • Supported Modalities: Virtual Users (Bots) are currently available for Audio/Video modality tasks.
  • 3rd-Party API Support: Nimbus is built flexibly to support “BYO” (bring your own) customer bot configurations and APIs. The first currently supported bot integration is the M365 Copilot Direct Line 3.0 API.
  • Authentication: The Copilot Direct Line 3.0 API Secret needs to be copy-pasted into the Nimbus Bots configuration. Further authentication types are planned to be supported in the near future.
