Use Case - Setting up a Nimbus Virtual User using Copilot

How to set up Copilot to enable Virtual Users in Nimbus Workflows

In this Use Case, we're going to address the following topics in order to set up your first Virtual User in Nimbus:

  • How to use Microsoft Copilot Studio to set up your first AI Agent acting as a Virtual User.
  • How to configure Nimbus in order to use the Virtual User in your workflows.
  • How to test your Service and make adjustments.

Preconditions

Nimbus Tenant Admin rights are required to make necessary adjustments in the Configuration and related Distribution Service Settings to set up a workflow.

Microsoft Services: In order to use AI-driven Bots in Nimbus you will need access to Microsoft Cognitive Services.1

Nimbus Contact Center Licensing2:

  • Virtual Users in Nimbus – using this AI Bot-driven functionality – are available to Contact Center services only.
  • A Contact Center Service must be set up so that you can use the related “Add Virtual User” Workflow Activity.

1 Azure Bot usage causes additional cost outside of your Nimbus subscription. Please note the licensing remarks below.
2 Refer to Nimbus Features for more details.

 


Disclaimer: Support for Azure Billing & Cognitive Services 

☝ The usage of Microsoft Azure AI Services for Speech1 and generative AI2 causes additional costs outside your Nimbus subscription, which are determined exclusively by Microsoft.

Please note: 

  • Nimbus Support does not cover pricing discussions or make recommendations on Microsoft Azure Services. → Please inform yourself via the references below on pricing models.
  • Nimbus Support cannot provide any technical help, insights or solutions on Azure Cognitive Service related technical problems.

1 Azure Speech Services: https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/

2 Generative AI Services (Copilot Studio):

 

Icon Legend

💡 = A hint to signal learnings, improvements or useful information in context.
🔍 = Info points out essential notes or related pages in context.
☝ = Notifies you about fallacies and tricky parts that help avoid problems.
🤔 = Asks and answers common questions and troubleshooting points.
❌ = Warns you of actions with irreversible / data-destructive consequences.
✅ = Instructs you to perform a certain (prerequired) action to complete a related step.
 
 

Use Case Overview

Details will be covered below in this Use Case. As a quick tl;dr, here is an overview of items you will need: 

What? | Where? | Why?
✅ Copilot Bot Setup1 | https://copilotstudio.microsoft.com/ | To define the bot role, behavior, topics, triggers and to get the API authentication details.
Bot Configuration | Nimbus Admin > Configuration | To get the Bot API endpoint and authentication details into Nimbus.
Bot Response Templates | Nimbus Admin > Configuration | To map the bot response JSON to Nimbus parameters for further processing within workflows.
Virtual Users | Nimbus Admin > Configuration | To define the initial bot instructions as well as to tie all other configuration items together.
“Add Virtual User” Activity (in Workflow) | Workflows > Conversation Handling Activities | To invite the Bot to the conversation and route the outcomes accordingly.

1 Note that behaviors, UI designs and configuration details are subject to frequent changes by Microsoft and may differ from our Use Case descriptions. Feel free to contact support to request an update to our Use Case documentation.

Step 1 - Copilot Setup

✅ The first step is creating an AI “Agent” in Microsoft Copilot Studio: https://copilotstudio.microsoft.com/ 

🔎 Note that this part relies heavily on the official Microsoft Copilot Studio Guidance Documentation and related capabilities. You might want to read into its relevant chapters before proceeding.

 

Agent Creation

  1. Sign into Copilot Studio.
  2. Go to Agents and Create a new Agent.
    1. Name: Give your Agent a descriptive name that signals the intended use, e.g. Insurance Bot, Attendant, Customer Data Requester.
    2. Description: Can be left empty. Only for listings in Copilot, not used by Nimbus. 
    3. General Instructions: Configure your Bot's directive and behavior. As an example: 
      As a telephony helpdesk assistant you help customers with accurate and helpful responses. Ensure answers are between 2 and 3 sentences. Maintain a courteous and accurate communication style.
    4. Orchestration: Disabled
      💡Note that any generative events will be ignored in Nimbus Workflows. Your trigger phrases should be as specific and deterministic as possible to ensure a quick workflow routing experience.
  3. Topics1: Distinguished by System and Custom topics.
    1. System topics: The “Conversation Start” topic is the core focus point in this Use Case. This is how the Bot starts the conversation with the Customer, even proactively if necessary. 
      💡We will get back to this later in the Nimbus Setup as our “Initial Message” in the Virtual User configuration below.
    2. Custom topics2: This is where you define the competencies of your Bot. It primarily consists of trigger phrases (e.g. “Health service”, “Insurance service”, “Problem handling”, “Support request”, “Personal Assistance”) that your Bot is supposed to react to with specific topics or general replies. 
      Following the official chatbot topics documentation you can also go into very specific cases here based on your most-frequented Service inquiries, such as “In which countries can I find branch offices of service X”.
  4. Publish: Once everything is configured3 don't forget to publish your Bot.

1🧪 Experimentation required: As Nimbus relies exclusively on your configured Agent Bot capabilities, this is the area where you need to put the most thought and testing in. The steps described are therefore only rough guidelines for your Bot configuration.

2💡Good practice: Make the Customer aware that they are (about to) speak with an AI. You can of course also lead with an “Announcement” in your workflow as an elegant transition, for example while looking up useful Customer information in your CRM via the Nimbus Power Automate Connector.

3💡Good to know: Other tabs in Microsoft Copilot Studio, such as Actions, Activity and Analytics, are for informational purposes only and do not require any configuration for Nimbus.

Authentication

✅ Next up, we are going to configure how Nimbus can access the Copilot API:

  1. Security: Go to Agents > Settings > Security. 
    1. Select No authentication → See Design and Limitation Notes below.
    2. Go to Web Channel Security > Secrets and Tokens.
    3. 🧠Copy either Secret 1 or Secret 2 for later use in your Nimbus Bots configuration.
    4. Ensure “Require secured access” is enabled.
  2. Endpoint: Copilot Direct Line 3.0: https://directline.botframework.com/v3/directline/conversations
    🧠Copy the Endpoint for later in the steps below. (A sketch of how the Endpoint and Secret are used follows after this list.)
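
💡 For illustration only, here is a minimal Python sketch of how the Direct Line 3.0 endpoint and Web Channel Secret are typically used to start a conversation and exchange messages. Nimbus performs these calls for you once the Bot is configured; the snippet only shows what the Endpoint and Secret are for, not how Nimbus implements it, and the secret value is a placeholder.

import requests

DIRECT_LINE_SECRET = "<Secret 1 or 2 from Copilot Studio>"   # placeholder, never hardcode in production
BASE_URL = "https://directline.botframework.com/v3/directline"
HEADERS = {"Authorization": f"Bearer {DIRECT_LINE_SECRET}"}

# Start a new conversation (this is the Endpoint you configure in Nimbus).
conversation = requests.post(f"{BASE_URL}/conversations", headers=HEADERS).json()
conversation_id = conversation["conversationId"]

# Send a message activity to the Bot, e.g. an initial query.
activity = {"type": "message", "from": {"id": "user1"}, "text": "hello, Bot"}
requests.post(f"{BASE_URL}/conversations/{conversation_id}/activities",
              headers=HEADERS, json=activity)

# Read back the activities; the Bot's reply carries the "text" field that the
# Bot Response Template mapping ($['text']) refers to later in this Use Case.
activities = requests.get(f"{BASE_URL}/conversations/{conversation_id}/activities",
                          headers=HEADERS).json()
for act in activities.get("activities", []):
    print(f'{act.get("from", {}).get("id")}: {act.get("text")}')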

Step 2 - Nimbus Setup

✅ After setting up your Copilot Bot, the next steps are done within the Nimbus Administration. This consists of setting up a Bot configuration that relates to your Copilot Agent. We will then map the responses to be eventually used in an “Add Virtual User” activity within your Nimbus Workflows.

 

Configuring the Bot

✅ The Bot configuration is your “mapping” to the Copilot Agent and API endpoint you configured above, including the authentication details.

  1. Head to Nimbus Administration > Configuration > Bots and create a new Bot.
  2. Give your Bot a descriptive name
    💡This name is just for Nimbus UI purposes and does not need to match the name in Copilot. 
  3. Define the Organization Unit under which this Bot will be selectable. 
    ☝ Note that any service on this OU or higher can access it (which may result in additional Azure cost).
  4. Type: Select “M365 Copilot Direct Line 3.0”.1
  5. API Key: 🧠 Paste the Secret from the Copilot Setup above.
  6. Endpoint: 🧠 https://directline.botframework.com/v3/directline/conversations
  7. Save and Close.

1💡More options will be supported in the future.

Configuring Bot Response Templates

✅ Next up are Bot response templates, which tell Nimbus what to do with the JSON data that the Bot will reply with:

  1. Head to Nimbus Administration > Configuration > Bot Response Templates and create a new Template.
  2. Give your Template a descriptive name
    💡This name is just for Nimbus UI purposes.
  3. Define the Organization Unit under which this Template is selectable. 
    💡Should ideally match the Bot OU so they can be made equally available in other Nimbus UIs. Of course you can pick a “higher” OU if you want to share this Template definition among similar services that expect the same kind of data in the reply.
  4. Adjust the “System Field Mappings” according to the examples below.

Field Mapping for Copilot DirectLine

💡Here is a typical JSON answer from a Copilot Direct Line Bot:

{
	"type": "message",
	"id": "5ZASDrgTOO0GtNlVtdgEE2-uk|0000021",
	"timestamp": "2025-05-20T20:31:01.9981516Z",
	"serviceUrl": "https://directline.Botframework.com/",
	"channelId": "directline",
	"from": {
		"id": "user1"
	},
	"conversation": {
		"id": "5ZASDrgTOO0GtNlVtdgEE2-uk"
	},
	"locale": "en-EN",
	"text": "hello, Bot"
}

💡In Nimbus' “Field Mapping” within the Response Template you want to pay special attention to the “text” portion of the JSON response above. As Nimbus identifies all Parameters with a $ sign, you can map the response as follows:

  1. In the fields Answer Raw, Exit and Answer Formatted, enter: $['text']
  2. Optionally: 
    1. Map the same $['text'] field to your own Custom Parameter for later usage, e.g. within your Power Automate Flow Actions.
    2. Repeat this for any other Custom Parameters you want to map.
Example: Copilot DirectLine mapping of JSON ‘text’ field values to Nimbus System Parameter Fields and Custom Parameter Fields.
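
💡 For illustration, a minimal Python sketch of what the $['text'] JSONPath expression selects from the reply shown above. Nimbus performs this mapping internally; the snippet only demonstrates which value ends up in Answer Raw, Exit and Answer Formatted (or your Custom Parameters).

import json

# Shortened version of the Copilot Direct Line reply from above.
reply = json.loads("""
{
    "type": "message",
    "conversation": { "id": "5ZASDrgTOO0GtNlVtdgEE2-uk" },
    "text": "hello, Bot"
}
""")

# The JSONPath expression $['text'] selects the top-level "text" field,
# which corresponds to this dictionary access:
answer = reply["text"]
print(answer)   # -> hello, Bot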

Configuring the Virtual User

Virtual Users combine the previous Bot and Template configuration. You can give them instructions to act as your Agent that will interact with the Customer.

  1. Head to Nimbus Administration > Configuration > Virtual Users and create a new Virtual User.
  2. Give your Virtual User a descriptive name, e.g. Service Desk Operator.
    💡This name is just for Nimbus UI purposes, e.g. for later selection in your Workflows (next step).
  3. Define the Organization Unit under which this Virtual User is selectable. 
    💡Should ideally match your intended Services and Workflows accessing this Virtual User. Of course you can pick a “higher” OU if you want to share this definition among similar Services that will use the same kind of Agent for Customer interaction.
  4. The “Initial Message” acts as a “Topic” or “Query” for your bot. This field can also be used to force a bot response in case the bot is not configured to send an initial message by itself. 

    💡Note that this field is optional in Nimbus:

    Depending on your Bot configuration, the underlying AI may already engage with the Customer automatically. Optionally, you can also use this message to go directly into the first Conversation Topic1, e.g. by using Parameters and starting right with the first inquiry.


    A possible example of an Initial Message could be:

    Hello, this is $(CustomContextParameters.DOC_ClientId) from $(Customer.Company) speaking. I have a question. What are my options?

    1🔎For Copilot as AI Agent, refer to System topics > “Conversation Start” (Microsoft documentation).

     
  5. Check that the Audio/Video modality is enabled and select a Speech Recognizer (Transcriber).
    ✅ Speech Recognizers are configured to parse the language of the calling Customer into text for the bot to process. They can be configured to be multilingual, but are more effective when set to a specific language.
Example of a Virtual User configuration with some Parameters in the Initial Message.

 

💡Flexible Input: By adding Custom Parameters or System Fields and Parameters to your “Initial Message” you can also lead right into a question or topic. Of course, you need to ensure that these Parameters have Defaults or Fallbacks so the bot can operate normally, as sketched below. 
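
💡 Conceptual sketch only: the snippet below illustrates the idea of Defaults or Fallbacks, i.e. a missing Parameter value being replaced by a sensible default so the Initial Message still reads naturally. The parameter names are taken from the example above; the resolve function and default values are purely hypothetical and are not how Nimbus resolves Parameters internally.

import re

def resolve(template: str, values: dict, defaults: dict) -> str:
    """Replace $(Parameter) placeholders with a value, falling back to a default."""
    def substitute(match: re.Match) -> str:
        key = match.group(1)
        return values.get(key) or defaults.get(key, "")
    return re.sub(r"\$\(([^)]+)\)", substitute, template)

initial_message = ("Hello, this is $(CustomContextParameters.DOC_ClientId) from "
                   "$(Customer.Company) speaking. I have a question. What are my options?")

values = {"Customer.Company": "Contoso"}          # DOC_ClientId is missing in this call
defaults = {
    "CustomContextParameters.DOC_ClientId": "a customer",
    "Customer.Company": "an unknown company",
}

print(resolve(initial_message, values, defaults))
# -> Hello, this is a customer from Contoso speaking. I have a question. What are my options?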

 

Configuring your Nimbus Workflow

✅ Last but not least, you need to configure a new Workflow to handle the Audio/Video interaction between Customers and your new Virtual User.

  1. Head to Nimbus Administration > Configuration > Workflows.
  2. Design or adjust any existing workflows you want to repurpose. 
    💡 We recommend creating a copy of an existing Workflow so you can switch and test it with ease and revert should something not work as intended.
  3. In your Workflow, accept a call as normal, but do NOT add it to a queue. Instead, start with an Announcement to prepare your calling Customer that they are speaking to a Virtual User (Virtual Customer Assistant).
  4. Connect your Caller to the AI by using the “Add Virtual User” activity in your workflow. You can find it among the Conversation Handling Activities. Configure the “Add Virtual User” activity as follows:
    1. Virtual User: Select from your list of Virtual Users as previously configured in the Configuration above.
    2. Max Input Timeout: Set this to a reasonable time after which no answer has been given by the Customer, e.g. 1 minute. → The activity will then take the “Idle Timeout” exit.
    3. Text to Speech: Determines the voice (and language) the Bot will use for its spoken (Text to Speech) replies. 
      ☝Note that the Bot may give very long answers if you haven't given it specific General Instructions in Copilot Studio, e.g. to limit itself to a few sentences only.
    4. Exits: These determine the exit node taken within your Workflow. 
      💡For example, if your Customer requests “I'd like to speak to someone in person”, the Bot may answer with “Of course! I'll have someone call you back as soon as possible”. In this case you need a “Callback” exit condition evaluating the bot's reply via Regular Expressions, for example: call (you )?back|callback

      💡Prepare for all eventualities: Note that the determinism of your bot relies heavily on the “General Instructions” and especially on the “Topics” configuration in your Copilot setup. Make sure your “Failed” exit conditions are also covered, e.g. when no possible topic could be identified.

      💡Remember that in the Bot Response Templates $['text'] is mapped to your Workflow “Exit” conditions, not the Customer's initial request or voice transcription. (A small evaluation sketch follows after this list.)

       
  5. Finally: 
    1. Don't forget to route all your Activity exits, e.g. Failed/Idle conditions. 
    2. Route to Service Queues, Voicemail or Transfer where necessary.
    3. Don't forget the “Disconnect” activity at the end, and Save your workflow.
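
💡 A small illustrative Python sketch of how an exit Regular Expression could be evaluated against the mapped $['text'] bot reply. Nimbus evaluates the exit conditions for you; the exit names and patterns below are hypothetical examples, shown only to make clear why flexible patterns matter.

import re

# Bot reply as mapped by the Bot Response Template ($['text']).
bot_reply = "Of course! I'll have someone call you back as soon as possible."

# Exit name -> Regular Expression (hypothetical examples, matched case-insensitively).
exit_conditions = {
    "Callback": r"call (you )?back|callback",
    "Queue":    r"connect|transfer|agent",
}

taken_exit = next(
    (name for name, pattern in exit_conditions.items()
     if re.search(pattern, bot_reply, re.IGNORECASE)),
    "Failed",   # fall back to the "Failed" exit when nothing matches
)
print(taken_exit)   # -> Callback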

✅ Testing your Virtual User

  1. Test your Virtual User both in Copilot Studio (text prompts) and in calls. 
    1. You can place a test call to your Service UPN via the General Service Settings.
    2. 🔎 Keep in mind: You can use the “Initial Message” defined in your Virtual Users as a first conversation topic, including the use of Nimbus Parameters and System Fields. Examples of this can be found in our more advanced Use Case - Building a Classification Agent.
  2. Testing your Workflows: 
    1. We recommend testing the different workflow paths by adding simple Announcements to each entry and exit node for easier input / output validation; this also lets you check on Parameters along the way.
    2. 🔎 Keep in mind: The Workflow Exits depend on what the Virtual User replies to the Customer (matched via the Bot Response Templates). Your Regular Expressions should be flexible enough to handle the most common cases. For example, you can combine multiple words with pipe “|” characters to collect synonyms in a single exit.
 

Further Use Cases

Once you have established a baseline connection with your AI, you might want to visit our library of AI Use Cases for more inspiration.

 

Known Limitations


AI-driven interactions

AI-driven replies are not deterministic and depend highly on what the Customer is saying. For example, if the Customer says “I have trouble with my Skateboard”, it is not guaranteed that the Bot will use the same word “Skateboard” in its reply to match your exit, unless specifically handled via Copilot. 

🔎Refer to our Best Practices - Virtual Users in Nimbus for some considerations and risks when using Virtual Users instead of Human interaction.

Microsoft Copilot Bot Limitations

  • Expect delays: Processing AI answers takes a few seconds: transcription of the voice input, AI processing, and then synthesis back into a spoken reply. Luware tries to minimize the delay on the Nimbus call infrastructure, but the dependency on external APIs will always incur some delay. The Customer hears silence during this processing; there is no audio feedback or silence detection.
  • Ensure you have no ambiguity in your topics. For instance, the word “Service” may be too generic if you want to transfer to different services. Rather use healthcare|medical|emergency as identifiers or use more complex Regular Expressions to identify replies.

Nimbus Technical Limitations

The current Nimbus implementation with AI-driven bots is subject to the following limitations: 

  • Supported Modalities: Virtual Users (Bots) are currently available for Audio/Video modality tasks.
  • 3rd-Party API Support: Nimbus is built flexibly to support “BYO” (bring your own) customer bot configurations and APIs. The currently supported bot integration is the M365 Copilot Direct Line 3.0 API.
  • Authentication: The Copilot Secret needs to be copy-pasted into Nimbus. Further authentication types are planned to be supported in the near future.
  • Generative AI: Copilot events (Reviewing, Thinking) during Orchestration are currently being ignored.
