In the Companion Service Settings you can set up AI-assisted features that provide guidance for your Nimbus Users, helping you deliver better service quality to your customers.
Transcription
PRECONDITIONS
✅ Related: Refer to Use Case - Setting Up Transcription for detailed step-by-step instructions.
Licensing:
- The service needs to have an Enterprise or Contact Center license assigned in order to use the Transcription feature.
- Your tenant needs to have available Companion licenses in Licenses Tenant Settings.
- Each Nimbus User who wants to use the Live Caption/Transcription features needs to have a Companion license applied. This is done by an administrator in General User Settings.
- As an Administrator/Team Owner you need to enable the Live Caption/Transcription features in Companion Service Settings and add the Companion widget via Extensions Service Settings.
API Services¹:
- As an Administrator, you need to set up speech services in Azure and note down the API key.
- The API key needs to be added in the Nimbus Speech Recognizer settings. The configured Speech Recognizer is then used for Transcription.
¹ 🔎 Note: This feature uses a speech recognition engine hosted by Microsoft and supports the regions specified for Microsoft AI services. Using the Transcription feature and related APIs will incur additional monthly ACS costs outside of your Nimbus subscription.
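Before pasting the API key into the Nimbus Speech Recognizer settings, it can be helpful to confirm that the key and region actually belong to a working Azure Speech resource. The sketch below is a minimal, illustrative example: it builds the request for Microsoft's documented Azure AI Services token endpoint (`sts/v1.0/issueToken`), which returns a short-lived access token when the key and region are valid. The region and key values shown are placeholders, not values from this article.

```python
# Minimal sketch: build the token request used to sanity-check an Azure
# Speech resource key/region pair before entering it into Nimbus.
# The endpoint pattern follows Microsoft's documented STS token endpoint;
# "westeurope" and the key string are placeholder assumptions.

def build_token_request(region: str, api_key: str) -> tuple[str, dict]:
    """Return the URL and headers for the Azure AI Services issueToken endpoint."""
    url = f"https://{region}.api.cognitive.microsoft.com/sts/v1.0/issueToken"
    headers = {"Ocp-Apim-Subscription-Key": api_key}
    return url, headers

url, headers = build_token_request("westeurope", "<your-api-key>")
```

Sending an empty-body POST to this URL with the header shown should return an access token; an HTTP 401 response typically indicates that the key or region is wrong, which would also cause the Nimbus Speech Recognizer to fail.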
AZURE BILLING
Using the Transcription feature incurs additional monthly ACS costs, which are determined by Microsoft. Also see https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/.
- Before enabling the Transcription feature, get in touch with your Luware Customer Success specialist to discuss terms, usage scenarios and necessary setup procedures.
- Please note that Nimbus and Transcription support does not cover pricing discussions or make recommendations regarding Microsoft Azure infrastructure.
The Transcription feature transcribes call contents using Voice-to-Text capabilities. It improves quality assurance and increases the efficiency of call processing, for example by making it easier to summarize past customer interactions and, if needed, transfer them to a CRM system for easier search and later analysis.

Once a call has ended, a Voice Transcription is generated in the background. You can show it to your Users in My Sessions or retrieve the transcribed text via the Nimbus Power Automate Connector by leveraging the Trigger Event "When the virtual user assistant has an update". You can then write the output to any target, e.g. a Teams message or email.
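When forwarding the retrieved transcript to a target such as a Teams message, you will typically want to reshape it into readable text first. The following sketch is purely illustrative: the field names (`speaker`, `text`) are assumptions for demonstration, not the actual connector payload schema.

```python
# Hypothetical sketch: format transcript entries into a speaker-labelled
# text block suitable for a Teams message or email body.
# The entry structure ({"speaker": ..., "text": ...}) is an illustrative
# assumption, not the real Power Automate Connector schema.

def format_transcript(entries: list[dict]) -> str:
    """Join transcript entries into one readable, speaker-labelled string."""
    return "\n".join(f"{e['speaker']}: {e['text']}" for e in entries)

sample = [
    {"speaker": "Customer", "text": "I'd like to update my address."},
    {"speaker": "Agent", "text": "Sure, I can help with that."},
]
print(format_transcript(sample))
```

In a real flow, the equivalent reshaping would be done with the connector's own Flow Actions and expressions; the sketch only shows the transformation idea.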
✅ Related Follow-up steps:
- You can start using the Nimbus Power Automate Connector by leveraging Trigger Events and Flow Actions, as described in our example Use Case - Analyzing a Transcript.
- For Transcription to be visible to your Users, the “Companion” widget must be enabled via Extensions Service Settings > Widgets.
- The detailed interactions for Nimbus Users are described on the Transcription feature page. You can inform your Service team once the feature has been enabled.
Virtual Assistants

Live Captioning
✅ Precondition: Transcription (including Speech Recognizers) is required to use Live Caption.
The Live Caption feature converts the voices of ongoing call participants directly into text, using Voice-to-Text capabilities. Contents are displayed live on the My Sessions page.

Related Concepts and Steps
✅ Enabling Voice Transcription and setting up a Speech Recognizer in the Configuration are mandatory to use Live Caption; this ensures that the right language engine is used for voice detection.
💡Good to know:
- When the feature is disabled or unavailable (e.g. due to throttling, missing settings, or service impediments on the Microsoft side), info messages are shown in the frontend UI widget. Nimbus task handling and user-customer interactions are not affected by this and will continue normally.
- Only the conversation between the current User and the Customer is transcribed; third parties are not included. → Also refer to the Known Limitations chapter below.
✅ Follow-up steps:
- For Live Caption to be visible to your Users, the “Companion” widget must be enabled via Extensions Service Settings > Widgets.
- The detailed interactions for Nimbus Users are described on the Transcription feature page. You can inform your Service team once the feature has been enabled.
Known Limitations
KNOWN TRANSCRIPTION LIMITATIONS
- Transcription is currently only supported for inbound calls. Outbound calls (i.e. Call On Behalf) will not be transcribed.
- Using Live Captioning currently mandates Transcription to be enabled as well. Speech Recognizers are required for both features.
💡We are actively working on further improvements to make both features configurable separately.
Supervision - By Design (not a limitation): As long as Supervisors remain in Listen or Whisper mode during a conversation, their voices are not transcribed. Only during "Barge In" do they become part of the conversation, and transcription is active.