Companion Service Settings

AI-assisted User Guidance for your Service

In the Companion Service Settings you can set up AI-assisted features that guide your Nimbus Users and help deliver better service quality to the customer.

Speech Recognizer

Speech Recognizers are the prerequisite for using any Companion features that involve voice-to-text capabilities.

✅ Precondition: Enabling either the Transcription or Live Captioning feature described on this page requires a Speech Recognizer to be set up beforehand in the Nimbus Configuration. This ensures that the correct language engine is used for voice detection.

Transcription

PRECONDITIONS

✅ Related: Refer to Use Case - Setting Up Transcription for detailed step-by-step instructions.

Licensing:

  • The service needs to have an Enterprise or Contact Center license assigned in order to use the Transcription feature.
  • Your tenant needs to have available Companion licenses in the Licenses Tenant Settings.
  • Each Nimbus User that wants to use the Live Caption/Transcription feature needs to have a Companion license applied. This is done by an administrator in the General User Settings.
  • As an Administrator/Team Owner you need to enable the Live Caption/Transcription features in the Companion Service Settings.
  • To show either feature to Nimbus Users, you also need to enable the “Companion” widget in the Extension Service Settings.

API Services¹:

  • As an Administrator, you need to set up speech services in Azure and note down the API key. 
  • The API key needs to be added in the Nimbus Speech Recognizer settings. The configured Speech Recognizer is then used for Transcription.
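The Azure-side setup can be sketched with the Azure CLI; the resource name, resource group, and region below are placeholder assumptions, not values prescribed by this guide:

```shell
# Create an Azure AI Speech resource (name, resource group and region are placeholders).
az cognitiveservices account create \
  --name my-nimbus-speech \
  --resource-group my-resource-group \
  --kind SpeechServices \
  --sku S0 \
  --location westeurope

# Retrieve the API key to enter into the Nimbus Speech Recognizer settings.
az cognitiveservices account keys list \
  --name my-nimbus-speech \
  --resource-group my-resource-group
```

The `keys list` command returns two keys; either can be used in the Speech Recognizer configuration.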

¹ 🔎 Note: This feature uses a speech recognition engine hosted by Microsoft and supports the regions specified by Microsoft AI services. Using the Transcription features and related APIs will cause additional monthly ACS costs outside of your Nimbus subscription.

 

AZURE BILLING

The usage of the Transcription feature will cause additional monthly ACS costs. The costs are determined by Microsoft. Also see https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/.

  • Before enabling the Transcription feature, get in touch with your Luware Customer Success specialist to discuss terms, usage scenarios and necessary setup procedures.
  • Please note that Nimbus and Transcription support does not cover pricing discussions or make recommendations regarding Microsoft Azure infrastructure.
 

When active, the Transcription feature transcribes call contents using voice-to-text capabilities. It improves quality assurance and increases the efficiency of call processing, for example by making it easier to summarize past customer interactions and transfer them to a CRM system for search and later analysis.

Once a call has ended, a Voice Transcription is generated in the background. You can show it to your Users in My Sessions, or retrieve the transcribed text via the Nimbus Power Automate Connector by leveraging the Trigger Event "When the companion has an update". You can then write the output to any target, e.g. a Teams message or an email.

Example of finalized Transcription on the “My Sessions” page

💡Good to know

 
  • Trigger Event for Nimbus Companion
  • Flow Action for Nimbus Companion
  • Example: Preparing a Transcription Adaptive Card to send to the user
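For illustration, a minimal Adaptive Card payload that such a flow could send to a user might look like the sketch below. The schema fields come from the public Adaptive Cards specification; the title and transcript text are placeholder assumptions:

```json
{
  "type": "AdaptiveCard",
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.4",
  "body": [
    {
      "type": "TextBlock",
      "text": "Call Transcription",
      "weight": "Bolder",
      "size": "Medium"
    },
    {
      "type": "TextBlock",
      "text": "Customer: Hello, I am calling about my recent order...",
      "wrap": true
    }
  ]
}
```

In a Power Automate flow, the placeholder transcript text would be replaced with the dynamic content returned by the Companion trigger before posting the card to Teams.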

Virtual Assistants

Live Captioning

✅ Precondition: Transcription must be enabled in order to use Live Captioning.

The Live Caption feature converts the voices of ongoing call participants directly into text, using voice-to-text capabilities. Contents are displayed live on the My Sessions page.

Example of ongoing Live Captioning on the “My Sessions” page

💡Good to know

  • ✅ Follow-up action: For Live Captioning to be visible to your Users, the “Companion” widget must be enabled via Extensions Service Settings > Widgets. Detailed interactions for Nimbus Users are described on the Transcription feature page. You can inform your Service team once the feature has been enabled.
  • When the feature is disabled or unavailable (e.g. due to throttling, missing settings, or service impediments on the Microsoft side), info messages are shown in the frontend UI widget. Nimbus task handling and user-customer interactions themselves are not affected by this and will continue normally.
  • Only the conversation between the current User and the Customer is transcribed and kept; third parties are not included. → Also refer to the Known Limitations chapter below.
 

Known Limitations

KNOWN TRANSCRIPTION LIMITATIONS

  • Transcription is currently only supported for inbound calls. Outbound calls (i.e. Call On Behalf) will not be transcribed.
  • Using Live Captioning currently mandates Transcription to be enabled as well. Speech Recognizers are required for both features. 
    💡We are actively working on further improvements to make both features configurable separately.
 

Supervision - By Design (not a limitation): As long as Supervisors remain in Listen or Whisper mode during a conversation, their voices are not transcribed. Only during “Barge In” are they part of the conversation, and transcription is active for them.

 

 
