Use Case - Setting Up Transcription

Learn how to set up transcription for your services and users step by step.

In this use case, we go through the steps needed to set up the Transcription feature, including:

  • Service Administration steps to enable Transcription and Live Caption for your Services and make them visible via widgets.
  • User Administration steps to enable the Transcription feature for Service Users.

INC Azure Billing Transcription

AZURE BILLING

The usage of the Transcription feature causes additional monthly ACS costs, which are determined by Microsoft. See also https://azure.microsoft.com/en-us/pricing/details/cognitive-services/speech-services/.

  • Before enabling the Transcription feature, get in touch with your Luware Customer Success specialist to discuss terms, usage scenarios and necessary setup procedures.
  • Please note that Nimbus and Transcription support does not cover pricing discussions or recommendations regarding Microsoft Azure infrastructure.
 

INC Transcription Preconditions

PRECONDITIONS

✅Related Admin Use Case: Refer to Use Case - Setting Up Transcription for detailed step-by-step instructions.

Nimbus service and user licensing

🔎Transcription features have service and user requirements. These requirements apply to both the mid-session Live Caption and the post-session Transcription / Summarization features.

Service requirements

Enterprise Routing / Contact Center

  • Prerequisite: Only Enterprise Routing or Contact Center services have the Companion tab and related Nimbus Features available to the service users.
  • Feature toggle: As an Administrator or Team Owner, you can enable individual Companion features via the Companion Service Settings.
  • Data display: To show the transcription data to Nimbus users, you also need to enable the “Companion” widget in the Extensions Service Settings. You can also opt not to show the transcription and instead just leverage the data via the Nimbus Power Automate Connector. → More on this below.

User requirements

Companion 

  • License distribution: Your tenant needs to have sufficient available Companion licenses assigned in Licenses Tenant Settings. You can review and bulk-assign available licenses via License Management.
    💡If no or insufficient licenses are shown, get in touch with a Luware customer success partner to discuss terms and conditions.
  • Enable feature: Every Nimbus user requiring access to features described on this page needs to have a Companion license applied. This can be done by any Administrator in the General User Settings.
    ⮑ Once enabled, the My Sessions page will show a new “Companion” widget with (optionally) enabled features sorted into tabs.
 
 

Azure Speech Services

🔎The features described in the following use “Speech Services” hosted by Microsoft, supporting the regions specified for Microsoft AI services. Using the Transcription features and related APIs causes additional monthly ACS costs outside of your Nimbus subscription.

Speech Services API Setup

 
 
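💡 Before entering an API key and region into Nimbus (see Step 1 below), you may want to verify that your Azure Speech resource responds. The following is a minimal illustrative sketch (not part of Nimbus), assuming the standard Speech Services token endpoint; the region and key values shown are placeholders.

```python
# Quick sanity check of an Azure Speech key/region pair.
# pip install requests
import requests

REGION = "westeurope"          # placeholder - must match your Speech resource's region
API_KEY = "<speech-api-key>"   # placeholder - key 1 or key 2 of your Speech resource

# Speech Services exchanges a valid subscription key for a short-lived access token.
resp = requests.post(
    f"https://{REGION}.api.cognitive.microsoft.com/sts/v1.0/issueToken",
    headers={"Ocp-Apim-Subscription-Key": API_KEY},
    timeout=10,
)

if resp.ok:
    print("Key and region look valid; token issued.")
else:
    print(f"Check key/region: HTTP {resp.status_code} - {resp.text}")
```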

Optional: Power Automate connector integration

🔎Optional step: The Nimbus Power Automate Connector can extract Transcription/Summarization data for implementing additional use cases. To achieve this, an administrator needs to perform the following steps:

  • Set up a Power Automate flow that uses the “Companion” Trigger Event to react to ongoing transcription session events.
  • Then use the “Companion” Flow Action to capture the data for any further processing (a minimal downstream sketch follows below).
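Once the flow has captured the Companion data, you can forward it to any downstream system of your choice. The sketch below shows one possible receiving end: a small HTTP service that a flow could post the captured data to. The endpoint path and the JSON field names (sessionId, transcript, summary) are assumptions for illustration only, not the documented Nimbus connector schema.

```python
# Hypothetical downstream receiver for data captured via the "Companion" Flow Action.
# A Power Automate flow could forward the captured payload here with an HTTP action.
# pip install flask
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.post("/companion-data")
def receive_companion_data():
    payload = request.get_json(force=True)

    # The field names below are assumptions for this sketch, not a documented schema.
    session_id = payload.get("sessionId", "unknown")
    transcript = payload.get("transcript", "")
    summary = payload.get("summary", "")

    # Example processing: persist the summary and transcript for later analysis.
    with open(f"transcript-{session_id}.txt", "w", encoding="utf-8") as fh:
        fh.write(summary + "\n\n" + transcript)

    return jsonify({"status": "stored", "sessionId": session_id}), 200

if __name__ == "__main__":
    app.run(port=5000)
```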

💡Some Use Case examples from our Knowledge Base:

 
 
 

INC Icon Legend Accordion

Show Icon Legend

💡 = A hint to signal learnings, improvements or useful information in context.
🔍 = Info that points out essential notes or a related page in context.
☝ = Notifies you about fallacies and tricky parts that help avoid problems.
🤔 = Asks and answers common questions and troubleshooting points.
❌ = Warns you of actions with irreversible / data-destructive consequences.
✅ = Instructs you to perform a certain (prerequired) action to complete a related step.
 
 

Step 1: Add a Speech Recognizer

  1. Go to Configuration > AI > Speech Recognizers.
  2. Click Create New.
  3. Specify a Name & Organization Unit.
  4. Specify the Type of AI services. 💡This allows you to use your own Azure Speech recognizer. If you don't have one, Nimbus provides a native solution.
    1. Azure AI Services: Allows you to use your own API. In this case you need to provide the API key, Region & Language to be recognized. Visit Use Case - Setting Up a Speech Recognizer in Azure Portal and Nimbus to learn how to create your key.
    2. Nimbus AI Services: Will use Nimbus native AI Services for Transcription.
  5. Configure “Multilanguage” options & specify the Language(s) to be detected.
    💡By default, 1 language is recognized; with Multilanguage enabled, up to 4 can be detected (see the illustrative sketch after these steps).
  6. Click Create.
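For orientation, the sketch below illustrates what the API key, Region, Language and “Multilanguage” settings correspond to on the Azure side, using Microsoft's Python Speech SDK. It is not part of the Nimbus configuration; the key, region, audio file and language list are placeholders.

```python
# Illustrative only: how key, region, language and multi-language detection map to
# Microsoft's Speech SDK. Nimbus performs the equivalent configuration internally.
# pip install azure-cognitiveservices-speech
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="<speech-api-key>", region="westeurope")

# Default single-language setup (use this instead of the auto-detect block below):
# speech_config.speech_recognition_language = "de-DE"

# "Multilanguage" corresponds to automatic language detection among up to 4 candidates.
auto_detect = speechsdk.languageconfig.AutoDetectSourceLanguageConfig(
    languages=["en-US", "de-DE", "fr-FR", "it-IT"]
)

audio_config = speechsdk.audio.AudioConfig(filename="sample-call.wav")  # placeholder file
recognizer = speechsdk.SpeechRecognizer(
    speech_config=speech_config,
    auto_detect_source_language_config=auto_detect,
    audio_config=audio_config,
)

result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    detected = speechsdk.AutoDetectSourceLanguageResult(result).language
    print(f"[{detected}] {result.text}")
```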

Step 2: Enable Transcription and Live Captioning for your Service

  1. Go to Service Administration.
  2. Select the Service for which you want to enable the Transcription feature.
  3. Make sure that your Service has an Enterprise Routing or Contact Center license assigned in the General Service Settings.
  4. Go to the Companion Service Settings tab.
  5. Enable Transcription and (optionally) Live Captioning.
    💡 Note that Live Caption only works if Transcription is also enabled.
    💡 You can also opt to hide either feature from the Nimbus Portal, but still keep Transcription active. This allows you to evaluate the data via Nimbus Power Automate Connector using “Companion” Trigger Events and Flow Actions.
  6. In the Speech Recognizer field, select the Speech Recognizer you created earlier.
  7. Click Save & Apply.
    ⮑ Transcription features take effect immediately with the next incoming audio call.

Step 3: Set Up the Widget

💡 This step is necessary if you want to make Live Caption and Transcription visible to the User in the Nimbus Portal. If Live Caption and Transcript are disabled in the Extensions Service Settings, Transcripts are still saved in the background, but Users won't see them in the Nimbus Portal.

  1. Go to Extensions Service Settings.
  2. In the Widgets section, enable the “Companion” widget.
  3. Click Save & Apply.

Step 4: Assign a Companion License to your Service Users

  1. Go to User Administration.
  2. Select the User you want to assign a Companion license to and click Edit.
  3. Enable Companion in General User Settings.
    💡 Make sure that the User is part of a Service Team with Transcription/Live Captioning feature enabled.
    ⮑ Service Users should now see the Transcript widget in My Sessions. If they don't have a Companion license applied, the “License missing.” message is shown in the widget.

Troubleshooting

INC Transcription Troubleshooting

There are several possible reasons why no transcripts appear in the Transcript widget. The following table lists error messages and explains why they are shown.

Message | 🤔 Why do I see the message?
No transcription is available for this interaction. | There was no transcription generated for the selected conversation.
No transcription is available for this interaction. Task was not accepted. | The selected call was not accepted and therefore there is no transcribed conversation.
Transcription not available. User Transcription License missing. | The user doesn't have a Transcription License assigned. Transcription Licenses can be assigned to users in Users > General User Settings > Licenses.
