Use Case - Analyzing a Transcript

After a voice call, agents might want to get insights from a transcribed call, and supervisors might want to have the transcription stored on the customer record in a CRM. In this use case, we send an adaptive card to the agent when the transcript is ready. It should provide information about the sentiment, summarize the text, and contain the complete script of the conversation.

Adaptive card with the transcription

Preconditions

 


💡 = A hint to signal learnings, improvements, or useful information in context. 🔍 = Points out essential notes or a related page in context.
☝ = Notifies you about fallacies and tricky parts that help avoid problems. 🤔 = Asks and answers common questions and troubleshooting points.
❌ = Warns you of actions with irreversible / data-destructive consequences. ✅ = Instructs you to perform a certain (prerequired) action to complete a related step.
 
 

Participant Types

In Power Automate, there are three participant types used in transcriptions: Customer, User, and Other. The following overview gives you more information on them:

Customer
  • Inbound callers
  • There can be only one participant of this type per transcription session

User
  • Internal Nimbus users
  • There can be multiple participants of this type per transcription session

Other
  • Anyone else
  • There can be multiple participants of this type per transcription session

Overview of the Flow


 
 

Build the Flow

Start with the trigger “When the virtual user assistant has an update” and select the service. The event needs to be Voice Transcription ready.
You will use the service session ID to get the corresponding transcription data.

Then, prepare a couple of variables that we need:

Variable “CleanText”, Type: String

Variable “TranscriptionFacts”, Type: Array

 

 

💡 CleanText will be used to analyse the text with AI.

💡 TranscriptionFacts will represent a node in the JSON of the adaptive card.

In the Participants data, we want to filter the Agent portion:

 

Filter Array on the Transcription Data where

items()?['type'] is equal to User

In the Participants data, we want to filter the Customer portion:

Filter Array on the Transcription Data where

items()?['type'] is equal to Customer

The Identifier/Id field holds the OfficeID of the Agent. We use it to get agent details from O365. 

 

We need the UPN of the agent to send the adaptive card at the end of this flow.

Iterate through the Phrases
…and also iterate through the Participants
Match the Phrase ParticipantId with the IdentifierId of the participant

IF TRUE

Fill the variables CleanText and TranscriptionFacts:

Create the Clean Text string
Create the Adaptive Card Node
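
As a sketch, the two append actions inside the condition could be configured as follows. The loop action names and the phrase property names ("Text", "DisplayName") are assumptions and may differ in your transcription payload; the fact object shape ("title"/"value") matches the FactSet node used in the adaptive card later on.

```
Append to string variable "CleanText":
@{concat(items('Apply_to_each_Participant')?['DisplayName'], ': ', items('Apply_to_each_Phrase')?['Text'], '
')}

Append to array variable "TranscriptionFacts":
{
    "title": "@{items('Apply_to_each_Participant')?['DisplayName']}",
    "value": "@{items('Apply_to_each_Phrase')?['Text']}"
}
```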

Now we can use the “CleanText” for the AI text analysis.

 

Variants: The text analysis can be achieved
EITHER using AI Builder in Power Automate
OR using a deployed AI model via its API.

 

 

💡 In the following, we continue with the data returned from Variant 1.

Variant 1: Using the built-in AI Builder Premium Connector

In our example we'll be using this variant. Here is an overview of what you need:

 

For the first element, select the "AI Summarize" prompt and add the “CleanText” variable as input text. It is also recommended to add textual context to the prompt. You can use this generic phrase and refine it according to your case:

 

“This is a transcribed conversation between a customer and a contact center agent.”

 

The second element looks as follows. You could use the language information from the transcription data, since each phrase has a language property attached to it.

We just set it to English to simplify our flow.

 

 

 
 

 

Variant 2: Using a deployed model on Azure AI

If you use this variant, you only need to make one request to your model using the HTTP element and then parse the response, adapting the next step accordingly. The variant consists of these two elements:

The details of the HTTP request depend on your model. We are using GPT-4o deployed in Azure.
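
As an illustration only, the HTTP action for a GPT-4o deployment on Azure OpenAI could look roughly like this. The resource name, deployment name, API key, api-version, and the prompt wording are all placeholders you need to replace with your own values:

```
Method: POST
URI: https://<your-resource>.openai.azure.com/openai/deployments/<your-deployment>/chat/completions?api-version=2024-02-01
Headers:
    Content-Type: application/json
    api-key: <your-api-key>
Body:
{
    "messages": [
        {
            "role": "system",
            "content": "You analyze a transcribed conversation between a customer and a contact center agent. Return the overall sentiment and a short summary."
        },
        {
            "role": "user",
            "content": "@{variables('CleanText')}"
        }
    ]
}
```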

The response can be parsed using a Parse JSON element on the message content from the HTTP request as follows:
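
Assuming the standard Azure OpenAI chat completions response format, a minimal schema for the Parse JSON element might look like this (extend it if your model returns additional fields you want to use):

```
{
    "type": "object",
    "properties": {
        "choices": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "message": {
                        "type": "object",
                        "properties": {
                            "role": { "type": "string" },
                            "content": { "type": "string" }
                        }
                    }
                }
            }
        }
    }
}
```

The model output is then available via an expression such as body('Parse_JSON')?['choices'][0]?['message']?['content'] (the action name "Parse_JSON" is an assumption).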

 

 
 
Finally, we send the adaptive card to the Agent using the User Principal Name and the following adaptive card JSON:
{
    "type": "AdaptiveCard",
    "body": [
        {
            "type": "ColumnSet",
            "columns": [
                {
                    "type": "Column",
                    "items": [
                        {
                            "type": "Image",
                            "style": "Person",
                            "url": "https://luware.com/de/wp-content/uploads/2024/05/Luware-Nimbus_BoxLogo-860x860.png",
                            "size": "Small"
                        }
                    ],
                    "width": "auto"
                },
                {
                    "type": "Column",
                    "items": [
                        {
                            "type": "TextBlock",
                            "weight": "Bolder",
                            "text": "Luware Nimbus via Power Automate",
                            "wrap": true
                        },
                        {
                            "type": "TextBlock",
                            "spacing": "None",
                            "text": "Created 01.02.2024 at 15:56 ",
                            "isSubtle": true,
                            "wrap": true
                        }
                    ],
                    "width": "stretch"
                }
            ]
        },
        {
            "type": "TextBlock",
            "size": "Medium",
            "weight": "Bolder",
            "text": "Your last call with @{body('Filter_array_Customer')[0]['DisplayName']}"
        },
        {
            "type": "FactSet",
            "facts": [
                {
                    "title": "Sentiment",
                    "value": "@{outputs('Analyze_positive_or_negative_sentiment_in_text')?['body/responsev2/predictionOutput/result/sentiment']}"
                },
                {
                    "title": "Summary",
                    "value": "@{body('Create_text_with_GPT_using_a_prompt')?['responsev2']?['predictionOutput']?['text']}"
                }
            ]
        },
        {
            "type": "TextBlock",
            "text": "Transcription",
            "wrap": true,
            "weight": "Bolder"
        },
        {
            "type": "FactSet",
            "facts": @{string(variables('TranscriptionFacts'))}
        }
    ],
    "$schema": "https://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.4"
}

 

This is what the final card should look like.
