Humans can hold meaningful conversations in complex settings with ease because of our ability to generate coherent responses to a given context. We seek clarification when facing ambiguity, follow up seamlessly with extended arguments, resolve coreferences and mentions, and detect digressions and transitions, all while pursuing the goal of the conversation and adapting to its changing implicatures. While comprehension helps in understanding the context, it is the dialogue generation/management skill that forms the basis of a good conversation. Humans generate responses of diverse forms: a persuasive response to an objection, an empathetic response to distress, elucidation for vagueness, elicitation of information, and so on. An AI system that attempts to replicate, or even augment, human-level conversational skill invariably requires powerful language generation capability in conjunction with natural language understanding (NLU).
This blog gives you a 30,000-foot view of an AI technology called Natural Language Generation (NLG) that companies like Salesken use to augment human intelligence during sales conversations and aid in sales and revenue intelligence.
A very reasonable question you may ask is:
But what does all text generation have to do with intelligence? Is that creativity?
No, it's just some complicated nonlinear statistics.
Natural Language Generation (NLG) - An Overview
NLG, a subfield of artificial intelligence, is a software process that automatically transforms data into plain-English content. NLG has been a field of AI research for decades, but the most impressive and valuable advances have only happened recently.
Text generation, often formally referred to as natural language generation (NLG), is one of the most important yet challenging tasks in natural language processing (NLP). NLG aims at producing understandable text in human language from linguistic or non-linguistic data in a variety of forms, such as textual data, numerical data, image data, structured knowledge bases, and knowledge graphs. Among these, text-to-text generation is one of the most important applications and is thus often referred to simply as "text generation".
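To make "complicated nonlinear statistics" concrete, here is a deliberately tiny sketch of statistical text generation: a bigram Markov chain that samples each next word from the distribution observed in a toy corpus. Modern neural NLG replaces the lookup table with a learned nonlinear model, but the sampling loop is conceptually similar. The corpus and seed word below are invented for illustration.

```python
import random
from collections import defaultdict

def train_bigrams(corpus):
    """Map each word to the list of words observed right after it."""
    successors = defaultdict(list)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        successors[current].append(nxt)
    return successors

def generate(successors, seed, length=8, rng=None):
    """Sample a word sequence by repeatedly drawing a successor word."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    out = [seed]
    for _ in range(length - 1):
        choices = successors.get(out[-1])
        if not choices:  # dead end: no observed successor
            break
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = ("the agent greets the customer and the customer asks "
          "about the loan and the agent explains the loan terms")
model = train_bigrams(corpus)
print(generate(model, "the"))
```

A neural language model does the same thing at a vastly larger scale: instead of a frequency table, a learned function assigns a probability to every possible next token.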
Use cases of Generative AI and its models
Automatic Signal Discovery
When Salesken onboards a new organization, the team has to create signals (utterances you want to track) and assign them to the proper dimensions (or segments) so that the conversational AI engine tracks them whenever they are uttered during calls between the end customer and the agent.
Typically, the team goes through user manuals and documents and listens to a series of calls to understand what is being said between the agent and the customer, and then creates those signals.
Example of signals for a bank:

- I am your relationship manager calling from ICICI bank
- ICICI direct
- Single platform to operate savings account and demat account
- Life Insurance
- Critical illness cover against 34 illnesses
- Home Loan
- Provisional sanction letter granted quickly
The challenges with this approach are:
- It cannot be scaled: this lengthy and often error-prone process creates a lot of friction when onboarding customers. The team also has to revisit the calls time and again to figure out whether new signals have popped up, which strains resources and often means missing important signals.
- The team has to create multiple paraphrases/synonyms to track semantically similar utterances.
- Creating signals manually introduces the possibility of grammatical errors.
- Without visibility into what is actually happening in the conversation, there is no way to know which signals to track.
The proposed solution is a data-centric approach coupled with machine learning that discovers the signals. This discovery will move from semi-automatic to fully automatic as the process matures. The final objective is a self-learning system that automatically discovers these signals and assigns the correct dimensions, which the conversational AI engine can then use to detect and track signals in the calls between the agent and the customer.
There are four components:
- Specific websites of the onboarding organization will be crawled and scraped. That information will be used to create the signals.
- Documents such as PDFs, PPTs, and Word files (which come from the onboarding organization) will be parsed. That information will be used to create the signals.
- The call transcripts (between the end customer and the agent of the onboarding account) will be clustered. That information will be used to create the signals.
- Finally, an ideal system should include some kind of KMS (Knowledge Management System), provided by Salesken to its customers, in which information is gathered in a specific hierarchical format. That information will be used to create the signals.
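The transcript-clustering component above can be sketched with a simple bag-of-words cosine similarity and greedy threshold clustering. In practice one would use sentence embeddings and a proper clustering algorithm; the utterances and the 0.3 threshold below are illustrative assumptions.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def cluster(utterances, threshold=0.3):
    """Greedily assign each utterance to the first cluster whose
    representative (first member) is similar enough, else start a new one."""
    vectors = [Counter(u.lower().split()) for u in utterances]
    clusters = []  # list of lists of indices
    for i, vec in enumerate(vectors):
        for members in clusters:
            if cosine(vec, vectors[members[0]]) >= threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
    return [[utterances[i] for i in members] for members in clusters]

calls = [
    "do you offer a home loan",
    "home loan interest rate please",
    "i want to open a demat account",
    "demat account opening charges",
]
print(cluster(calls))  # loan-related and demat-related utterances group apart
```

Each resulting cluster is a candidate signal group: a representative utterance from the cluster can be promoted to a signal, and the rest become its tracked variants.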
Technical approach taken to build the Automatic Signal Discovery System
In this article, I discuss a few exciting applications of generative AI that we use to create signals.
1. Clarification Generation Model
The main aim of a conversational system is to return an appropriate answer in response to a user request. However, some user requests are ambiguous (an ambiguous question is one to which more than one possible answer can be returned). In information retrieval (IR) settings, such situations are handled mainly through diversification.
At Salesken, when we onboard a customer, both the customer's sales team and the Salesken team set the signals they want to track during conversations between the agent and the end customer. The signals are created by humans and can be words, phrases, or sentences. We use this model to generate multiple meaningful sentences from one phrase or sentence so that our conversational AI engine can track them during the conversation.
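A deliberately simple stand-in for that expansion step: take a seed signal and substitute synonyms from a hand-made lexicon to produce tracking variants. The real model is generative rather than dictionary-driven; the lexicon and seed signal below are invented for illustration.

```python
from itertools import product

# Toy synonym lexicon; a generative model learns such variation from data.
SYNONYMS = {
    "sanction": ["sanction", "approval"],
    "quickly": ["quickly", "fast", "without delay"],
}

def expand_signal(signal):
    """Produce every combination of synonym substitutions for a signal."""
    slots = [SYNONYMS.get(word, [word]) for word in signal.split()]
    return [" ".join(combo) for combo in product(*slots)]

for variant in expand_signal("provisional sanction letter granted quickly"):
    print(variant)
```

The generative model goes beyond word swaps, producing full rephrasings with different syntax, but the goal is the same: one human-written signal fans out into many trackable surface forms.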
Example for one of our Banking and Finance clients:
2. Content Generation from Phrases
This is an entailment-encouraging text generation model that generates content for short phrases.
It has been fine-tuned on a large corpus of English text (40 GB of data) drawn from many different domains, and acts as a generic model for generative purposes.
At Salesken, an Ed-Tech client is trying to sell one of their educational products. The sales manager wants to know whether that product, "Data Science Beginner", is mentioned during the conversation. Our model creates meaningful sentences from this phrase, which the search engine can then detect if the topic comes up during the call.
Example for one of our Ed-Tech clients:
3. Grammar Correction & Natural Rephrasing model
Grammar correction and natural rephrasing is the task of correcting different kinds of errors in text, such as grammatical and word-choice errors. It is typically formulated as a sentence correction task: the system takes a potentially erroneous sentence as input and is expected to transform it into its corrected version. This model has been trained in-house on large datasets (SNLI, RTE, PAWS, STS, WikiText) after applying a number of NLP transformations, such as dropping auxiliary POS tags, lemmatization, and inflections.
At Salesken, this is a utility model that we use in every NLU/NLI pipeline, because conversational data coming out of the ASR engine is by default very noisy. Grammatical correction of those sentences strengthens all the downstream tasks.
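A minimal, rule-based stand-in for the ASR cleanup step: strip filler tokens and fix a couple of common agreement errors with regular expressions. The trained model handles far broader error classes; the rules and example sentences below are illustrative assumptions.

```python
import re

# Filler words that ASR transcribes verbatim but add no meaning.
FILLERS = re.compile(r"\b(uh|um|you know)\b[,]?\s*", flags=re.IGNORECASE)

# (pattern, replacement) pairs for a few common transcript errors.
RULES = [
    (re.compile(r"\bi is\b", re.IGNORECASE), "I am"),
    (re.compile(r"\bhe don't\b", re.IGNORECASE), "he doesn't"),
    (re.compile(r"\s{2,}"), " "),  # collapse double spaces left behind
]

def clean_transcript(sentence):
    """Remove fillers, apply correction rules, and tidy spacing."""
    sentence = FILLERS.sub("", sentence)
    for pattern, replacement in RULES:
        sentence = pattern.sub(replacement, sentence)
    return sentence.strip()

print(clean_transcript("uh i is calling about the home loan you know"))
# -> I am calling about the home loan
```

A learned sequence-to-sequence model subsumes all of these hand-written rules and generalizes to errors never seen before, which is why we train one rather than maintain rule lists.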
Example of Grammar Correction and Rephrasing Model:
4. Paraphrase Generation Model & Diversity Ranker Model
This model generates paraphrases, which are helpful when you are creating a variety of sentences on a given topic. We then rank them to remove noise; the different rankers (semantic and surface-level) give us the flexibility to mould the model to other use cases.
At Salesken, we use this in Automatic Signal Discovery to generate multiple signals; with it, we can eliminate the entirely manual signal-creation process.
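The surface-level part of the ranking can be sketched as greedy near-duplicate filtering: walk the candidates in ranked order and keep only those that are not too similar (by character-level ratio) to anything already kept. Our production ranker also uses semantic similarity; the candidate paraphrases and the 0.8 cutoff below are illustrative.

```python
from difflib import SequenceMatcher

def surface_similarity(a, b):
    """Character-level similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def diversity_filter(candidates, max_similarity=0.8):
    """Greedily keep candidates that are not near-duplicates of
    anything already kept, preserving the original ranking order."""
    kept = []
    for cand in candidates:
        if all(surface_similarity(cand, k) < max_similarity for k in kept):
            kept.append(cand)
    return kept

paraphrases = [
    "A provisional sanction letter is granted quickly.",
    "A provisional sanction letter is granted very quickly.",  # near-duplicate
    "You receive a preliminary approval letter fast.",
]
print(diversity_filter(paraphrases))  # the near-duplicate is dropped
```

Swapping `surface_similarity` for an embedding-based similarity turns the same greedy loop into a semantic diversity ranker, which is what makes the design easy to mould to other use cases.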
5. Generic Signal Generation Model
At Salesken, this model generates generic signals (signals that are uttered in most conversations between the sales representative and the end customer). It is fine-tuned on a dataset manually curated in-house by our annotators.
The NLG models we have made available can be used directly, without training, for many use cases. We have put together a few models that can produce reasonable, believable, and engaging text in hardly any time at all. These models can be further fine-tuned on domain-specific datasets. It is remarkable how easily this can be done using the PyTorch and Transformers frameworks.
Check out the NLP models open-sourced by Salesken.
In this article, we reviewed how we use generative AI to automate signal discovery, covering grammar correction, sentence generation from keywords, and paraphrasing of signals. The advances are likely to continue, and generative techniques are likely to enter the core curricula of data science, creative, and engineering professions globally.