Sales and Distribution Digitisation Series | AI
This article is part of the Sales and Distribution Digitisation series and explores the use of artificial intelligence (AI) technology across the financial services sector.
In this first article, we explore AI for good in the context of serving vulnerable customers. We look at how AI is currently applied, examine our current understanding of the ‘vulnerable customer’, and consider what improvements are needed – highlighting how AI could better support customers with additional needs and requirements.
The adoption of AI in financial services
The use of artificial intelligence technologies in understanding the customer, enhancing the customer experience and risk management isn’t new to the financial services sector, with those applying fast-evolving AI tech benefiting from deeper customer insight and improved customer experience. AI is driving value and delivering growth for those who understand its power to bring them closer to their customers.
The benefits are clear: revenue growth through higher sales conversion and customer loyalty, operational efficiency through greater automation across business functions, reduced errors, and greater innovation, thanks to improved customer insight. According to McKinsey and Company, “the potential for value creation is one of the largest across industries, as AI can potentially unlock $1 trillion of incremental value for banks”. So, as banks continue to realise the value of AI, its adoption across the sector will increase, and its application will begin to evolve.
If AI is currently used to enhance the customer experience for a ‘standard’ customer, how could it be applied to help a segment of customers who require a highly personalised approach and need to be seen, heard and understood at the most vulnerable moments in their lives?
The digital transformation driven by the pandemic replaced face-to-face contact for many customers, making good customer experience and personalisation through these new digital channels even more vital. AI can play a role in bringing a more human experience to what is, in these conditions, a very non-human way of interacting with customers.
Understanding the idea of the ‘vulnerable customer’
There are preconceived notions of what a vulnerable customer looks like: we may instantly picture physical disability or atypical behaviour. But the truth is that vulnerability reaches far beyond physical or mental disability. Some vulnerabilities are short-term and circumstantial, while others are long-lasting.
According to the Financial Conduct Authority, “A vulnerable customer is someone who, due to their personal circumstance, is especially susceptible to detriment, particularly when a firm is not acting with appropriate levels of care.”
Vulnerabilities can therefore result from unexpected life events outside a customer’s control, such as illness, bereavement, mental health conditions, job loss or divorce, as well as from longer-term circumstances such as low literacy or a very low income.
Therefore, vulnerability is far more widespread than we may first think; statistics published by the FCA in the Financial Lives Survey (2020) report that 46% of UK adults (26.0m) showed one or more characteristics of vulnerability *.
Stark statistics like this demonstrate the growing importance of financial services better supporting vulnerable customers. Not only is it morally the right thing to do, but it also makes business sense. If this is the reality, how do financial services cater for vast swathes of the population who fall into the vulnerable category?
Thankfully, the FCA is heavily involved in the protection of vulnerable customers, giving guidance to financial service providers on the fair treatment of their vulnerable customers. The recently released FG21/1 Guidance for firms on the fair treatment of vulnerable customers clearly details what firms must do to ensure good outcomes for vulnerable customers, including:
- Understand the needs of their target market / customer base.
- Ensure their staff have the right skills and capability to recognise and respond to the needs of vulnerable customers.
- Respond to customer needs throughout product design, flexible customer service provision and communications.
- Monitor and assess whether they are meeting and responding to the needs of customers with characteristics of vulnerability, and make improvements where this is not happening.
How AI could be used to support vulnerable customers
The financial services industry has successfully personalised aspects of the customer relationship, such as risk calculation scores and insurance risk assessments, through sophisticated algorithms. But these sit behind a rigid customer onboarding and service journey that, deployed through cost-cutting processes and inflexible systems, does little to cater to the additional needs of vulnerable customers – failing to deliver a mindful and considerate customer experience for all.
Yet, there is no reason why banks cannot adopt a similar mindset of flexibility and personalisation in terms of the customer journey for the vulnerable by using AI to create an adapted buyer journey, varying with the needs of the different types of vulnerable customers. Some would argue that this should become the standard regardless of vulnerability and that all humans should be delivered the highest level of personalisation and superior customer experience.
A deliberate and mindful vulnerability strategy looks at how to identify the vulnerable, how to adapt the buyer journey, and then how to deliver the right support. If AI is to be leveraged to support vulnerable customers, then it is vital to examine each of these steps and decide how AI can be used to enhance the experience in this digital-first world.
One of the biggest challenges for any service provider is identifying vulnerable customers. Although some customers will willingly self-identify, those experiencing temporary life challenges are far less likely to identify as vulnerable, as such challenges often carry shame, embarrassment and a fear of judgement.
Consider also that staff can find these conversations difficult to navigate and often collaborate with the customer in maintaining the fiction that there is no issue, whether through their own embarrassment or a fear of embarrassing or insulting the customer. This has always made the identification of vulnerability a challenge, yet it could be argued that the removal of human-to-human interaction and the use of chatbots and digital channels could tackle this problem.
In this more anonymous space, customers may be more inclined to identify as being in a vulnerable situation than when speaking face-to-face with an advisor (akin to people preferring to research embarrassing medical conditions online rather than talk to a doctor). And in the same way that frontline advisors are trained to identify and support vulnerable customers, virtual assistants and chatbots can be designed to elicit information that helps identify a vulnerable customer.
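To make this concrete, the sketch below shows a deliberately simple version of such cue detection in chat messages. The phrase list and category names are illustrative assumptions only – a production system would use a trained classifier with a far richer vocabulary, reviewed by vulnerability specialists.

```python
# Illustrative vulnerability cue phrases only; these categories and phrases
# are hypothetical examples, not any provider's or regulator's actual list.
VULNERABILITY_CUES = {
    "bereavement": ["passed away", "funeral", "bereaved"],
    "job loss": ["lost my job", "redundant", "redundancy"],
    "health": ["diagnosed", "hospital", "can't cope"],
}

def detect_cues(message: str):
    """Return the vulnerability categories whose cue phrases appear in a message."""
    text = message.lower()
    return sorted(
        category
        for category, phrases in VULNERABILITY_CUES.items()
        if any(phrase in text for phrase in phrases)
    )

print(detect_cues("My husband passed away and I've just been made redundant."))
# -> ['bereavement', 'job loss']
```

A real chatbot would not act on a single match; a detected cue would typically trigger a gentle follow-up question or a hand-off to a trained specialist rather than an automatic label.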
For vulnerable customers who experience long-term vulnerability due to physical or mental disability or impairment, AI could be applied in a way that identifies certain behaviours and adapts the buyer journey to aid their experience.
The ability to discover more about a customer’s circumstances may be quicker, and the direction of the customer journey could be adapted from that point on. Again, chatbot technology can be used to actively ask whether the customer has any specific requirements during their engagement. The technology to offer an ‘additional support’ prompt exists, but it is seldom applied.
Customers in vulnerable circumstances are likely to experience financial difficulty through being unable to manage their finances effectively, so risk indicators such as unarranged overdrafts, frequent returned payments and an increasing value of credit card cash advances could suggest financial hardship. By monitoring such indicators, financial service providers can build a better picture of a customer’s vulnerable situation and anticipate their needs before the customer makes contact.
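A rules-based screen over recent account activity, using the indicators above, might look like the following sketch. The thresholds, transaction kinds and field names are hypothetical assumptions for illustration; a real provider would calibrate them against its own data and regulatory guidance.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    kind: str      # e.g. "unarranged_overdraft", "returned_payment", "cash_advance"
    amount: float  # value in GBP

def hardship_indicators(transactions, max_overdrafts=2, max_returned=1,
                        cash_advance_limit=300.0):
    """Return the list of hardship indicators triggered by one customer's activity.

    Thresholds here are illustrative only, not real calibrated values.
    """
    overdrafts = sum(1 for t in transactions if t.kind == "unarranged_overdraft")
    returned = sum(1 for t in transactions if t.kind == "returned_payment")
    advances = sum(t.amount for t in transactions if t.kind == "cash_advance")

    indicators = []
    if overdrafts > max_overdrafts:
        indicators.append("frequent unarranged overdrafts")
    if returned > max_returned:
        indicators.append("repeated returned payments")
    if advances > cash_advance_limit:
        indicators.append("rising credit card cash advances")
    return indicators

recent = [
    Transaction("unarranged_overdraft", 45.0),
    Transaction("unarranged_overdraft", 80.0),
    Transaction("unarranged_overdraft", 30.0),
    Transaction("returned_payment", 120.0),
    Transaction("returned_payment", 60.0),
    Transaction("cash_advance", 250.0),
]
print(hardship_indicators(recent))
# -> ['frequent unarranged overdrafts', 'repeated returned payments']
```

In practice such a screen would be one input among many, prompting a proactive, sensitively worded contact rather than an automated decision about the customer.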
Once identification has been achieved, it is then a matter of adapting the customer journey to suit that customer’s needs. Personalisation is possible with AI, and a customer could easily be placed onto another path that removes them from mainstream customer handling. This digital footprint and interaction could be logged so that future interactions and communications with other departments can be adapted to the customer’s current circumstances.
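The hand-off from identification to an adapted journey can be sketched as a routing step that records a support-needs flag on the customer’s profile, together with an explicit consent field before anything is shared more widely. All names and fields here are hypothetical, assumed purely for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CustomerProfile:
    customer_id: str
    # Logged so that future interactions with other departments can adapt.
    support_needs: list = field(default_factory=list)
    # Must be explicitly given before flags are shared beyond the servicing team.
    sharing_consent: bool = False

def route_journey(profile: CustomerProfile, detected_need: Optional[str]) -> str:
    """Place the customer on the mainstream or supported journey, logging the need."""
    if detected_need is None:
        return "mainstream_journey"
    if detected_need not in profile.support_needs:
        profile.support_needs.append(detected_need)
    return "supported_journey"

p = CustomerProfile("C-1001")
print(route_journey(p, "bereavement"))  # -> supported_journey
print(p.support_needs)                  # -> ['bereavement']
```

Keeping the consent flag separate from the support-needs log mirrors the point that identifying a need internally and sharing it with other parties are two distinct decisions.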
Yet care must be taken in storing and sharing such sensitive information. Unless a customer has explicitly consented to their information being shared with others, doing so could cause further distress for the customer or constitute a GDPR breach.
In terms of adapting financial products and services, organisations could take a variety of actions to reduce undue stress and future financial risk for the customer: for example, ensuring they do not receive promotions for products that would not suit them, or adapting their existing products to their current needs to help them through their difficulties.
In addition to the adapted customer journey and products, once a customer is identified as vulnerable, there is then the opportunity to offer support from a trained specialist, reintroducing the human element earlier in the process through the customer’s preferred channel. This also presents the opportunity to give additional support by signposting to external organisations and charities.
Governance of AI in financial services
What governance is required to give people confidence that this is AI for good? How do you make sure your AI is doing the right thing? What governance structures are needed for the effective and impactful implementation of AI to support vulnerable customers?
We are still in the formative days of applying AI across the sector for standard customer service delivery, although we can probably expect to see rapid change. Much work is needed to apply existing AI practices, tools and new applications to deliver an enhanced experience for vulnerable customers. And although there is an understanding and body of knowledge of what good practice looks like, a governance framework and emphasis from the top down are needed to make AI for vulnerable customers a mainstream activity.
Barriers to AI adoption for vulnerable customers
The biggest challenge will be the shift away from the manufacturing production-line mentality. An industry obsessed with standardisation, automation, and lean management may find it challenging to adapt and let go of the old way of doing things. A flexible approach is almost counterintuitive to what has made financial service organisations successful. Yet, flexibility and personalisation delivered through an omnichannel customer experience are key to catering to the vast range of human conditions and life experiences.
Another significant cultural barrier is fear, and this is understandable. There is still stigma and fear around what AI means for our future, future jobs, and humans’ role in the workplace. This fear could result in resistance to change, and this is when specialist external skills are needed to shine a light on the value of AI and show the possibilities. A clearly defined strategy that supports vulnerable customers, leverages technology and processes and positively transforms organisational culture and processes is needed.
A cultural shift can only be achieved when senior leadership teams see the financial and operational benefits realised and understand the need for AI to drive customer satisfaction and loyalty for sustainable growth.
The pace of this adaptation and wide-scale adoption is difficult to predict, and some sizeable barriers must be tackled before AI-augmented customer journeys that effectively support vulnerable customers become the norm.
What is certain is that there is increasing pressure on financial service providers to address their shortcomings in serving and protecting vulnerable customers. This is not only a moral duty that should be part of every business’s core ethics; changing customer expectations and tightening regulation, with detailed instructions on how to protect customers, leave the industry with little option but to comply and take appropriate action.
In the next article in the AI series, we’ll discuss the role of Customer Relationship Management (CRM) and the infancy of AI technology application in the financial services sector.
Hesmur is a digital transformation consultancy helping financial service organisations navigate and adapt to a rapidly changing data-driven world. Through rapid diagnostics and leading-edge digital delivery, we implement measurable change swiftly and effectively, and through our network of highly experienced professionals and data specialists we use data-driven insights to help financial service organisations maintain a competitive edge with tailored, transformational consultancy services.
- * https://www.fca.org.uk/publication/research/financial-lives-survey-2020.pdf
Co-authored by Robert Tripp and Natalie Silva