Same question, different answer: what brokers need to know about AI consistency and regulated advice
In the third of a four-part series, Zahid Bilgrami, CEO of Mortgage Brain, explores the pillars of the company's AI Charter: cost, intellectual property, consistency, and speed.
Try this experiment. Take a public AI chatbot. Ask it a mortgage-related question. Wait a minute. Ask it the exact same question again.
You will often get two different answers.
For most uses of AI – drafting an email, brainstorming ideas, research – that variability is harmless, and sometimes even useful. But in the context of regulated mortgage advice, it is a liability.
What brokers need to understand about how public AI models actually work
General-purpose large language models (LLMs) are probabilistic by design. That is a technical term, but the implication for broker firms is straightforward. When these models generate an answer, they are not retrieving a fixed piece of information. They are predicting the most likely next word, based on statistical patterns in the data they were trained on. That prediction can shift each time the model is queried.
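For readers who want to see the mechanism, the toy sketch below illustrates the difference. It is not a real language model – the word list and probabilities are invented for illustration – but it shows why sampling from a probability distribution can give different answers on different runs, while always taking the single most likely word cannot.

```python
import random

# Toy next-word distribution standing in for an LLM's learned probabilities.
# Real models predict over tens of thousands of tokens; this is illustration only.
NEXT_WORD_PROBS = {"straightforward": 0.45, "complex": 0.35, "high-risk": 0.20}

def sample_next_word(rng: random.Random) -> str:
    """Probabilistic decoding: draw the next word according to its probability."""
    words = list(NEXT_WORD_PROBS)
    weights = list(NEXT_WORD_PROBS.values())
    return rng.choices(words, weights=weights, k=1)[0]

def greedy_next_word() -> str:
    """Deterministic decoding: always take the single most likely word."""
    return max(NEXT_WORD_PROBS, key=NEXT_WORD_PROBS.get)

# Two queries with different random states can disagree with each other...
print(sample_next_word(random.Random(1)))
print(sample_next_word(random.Random(7)))

# ...while greedy decoding returns the same answer on every call.
assert greedy_next_word() == greedy_next_word() == "straightforward"
```

Production LLM services typically expose this choice as a "temperature" setting, but even low-temperature configurations do not, on their own, guarantee bit-for-bit identical outputs across queries.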
What does this mean for brokers? It means the AI tool your firm uses may produce slightly different outputs for identical inputs, even when nothing about the client's circumstances has changed. Different compliance interpretations. Different risk flags. Different document outputs. Different recommendations.
In a regulated advice process, this is a problem for mortgage brokers.
Why consistency is key in mortgage advice
Mortgage advice is a regulated activity. That means every recommendation your firm makes, every decision your advisers take, must be defensible – to your compliance team, to the FCA, to your professional indemnity insurer and, ultimately, to the client.
How could AI variability impact brokers in practice? Consider what happens when the same client scenario, run through the same AI tool on two different days, produces two different outputs:
• On Monday, the tool flags the case as straightforward. On Thursday, it flags the same case as high risk.
• One adviser in your firm receives a recommendation the AI supports. Another adviser, on the same case, receives a recommendation it questions.
• A case file reviewed by a compliance auditor shows an AI-generated output that no longer matches what the tool produces when the case is reviewed later.
None of those scenarios is hypothetical. They are the predictable consequence of running regulated advice through probabilistic systems. And in each case, the question a regulator will ask is the same: how did your firm arrive at this decision, and can you prove it was applied consistently across clients?
If the audit trail shifts every time the tool is queried, you do not have an audit trail.
The Consumer Duty dimension brokers need to weigh
Consumer Duty raised the bar on outcomes. Broker firms are now expected to demonstrate that clients are being treated fairly, consistently, and in line with their needs and objectives.
What does this mean for brokers relying on probabilistic AI? It means you have a structural mismatch between the consistency Consumer Duty expects and the consistency the tool can actually deliver. Two clients with identical circumstances should receive equivalent outcomes. If your AI tool cannot guarantee that, your firm is carrying a risk it may not have priced in.
The FCA's direction of travel on AI is clear, and "the model gave a different answer" will not be an acceptable defence at review.
Deterministic AI: what brokers should be looking for
There is a way to use AI that does not carry this risk. It is called deterministic design.
A deterministic system is one where the same input produces the same output, every time. No variability. No surprises. Where consistency is essential – and in regulated advice, it is – AI should be built to behave deterministically.
How does this impact brokers? It means that if your firm runs a client case through a properly designed AI system on Monday, and runs the same case through on Thursday, you get the same answer. Advisers, compliance officers, auditors and clients can all rely on that.
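The property being described can be sketched in a few lines of code. The rules and thresholds below are hypothetical illustrations, not Mortgage Brain's actual logic; the point is the shape of the system: explicit facts in, a fixed rule set applied, the same flag out every time, with no sampling and no hidden state.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CaseFacts:
    """Explicit inputs to the assessment. Fields are illustrative only."""
    loan_to_value: float      # e.g. 0.85 for an 85% LTV mortgage
    adverse_credit: bool
    self_employed: bool

def risk_flag(case: CaseFacts) -> str:
    """Deterministic rule set: identical CaseFacts always yield an identical flag.

    The thresholds here are invented for illustration, not real lending criteria.
    """
    if case.adverse_credit or case.loan_to_value > 0.90:
        return "high risk"
    if case.self_employed:
        return "needs review"
    return "straightforward"

case = CaseFacts(loan_to_value=0.85, adverse_credit=False, self_employed=False)

# Monday and Thursday produce identical outputs for identical inputs.
assert risk_flag(case) == risk_flag(case) == "straightforward"
```

In practice a deterministic advice system is far richer than three rules, but the guarantee is the same: the output is a function of the input, nothing else.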
At Mortgage Brain, where consistency is required, we design systems that behave deterministically. AI supports the process. It does not replace governance. That distinction matters, because it is what makes the output defensible.
Firms relying entirely on third-party AI engines may struggle to achieve this level of behavioural control, because they are not in charge of how the underlying model behaves.
The audit trail question brokers need to think hard about
Every broker firm knows the importance of an audit trail. It is one of the foundations of a compliant advice process. But an audit trail built on top of a probabilistic AI system is not the kind of audit trail your firm needs.
What should brokers be looking for? If the AI output your firm captured at the point of advice is not the same output the tool would produce today, your audit trail is not reconstructable. You cannot go back and show your working in the way a regulator expects. The record exists, but it is fuzzy. And fuzzy does not belong in a regulated file review.
Brokers should be looking for AI systems where the output is reproducible on demand. If a provider cannot demonstrate that, your firm is building its compliance position on sand.
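One way reproducibility can be evidenced – sketched below under assumed field names, not as a description of any particular vendor's product – is to record a cryptographic digest of the exact input and output at the point of advice. If the system is deterministic, re-running the same case later produces the same digest, and the file review can verify that nothing has shifted.

```python
import hashlib
import json

def fingerprint(case_input: dict, output: str) -> str:
    """Stable digest of input plus output.

    json.dumps with sort_keys=True makes the serialisation order-independent,
    so identical advice always yields an identical digest.
    """
    payload = json.dumps({"input": case_input, "output": output}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Recorded at the point of advice (field names are illustrative).
recorded = fingerprint({"ltv": 0.85, "term_years": 25}, "straightforward")

# Months or years later, re-run the deterministic system and compare digests.
reproduced = fingerprint({"ltv": 0.85, "term_years": 25}, "straightforward")
assert recorded == reproduced
```

A probabilistic system cannot offer this check, because the re-run output, and therefore the digest, may differ from the one on file.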
What brokers should be asking their providers
If consistency matters to your firm – and in regulated advice it must – here are the questions to put to your AI or technology provider before signing, or at your next renewal conversation.
Is your AI deterministic or probabilistic? This is the foundational question. A provider that does not understand the distinction is not the provider you want running your advice infrastructure. A provider that confirms their system is probabilistic needs to explain, in detail, how they manage the regulatory consequences.
If I run the same client case through your system twice, will I get the same output both times? Ask for a demonstration. A provider confident in their consistency will have no problem showing you.
How do you ensure that AI outputs used in regulated advice are reproducible months or years after the fact? This is the audit trail question, asked directly. Your firm may need to reconstruct a piece of advice years after it was given. Your provider needs to be able to support that.
What controls do you have in place to prevent output shift over time? Even within a single AI system, outputs can shift as models are updated or retrained. Ask how your provider manages this, and what notice your firm receives before any change that could affect advice outcomes.
Can you produce written evidence of output consistency that would satisfy a compliance auditor? If the answer is vague, it is not an answer. Your firm's compliance function needs something tangible to work with.
Where AI is used in a regulated advice journey, how does your system document the basis for each output? Consistency is only half the battle. The other half is being able to show why the AI reached the conclusion it did. A good provider will have thought carefully about this. A thin-layer provider will not.
The bottom line for brokers
In regulated mortgage advice, consistency is the ground on which everything else stands. Without it, your firm's audit trail, Consumer Duty position, and ability to defend its advice under scrutiny are all weaker than they look.
Broker firms need to treat consistency as a hard requirement in their AI procurement process, not a feature comparison point. The right question is not "how clever is the AI?" The right question is "will it give my firm the same answer to the same question, every time?"
Consistency is the third pillar of our AI Charter for a reason. In regulated advice, variability is a liability.