ChatGPT – is AI a help or hindrance to underwriters?

02 March 2023

It is hard to ignore the rapid rise in prominence of ChatGPT. Whilst the new chatbot offers exciting potential, there is widespread concern about the risks it poses. In this article we consider some of those opportunities and risks (and whether ChatGPT will replace wordings specialists…)

What is ChatGPT?

ChatGPT is a large language model developed by OpenAI. It can be used for natural language processing tasks such as text generation and language translation. Based on the GPT-3.5 model, it is one of the largest and most advanced language models on the market.

Can ChatGPT help underwriting?

Potentially. Although largely untested, ChatGPT does have the potential to assist underwriters in gathering large amounts of information quickly. For example, ChatGPT can analyse data on weather patterns, economic conditions and demographic trends. This could be particularly helpful for underwriters when compiling information about risk factors affecting particular insureds in books of business. This in turn could allow underwriters to make better informed underwriting decisions or address a policyholder’s needs more efficiently.

However, whilst the potential for harnessing ChatGPT’s capabilities is significant, its use in underwriting remains untested. ChatGPT’s strength is that it draws on information from across the world wide web. This is also its biggest weakness, because its ability to distinguish accurate information from false information (of which there is plenty online) is limited. There are therefore no guarantees as to the accuracy of any content it produces.

The uncertainties over ChatGPT in this respect are likely to be such that underwriters will be uneasy about using the software as an underwriting tool at this point in time.

Can ChatGPT draft policy wordings?!

ChatGPT can produce large and complex documents quickly. This has inevitably led to considerable excitement over the potential for the software to assist with (if not take over) the production of documents, including insurance policy wordings and other contracts. As things stand, however, a number of challenges are likely to prevent its use for the creation of technical legal documents such as policy wordings, including:

- Risk of errors and mistakes

One of the biggest disadvantages of using AI is the risk of errors and mistakes within its output. ChatGPT is no anomaly in this respect: as CNET has reported, it can ‘sometimes write plausible-sounding but incorrect or nonsensical answers’.

Adding to this, one research paper has even suggested that ChatGPT has a tendency to lie!

As ChatGPT is still undergoing training and development, even OpenAI states:

‘Whenever possible, we recommend having a human review of outputs before they are used in practice’.

- Risk of data breaches and privacy concerns

As ChatGPT stores and processes large amounts of data, its deployment may exacerbate the risk of a data breach. It has already been reported that there have been ongoing efforts to compromise ChatGPT, with Check Point Software Technologies stating that foreign parties have been attempting to overcome its access restrictions.

- We tried it!

Out of curiosity, we asked ChatGPT to draft a high-net-worth household insurance policy within certain parameters. Whilst at first blush the document it produced certainly looked and felt like an insurance policy, with clearly identifiable insuring clauses and exclusions, there were a number of problems that would impact its legal effectiveness (including a failure to appreciate the nuances of s.11 Insurance Act 2015).


The evolution of ChatGPT is something we will all be watching with interest. In its current form, issues over the accuracy of its content are likely to prevent it from being used as a meaningful tool for underwriters or in the drafting of policy wordings. Time will tell whether those issues can be overcome.
