
Big Data: The practical considerations for the road ahead - Q&A with Geraldine Gallagher

Geraldine is DWF LLP’s Head of Business Intelligence, and an expert on the practical and technical challenges that businesses face when extracting and analysing data. We spoke to her about the particular problems insurance brokers may have to overcome to adapt to the likely impact of the Insurance Act 2015 (IA2015), and to maintain or gain a competitive edge in the marketplace.

Is technology enough on its own?

Technology such as predictive analytics will never be enough on its own to extract meaningful insights from data. It is essential to apply specialist industry (and sometimes legal) knowledge to the analytics to identify the factors with the greatest impact on potential outcomes. An experienced broker, and possibly even an insurance lawyer, would need to work alongside the data specialist to demonstrate which factors really make a difference.

Analytical assumptions also require continuous testing and adjustment. This is essential to mitigate the risk of circular, self-fulfilling outcomes and to support early identification of changing risk patterns in the market, so that counter-strategies can be designed and deployed. In other words, specialists in insurance broking and in data analytics will always need to think carefully about the trends emerging from the data, and adjust their assumptions where necessary, to prevent the analysis from becoming meaningless.

How easy is it to work with data that has come from disparate sources?

Sometimes it is far from easy!  Take the example of two data sets coming together when two organisations merge: the sets could use different naming conventions and data architectures, and there could well be different levels of granularity in data collection and storage.  There could also be differences in how much of the data is structured and how much is unstructured.  ‘Big data’ tends to be unstructured, so if it is essential to work with it, additional technical tools for ‘data scraping’ and/or intensive manual input are needed to get the data into a format that can be analysed.

Another consequence of drawing different data sets together is that the data cannot simply be used straight from source. It has to be cleansed, converted and loaded into a consistent, robust data warehouse, and maintained there.
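The cleanse-and-convert step described above can be sketched in outline. The following Python fragment is a minimal, hypothetical illustration only: the field names, the canonical schema and the granularity conversion are all invented for the example, not drawn from any real broker or DWF system.

```python
# A minimal sketch of harmonising policy records from two merged organisations.
# All field names, values and conversion rules here are hypothetical.

# Source A uses one naming convention and annual premium figures...
records_a = [{"PolicyRef": "A-001", "GrossPremium": 1200.0}]
# ...source B uses another, with premiums held at monthly granularity.
records_b = [{"policy_id": "B-042", "monthly_premium": 85.0}]

# Map source A's field names onto a single canonical schema.
CANONICAL_A = {"PolicyRef": "policy_id", "GrossPremium": "annual_premium"}

def from_source_a(rec):
    """Rename source A's fields to the canonical warehouse schema."""
    return {CANONICAL_A[k]: v for k, v in rec.items()}

def from_source_b(rec):
    """Rename source B's fields and convert monthly premiums to annual."""
    return {"policy_id": rec["policy_id"],
            "annual_premium": rec["monthly_premium"] * 12}

# The 'warehouse' holds every record in one consistent shape, ready for analysis.
warehouse = [from_source_a(r) for r in records_a] + \
            [from_source_b(r) for r in records_b]
```

Even in this toy form, the sketch shows why the work cannot be skipped: until both sources share one schema and one level of granularity, any analysis across them would compare unlike with unlike.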

What skills and resources are needed to make sense of the sort of data collected by insurance brokers?

The best place to start is with an analyst team which is not only technically skilled in structuring, extracting and delivering data, but which can also ‘read’, interpret and answer the commercial and practical ‘so what’ questions arising out of the analytics.  The team also needs the skill and experience to identify which parts of the raw data have the potential to offer the greatest insight.  That helps to minimise the cost of designing a data schema.


This information is intended as a general discussion surrounding the topics covered and is for guidance purposes only. It does not constitute legal advice and should not be regarded as a substitute for taking legal advice. DWF is not responsible for any activity undertaken based on this information.