Performance data key to fast, reliable digital-insurance services
Imagine being able to access on-demand insurance via a mobile app for impromptu events like borrowing a friend’s car. Or being able to buy a policy in seconds — without paperwork or phone calls — thanks to AI-fueled apps that gauge consumer behavior based on their devices.
This is today’s insurtech world, where technology-based disruptions like these are driving the future of property and casualty coverages.
This increased reliance on technology has a lot of upside. Specifically, digital insurance companies can better meet evolving customer expectations; differentiate in a crowded market; access opportunities in emerging markets; and even achieve higher levels of environmental sustainability.
But so much reliance on technology comes with a dangerous downside. Greater digitization of apps and services, combined with growing numbers of customers using them, produces exponentially growing volumes of performance data (speed, availability) that provide key insights into the health of those apps and services.
In fact, data volumes are becoming so big and unwieldy that it is quickly becoming impossible for humans to manage and leverage them toward the ultimate goal: fast, reliable customer experiences. And those experiences are critical, as customers have come to expect the same simple, seamless and positive interactions with their digital insurance companies that they experience everywhere else in their lives.
If those expectations are not met, customers quickly move on.
How to remain relevant
Insurance companies need to foster a culture of customer excellence that prioritizes clarity and simplicity in their products and services, and ease and convenience in the policy-buying and claims submission processes. This can be difficult, considering the insurance industry is not always associated with positive emotions from the customer's point of view. Financial services firms in general face intense customer service pressure, leaving little to no wiggle room for slow, clunky, crash-prone apps and services. Experts estimate that three out of five consumers feel the best brands are those that excel throughout the digital customer journey. However, many financial services companies report that their digital customer experience is merely average and could use improvement.
All of this makes it a must to get one's arms around performance data in order to ensure consistently fast, reliable digital customer experiences. To this end, many insurance companies are implementing observability practices, which focus on understanding the internal state or condition of complex apps and systems based only on their external data outputs. Observability also refers to the software tools and practices for pooling and analyzing a steady stream of performance data, making it easier to monitor, troubleshoot and debug applications and systems, so problems can be found and fixed, ideally before customers are even aware of them.
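To make that concrete, here is a minimal sketch in plain Python of observability's core idea: inferring a service's health purely from its externally visible outputs, in this case per-request latencies and status codes. The window size and budget thresholds are illustrative assumptions, not figures from the article.

```python
import statistics
from collections import deque

# Minimal observability sketch: judge a service's internal health
# using only its externally visible outputs (latency, status code).
# Window size and budgets below are illustrative assumptions.

WINDOW = 500          # number of recent requests to consider
P95_BUDGET_MS = 800   # hypothetical latency budget
ERROR_BUDGET = 0.01   # hypothetical tolerable error rate (1%)

latencies = deque(maxlen=WINDOW)
statuses = deque(maxlen=WINDOW)

def observe(latency_ms: float, status_code: int) -> None:
    """Record one request's externally visible outputs."""
    latencies.append(latency_ms)
    statuses.append(status_code)

def health() -> dict:
    """Summarize health from the recent window of external signals."""
    if len(latencies) < 2:
        return {"status": "unknown"}
    p95 = statistics.quantiles(latencies, n=20)[-1]  # ~95th percentile
    error_rate = sum(s >= 500 for s in statuses) / len(statuses)
    degraded = p95 > P95_BUDGET_MS or error_rate > ERROR_BUDGET
    return {"status": "degraded" if degraded else "ok",
            "p95_ms": round(p95, 1), "error_rate": round(error_rate, 4)}
```

An agent would call observe() on every request and raise an alert the moment health() flips to "degraded", ideally before a customer notices anything.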
Unwieldy data
This is where the ballooning size and number of datasets become a problem. More data is supposed to make things better, but in reality it threatens to make things worse as IT teams become overwhelmed. For this reason, digital insurance companies need to evolve their observability approaches and make 2023 "the year of data efficiency."
What exactly does that mean?
Say goodbye to "store and explore." Observability practices have traditionally been based on a "store and explore" approach, meaning data is centralized in a monitoring platform before users can query or analyze it. The thinking is that data becomes contextually richer the more of it you have and the more of it you can correlate in one central place. Building an architecture this way may have worked well in a previous era, when data volumes were comparatively small.
But given the volumes of data now being generated (the vast majority of which are never even used), companies can no longer afford to aggregate all their data in expensive "hot" storage tiers for analysis. Rather, data needs to be analyzed and correlated in smaller volumes, on less expensive infrastructure.
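Rough numbers show why. The back-of-envelope sketch below (every price and ratio is a hypothetical placeholder, not a quote from any vendor) compares ingesting all raw telemetry into a hot analytics tier against keeping only small summaries hot and archiving the raw data cheaply.

```python
# Back-of-envelope cost comparison: "store and explore" vs. analyze at
# the source. All figures are hypothetical assumptions for illustration.

DAILY_TB = 5.0              # raw telemetry generated per day
HOT_PER_GB_MONTH = 0.50     # assumed hot-tier storage/index cost
COLD_PER_GB_MONTH = 0.01    # assumed archive-tier cost
SUMMARY_RATIO = 0.02        # assume summaries are ~2% of raw volume

raw_gb_month = DAILY_TB * 1024 * 30

store_and_explore = raw_gb_month * HOT_PER_GB_MONTH
analyze_at_source = (raw_gb_month * SUMMARY_RATIO * HOT_PER_GB_MONTH
                     + raw_gb_month * COLD_PER_GB_MONTH)

print(f"hot-everything:    ${store_and_explore:,.0f}/month")   # ~$76,800
print(f"summarize+archive: ${analyze_at_source:,.0f}/month")   # ~$3,072
```

Under these assumed prices, summarizing at the source before archiving cuts the monthly bill by roughly 25x.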
Analyze all data at its source. To keep the storage costs of a central repository in line, many companies have resorted to indiscriminately discarding certain data sets. While it is true that the vast majority of data is never used, anomalies and problems can sprout up anytime and anywhere, so companies that randomly omit data leave themselves open to significant blind spots. By analyzing data in smaller chunks, ideally at the source rather than in a central repository, companies can effectively survey all of their data, with the peace of mind of knowing they will have no blind spots. After being analyzed, data can be relegated to a lower-cost storage tier for safekeeping, ultimately saving significantly on expenses. In fact, some organizations find they do not need a central repository at all.
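As a sketch of what analyzing at the source can look like, the agent below inspects every record locally, forwards only compact summaries plus the anomalous lines, and ships the raw batch to low-cost storage. The forward() and archive() hooks are hypothetical stand-ins for a real transport and archive tier.

```python
from collections import Counter

# At-source analysis sketch: every record is inspected locally, so
# nothing is silently dropped, yet only compact summaries and the
# anomalous lines travel onward. forward() and archive() are
# hypothetical hooks for a real transport and a low-cost object store.

def forward(summary: dict) -> None:
    ...  # send summary to a (small) central view, if one exists

def archive(lines: list[str]) -> None:
    ...  # write the raw batch to a cheap cold tier for safekeeping

def process_batch(lines: list[str]) -> None:
    levels = Counter()
    anomalies = []
    for line in lines:
        level = line.split(" ", 1)[0]        # assumes "LEVEL message" logs
        levels[level] += 1
        if level in ("ERROR", "FATAL"):
            anomalies.append(line)           # keep full detail for problems
    forward({"counts": dict(levels), "anomalies": anomalies})
    archive(lines)                           # all raw data kept, cheaply
```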
Take the pressure off downstream pipes and systems. Another problem with the "store and explore" approach is that it can clog data pipelines and overstuff central repositories, which then slow down significantly and take much longer to return query results. So another benefit of analyzing data in smaller increments at its source is that organizations become much more nimble at real-time data analytics, identifying growing hotspots and their root causes faster, which is critical to reducing mean time to repair (MTTR). In addition, if a company is analyzing data at its point of origin and that data is throwing errors, it knows the source instantaneously.
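A small illustration of that last point: when each source runs its own analyzer, any alert it raises is born already labeled with the offending service and host, so no central query is needed to locate the fault. The labels and the error-rate trigger below are assumptions for the example.

```python
import time

# Per-source analyzer sketch: because analysis happens at the point of
# origin, every alert already names its service and host, shortening
# time to repair. The 5% error-rate trigger is an illustrative choice.

class SourceAnalyzer:
    def __init__(self, service: str, host: str):
        self.service, self.host = service, host
        self.total = 0
        self.errors = 0

    def record(self, ok: bool):
        """Count one request; return an origin-labeled alert on a spike."""
        self.total += 1
        self.errors += 0 if ok else 1
        if self.total >= 100 and self.errors / self.total > 0.05:
            return {"service": self.service, "host": self.host,
                    "error_rate": round(self.errors / self.total, 3),
                    "ts": time.time()}   # root cause identified at birth
        return None
```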
The potential of data observability
Observability has the potential to bring incredible benefits to the insurance world, including substantially better mean times to detection and resolution (MTTD, MTTR) for unplanned downtime or performance degradation. But according to Splunk's State of Observability 2022 Report, there is rapidly growing awareness of exploding data volumes as a key challenge and concern.
The good news is that observability is still in its early days, and it is not too late for digital insurance companies to modify their approaches, putting all of their data to work more intelligently in order to ultimately delight customers.
Ozan Unlu is CEO of Edge Delta. These opinions are the author’s own.