We all knew that underwriter. The one with the 'Golden Gut.'
He sat in a corner office, relying on decades of intuition, a map on the wall, and a feeling in his bones. He could look at a risk and just know. It was the era of art over science, where 'good judgment' was the ultimate currency.
But let’s be fair to him: He didn't rely on his gut because he wanted to. He did it because he had to.
For decades, the data we needed was locked away—trapped in paper files or siloed in systems too weak to talk to each other. We literally didn't have the computing power to challenge him.
That has changed. We now have the raw processing power to ingest billions of data points and crunch in seconds what would have taken that underwriter a thousand lifetimes to analyze. The data is no longer background noise; it is the signal. And in an age where we are drowning in information, relying on a 'feeling' isn't just outdated—it’s dangerous.
The answer lies in embracing a new paradigm: the shift from risk selection to granular risk pricing. This evolution is defined by three key chapters: the demise of the old-school underwriter, the science of granularity, and the structural shifts that are defining the future of insurance.
The End of Underwriting as We Know It
The idea that traditional underwriting was on a path to extinction is not new. Thirty-five years ago, an insurance provocateur named Richard Dorman said something that should have terrified us: 'Underwriting is dead. Long live pricing.'
Dorman, one of the architects of the Progressive mindset, saw that our industry was obsessed with the binary: 'Yes' or 'No.' We spent all our energy guarding the castle, trying to keep the 'bad' risks out.
But Dorman knew that the 'No' button was a trap. When you default to 'No,' you aren't managing risk; you're handing market share to your competitors. In a free market, almost nothing is uninsurable, provided the premium matches the exposure. There are no bad risks, only bad prices.
While other insurers boasted about how exclusive their underwriting was, Dorman's philosophy was about being inclusive through precision. It wasn't about guarding the castle; it was about expanding the kingdom.
The Core Problem: The Hidden Subsidy
Think about the NBA. Twenty years ago, players took shots from everywhere. It was chaotic. Then, data analysts looked at the court and realized the mid-range jump shot was the worst investment in the game: high difficulty, low reward.
The math changed the game. Now, teams prioritize the shots that yield the highest expected value.
Insurance is having its 'Moneyball' moment. For too long, we’ve been taking mid-range jump shots—guessing at risks based on broad averages. The result? A 'hidden subsidy.' The good risks (the safe drivers, the well-maintained homes) end up paying for the bad ones.
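The hidden subsidy is just arithmetic. A toy sketch (with invented loss numbers) makes the transfer explicit: when two risks with very different expected losses are charged the pooled average, the gap between each risk's true cost and the flat price is exactly what the good risk donates to the bad one.

```python
# Toy illustration of the "hidden subsidy" — all numbers are invented.
# Two drivers with different expected annual losses pay the same
# pooled-average premium.

expected_losses = {"safe_driver": 400, "risky_driver": 1200}

# Flat pricing: everyone pays the pool average.
flat_premium = sum(expected_losses.values()) / len(expected_losses)

# Positive = this risk overpays (funds the subsidy); negative = underpays.
subsidy = {name: flat_premium - loss for name, loss in expected_losses.items()}

print(flat_premium)  # 800.0
print(subsidy)       # safe driver overpays by 400; risky driver underpays by 400
```

In a competitive market that overpayment is unstable: a granular competitor quotes the safe driver 400, wins the business, and leaves the flat-pricing carrier holding only the underpriced risks.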
The Science of Granularity: Broadening the Scope
Closing that hidden subsidy requires more data, better models, and a radically different operating system.
The first step is building the Golden Record. This is a single source of truth for a risk, constructed by layering an insurer's internal claims history with external reality. For example, when looking at a property, we no longer guess at the square footage or roofing type from an old listing; we know it because multiple, independent sources including permits, aerial imagery, and inspections have confirmed it.
However, the Golden Record is just the start. The problem is that most insurers keep their data in solitary confinement. Claims data is in one silo, underwriting is in another, and actuarial analysis is often trapped in spreadsheets. To build the Golden Record, insurers must break down these walls so that data can flow cyclically. Underwriting informs pricing, which informs claims, and claims data cycles back to inform the next underwriting decision.
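The layering step described above can be sketched as a simple source-precedence merge. This is a minimal illustration, not any carrier's actual pipeline: the field names, source names, and trust ordering are all assumptions, and real systems weigh source reliability per field rather than globally.

```python
# Sketch of assembling a "Golden Record" by layering sources in order of
# trust. Sources, fields, and precedence are illustrative assumptions.
from typing import Any

# Lowest-trust source first; later (more trusted) sources overwrite earlier ones.
sources = [
    ("old_listing",    {"sq_ft": 1800, "roof": "unknown"}),
    ("aerial_imagery", {"roof": "asphalt_shingle"}),
    ("permit_record",  {"sq_ft": 2150}),
    ("inspection",     {"roof": "metal"}),
]

golden_record: dict[str, Any] = {}
provenance: dict[str, str] = {}  # records which source won each field
for source_name, fields in sources:
    for field, value in fields.items():
        if value != "unknown":   # never let a placeholder overwrite real data
            golden_record[field] = value
            provenance[field] = source_name

print(golden_record)  # {'sq_ft': 2150, 'roof': 'metal'}
```

Keeping the provenance map alongside the record matters for the cyclical flow described above: when a claim later contradicts a field, you know exactly which upstream source to distrust next time.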
Third-party data is essential here. By integrating external data partners, insurers can move from risk description to risk prediction: from "what happened" to "what will happen."
The Fallacy of Averages: Why Political Boundaries Fail
The enemy of modern insurance isn't bad luck; it's the average.
In every line of business—whether it’s class codes in Workers' Comp or zone rating in Commercial Auto—we have historically relied on broad buckets to group risks. We treat unique snowflakes like identical ice cubes because it’s easier.
But nowhere is this failure more obvious, or more costly, than in how we use ZIP codes for territories.
For decades, we relied on political boundaries—ZIP codes, county lines, census blocks—as convenient proxies for physical risk. But when we price on political borders, we are trying to fit a fluid, physical world into a rigid, bureaucratic box. Mother Nature doesn't vote, and she doesn't pay property taxes.
The real inefficiency comes from relying on low-granularity data. To fix this, we have to stop pricing the world based on political boundaries and lean into the facts. Old-school underwriting would look at a ZIP code and say, "This home is protected." But granular data tells the real story: Yes, there’s a hydrant, but the home is five miles from the station, across a bridge that can’t hold a fire tanker, up a driveway too narrow for a truck.
When data is visualized on a map, neighborhoods that appear identical can look very different. Pockets of risk that are invisible to the spreadsheet become clear. This shift in thinking, from rating in terms of zones to rating individual properties, allows underwriters to find the best risks within otherwise high-risk neighborhoods.
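The payoff of rating properties instead of zones can be shown with a few invented numbers: a single access-impaired property drags the zone average up, and granular rating reveals the good risks that average was hiding.

```python
# Toy contrast between zone-average rating and per-property rating.
# Property IDs and expected-loss figures are invented for illustration.

properties = [
    # (property id, expected annual loss from granular features)
    ("A", 300),   # hydrant nearby, wide driveway, close to station
    ("B", 350),   # similar profile
    ("C", 2200),  # five miles out, bridge can't hold a fire tanker
]

# Zone rating: everyone in the "bad" zone pays the same average.
zone_rate = sum(loss for _, loss in properties) / len(properties)

# Granular rating: the good risks hiding inside the zone become visible.
good_risks = [pid for pid, loss in properties if loss < zone_rate]

print(zone_rate)    # 950.0
print(good_risks)   # ['A', 'B'] — writable at a price the zone rate would never allow
```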
The Future: Velocity as a Competitive Advantage
The final, and perhaps most challenging, piece of this evolution is operational execution: the intelligence must reach the market before it goes stale.
Here is the hardest part. We have actuaries building Ferrari-level pricing models. They can predict risk with incredible precision. But then we try to shove those models into IT systems that move at the speed of a Model T.
If it takes IT nine months to hard-code a rate change, your "smart price" is obsolete before it ever hits the market. Velocity is the new currency. The winners of the next decade won't just be the ones with the smartest models; they will be the ones who can actually deploy them.
The problem is translation. The logic that lives in an actuary's brain (or Python code) is often unintelligible to a 20-year-old mainframe or even modern rating platforms managed by IT. We waste months trying to "translate" that logic, and by the time we do, the market has moved. We need to decouple the brain (pricing) from the body (policy administration and rating). Let the actuaries change the price without needing to perform surgery.
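One common pattern for this decoupling is to express the rate plan as data rather than code: the engine stays fixed while the table it evaluates can be swapped out in a config release. The sketch below is a hypothetical illustration of that idea; the base rate, factor names, and values are invented, and production rating engines carry far more structure (effective dates, interpolation, caps, audit trails).

```python
# Sketch of separating pricing (a data table) from the rating engine (code).
# The JSON below stands in for a config file an actuary could update
# without an IT release; all names and factors are invented.
import json

rate_table_json = """
{
  "base_rate": 500,
  "factors": {
    "roof": {"metal": 0.85, "asphalt_shingle": 1.0},
    "distance_to_station_mi": {"0-2": 0.9, "2-5": 1.1, "5+": 1.4}
  }
}
"""

def rate(policy: dict, table: dict) -> float:
    """Multiply the base rate by each factor the policy's features match."""
    premium = table["base_rate"]
    for feature, value in policy.items():
        premium *= table["factors"][feature][value]
    return round(premium, 2)

table = json.loads(rate_table_json)
quote = rate({"roof": "metal", "distance_to_station_mi": "5+"}, table)
print(quote)  # 595.0  (500 * 0.85 * 1.4)
```

Under this design, repricing is a table change, not a nine-month hard-coding project: the actuary edits the factors, the engine never changes.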
A New Contract with Society
Finally, we have to talk about trust. If we use all this data just to cherry-pick the perfect risks, we will create "insurance deserts" where nobody can get coverage.
We need to use data not to exclude, but to find a way to say "Yes" safely. We need to explain to regulators—and our customers—why the price is the price. Transparency isn't just a regulatory requirement; it is the only way to rebuild faith in our industry.
The era of the "Golden Gut" is over. We can't rely on intuition anymore. But we can replace it with something better: a system that is faster, fairer, and brutally honest about the world as it actually is.