
The 45TB environment, supporting 1.2M active policies, suffered from 14-hour ETL windows. This lag meant underwriters worked from stale data, crippling their ability to react to intra-day catastrophic loss events.
Managing 15 distinct database instances across dev, test, and prod created massive overhead. High licensing costs and aging hardware prevented the use of modern BI tools, risking data silos and inconsistent state-filing reports.
We built modular ELT pipelines to ingest data from the Policy, Claims, and Billing systems into Snowflake, replacing fragile PL/SQL procedures with extensible SQL-based transformations.
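The modular pattern described above can be sketched in plain Python: raw rows land first, and a small source-specific transform runs afterward, which is the essence of ELT. All names here (normalize_policy, TRANSFORMS, the field names) are illustrative assumptions, not the production schema.

```python
# Sketch of a modular ELT step, assuming hypothetical record shapes.
# Each source system gets its own transform, so adding Billing later
# is a new dictionary entry rather than a rewrite.

def normalize_policy(row):
    """Standardize a raw Policy record before staging."""
    return {
        "policy_id": str(row["policy_id"]).strip().upper(),
        "premium": round(float(row["premium"]), 2),
        "state": row["state"].strip().upper(),
    }

def normalize_claim(row):
    """Standardize a raw Claims record before staging."""
    return {
        "claim_id": str(row["claim_id"]).strip().upper(),
        "amount": round(float(row["amount"]), 2),
        "status": row["status"].strip().lower(),
    }

# One transform per source system keeps the pipeline modular.
TRANSFORMS = {"policy": normalize_policy, "claims": normalize_claim}

def stage(source, raw_rows):
    """Apply the source-specific transform to every landed row."""
    transform = TRANSFORMS[source]
    return [transform(r) for r in raw_rows]

staged = stage("policy", [{"policy_id": " p-100 ", "premium": "1250.5", "state": "ny "}])
```

In production the transforms would be SQL statements executed inside Snowflake; the point of the sketch is the registry shape, which keeps each source system's logic isolated.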
We built a layered storage foundation (Raw, Staged, and Curated) that keeps 15 years of historical data SOC 2 compliant and always available for complex actuarial modeling.
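The layered model can be sketched as a one-way promotion: Raw records are never mutated, and each promotion copies the record forward while adding guarantees. The layer names follow the text; the field names and retention tag are illustrative assumptions.

```python
# Sketch of the Raw -> Staged -> Curated promotion, assuming a
# dictionary-per-record shape. Raw records are copied, never mutated.

RAW, STAGED, CURATED = "raw", "staged", "curated"

def promote_to_staged(raw_record):
    """Staged layer: typed, cleaned, ready for deduplication."""
    rec = dict(raw_record)  # copy so the Raw record stays untouched
    rec["_layer"] = STAGED
    return rec

def promote_to_curated(staged_record, retention_years=15):
    """Curated layer: business-ready, tagged for the retention policy."""
    rec = dict(staged_record)
    rec["_layer"] = CURATED
    rec["_retention_years"] = retention_years
    return rec

raw = {"policy_id": "P-100", "_layer": RAW}
curated = promote_to_curated(promote_to_staged(raw))
```

Keeping Raw immutable is what makes 15-year backfills and audits tractable: any downstream layer can be rebuilt from Raw at any time.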
The Snowflake architecture provides dedicated virtual warehouses for each department, allowing claims adjusters and finance teams to run heavy queries simultaneously without contending for the same compute.
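One way to picture the per-department isolation is a simple routing table from team to warehouse: each workload resolves to its own compute, so no two teams share a queue. The warehouse names below are assumptions for illustration, not the actual deployment.

```python
# Illustrative routing of workloads to dedicated virtual warehouses.
# Names are hypothetical; in Snowflake this mapping would be realized
# with separate CREATE WAREHOUSE objects and per-role defaults.

WAREHOUSES = {
    "claims": "CLAIMS_WH",       # adjuster queries
    "finance": "FINANCE_WH",     # month-end reporting
    "actuarial": "ACTUARIAL_WH", # long-running trend models
}

def warehouse_for(team):
    """Resolve a team's dedicated warehouse; fail loudly if unconfigured."""
    try:
        return WAREHOUSES[team]
    except KeyError:
        raise ValueError(f"no dedicated warehouse configured for {team!r}")
```

Failing loudly on an unknown team is deliberate: a silent fallback to a shared warehouse would reintroduce exactly the contention the design removes.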
We introduced micro-batching for essential claims telemetry, cutting data latency from 24 hours to under 30 minutes and significantly accelerating first notice of loss (FNOL) reporting.
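The core of micro-batching is replacing one nightly load with many small fixed-window loads. A minimal sketch, assuming epoch-second timestamps and a 30-minute window (both illustrative choices):

```python
# Sketch of micro-batching: events are grouped by the 30-minute
# window they fall into, and each window is flushed as a small batch.

WINDOW_SECONDS = 30 * 60

def window_key(event_ts):
    """Map an epoch-second timestamp to its 30-minute window start."""
    return event_ts - (event_ts % WINDOW_SECONDS)

def micro_batch(events):
    """Group (timestamp, payload) events into per-window batches."""
    batches = {}
    for ts, payload in events:
        batches.setdefault(window_key(ts), []).append(payload)
    return batches

events = [(0, "fnol-1"), (900, "fnol-2"), (1800, "fnol-3")]
batches = micro_batch(events)
# events at t=0s and t=900s share the first window; t=1800s starts a new one
```

Because each batch is bounded by the window size, worst-case data latency is roughly one window plus load time, which is how the sub-30-minute figure becomes achievable.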
The system establishes a single source of truth by connecting Power BI directly to the curated Snowflake layer, exposing governed datasets to executive dashboards while maintaining strict NAIC regulatory compliance standards.
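The governed-dataset idea can be sketched as a projection: the curated table may hold sensitive fields, but the view handed to dashboards is restricted to an approved allowlist. Column names below are hypothetical; in practice this would be a secure view or column-level masking in Snowflake.

```python
# Sketch of exposing only governed columns to BI dashboards,
# assuming an illustrative allowlist and row shape.

GOVERNED_COLUMNS = {"policy_id", "state", "premium", "loss_ratio"}

def governed_view(rows):
    """Project curated rows down to the governed column set."""
    return [
        {k: v for k, v in row.items() if k in GOVERNED_COLUMNS}
        for row in rows
    ]

rows = [{"policy_id": "P-100", "state": "NY", "ssn": "000-00-0000", "premium": 1200.0}]
exposed = governed_view(rows)
```

Pointing Power BI at the governed view rather than the base table means compliance is enforced once, in the warehouse, instead of per dashboard.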
85% drop in DBA maintenance and manual performance tuning tasks

99.9% success rate for nightly data loads and complex backfills

70% faster query performance for multi-year actuarial trend analysis


