
The Transition to Deep AI in Flood Risk Underwriting: A Technical and Economic Paradigm Shift

Executive Summary

The global insurance industry faces an existential crisis of calculability. As climate volatility accelerates, the traditional actuarial methods utilized to price flood risk—primarily reliant on historical averages, coarse-grained zip code aggregation, and static FEMA flood zones—are failing to capture the non-stationary nature of modern hydrological hazards. This systemic failure has resulted in a widening "protection gap," profound adverse selection, and deteriorating combined ratios across property and casualty (P&C) portfolios.

This whitepaper, prepared for the AI consultancy Veriprajna, delineates a comprehensive architectural shift toward "Deep AI" solutions. We posit that the future of flood underwriting lies in the transition from probabilistic aggregation to pixel-level deterministic modeling. By integrating Hyper-Local Computer Vision (CV) for structural vulnerability assessment, Synthetic Aperture Radar (SAR) for real-time hazard monitoring, and Physics-Informed Machine Learning (PINNs) for hydrodynamic simulation, insurers can achieve a level of granularity that transforms flood risk from an unpredictable catastrophe into a managed, priced, and mitigated asset class.

We explore the technical convergence of these technologies, analyzing how street-level imagery can derive First Floor Elevation (FFE) with centimeter-level precision, how orbital radar can penetrate cloud cover to verify claims instantly, and how neural networks embedded with physical laws can simulate millions of climate scenarios in seconds. This report serves as a strategic blueprint for enterprise adoption, arguing that granular, physics-based AI is not merely an operational efficiency but a fundamental requirement for solvency in the 21st century.

1. The Crisis of Granularity in Legacy Underwriting

1.1 The Obsolescence of the 100-Year Standard

For over half a century, the United States flood insurance market has been anchored to the concept of the "100-year flood"—a statistical construct representing a flood event with a 1% annual chance of occurrence. 1 While this metric provided a necessary regulatory benchmark for the National Flood Insurance Program (NFIP), its utility as a precise underwriting tool has eroded significantly. The term itself is widely misinterpreted by the public and policymakers alike, often leading to the dangerous fallacy that a "100-year flood" happens only once a century, whereas in reality, a property within this zone has a 26% probability of flooding over the course of a standard 30-year mortgage. 3
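The 26% figure follows directly from compounding the 1% annual exceedance probability over the mortgage term; a minimal calculation:

```python
# Cumulative flood probability over an n-year horizon for an annual
# exceedance probability p: P = 1 - (1 - p)^n, assuming independent years.
def cumulative_flood_probability(annual_prob: float, years: int) -> float:
    """Chance of at least one flood occurring within `years` years."""
    return 1.0 - (1.0 - annual_prob) ** years

# A "100-year" (1% annual) flood over a standard 30-year mortgage:
p30 = cumulative_flood_probability(0.01, 30)
print(f"{p30:.1%}")  # prints "26.0%"
```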

The limitations of the FEMA-derived Special Flood Hazard Areas (SFHA) are structural and severe. These maps are inherently static snapshots of a dynamic environment. Research indicates that approximately 75% of FEMA flood maps are older than five years, with a concerning 11% dating back to the 1970s and 1980s. 1 These outdated delineations fail to account for two critical variables driving modern risk: rapid urbanization, which dramatically alters surface permeability, and climate change, which is intensifying precipitation events beyond historical baselines. 1

Legacy maps create a binary "cliff effect" in risk perception. A property located one foot inside the SFHA is mandated to carry insurance and priced as a high risk, while a neighbor located one foot outside is often designated as "Zone X" (minimal risk) and priced significantly lower, despite facing nearly identical hydrological threats. 5 This binary classification ignores the continuous nature of water flow and elevation. Consequently, homeowners outside these high-risk zones frequently assume they are safe, leading to massive underinsurance; studies suggest that less than 4% of homeowners nationwide carry flood insurance, partially due to this misleading binary signal. 5

1.2 The Pluvial Blind Spot

Perhaps the most critical deficiency in traditional mapping is the failure to account for pluvial (rainfall-driven) flooding. FEMA maps have historically focused on fluvial (riverine) and coastal surge flooding, modeling the overflow of water bodies. 4 However, as urban density increases, the proliferation of impervious surfaces—roads, parking lots, and rooftops—prevents natural drainage, leading to flash floods in areas far removed from rivers or coastlines. 1

Recent analysis reveals that nearly 68.3% of flood damage reports occur outside of FEMA's high-risk flood zones, with a significant portion driven by intense rainfall events that overwhelm municipal drainage systems. 5 These "off-plain" events are invisible to underwriters relying solely on federal maps. In urban environments, micro-topography—the subtle dip of a driveway, the slope of a street, or the presence of a retaining wall—can dictate the difference between a dry home and a catastrophic loss. Zip code-level or census-block aggregation completely smooths over these critical variations. 7

1.3 Adverse Selection and the Information Asymmetry

The reliance on coarse risk measures creates a systemic vulnerability to adverse selection. In insurance economics, adverse selection occurs when the insured has more information about their risk than the insurer, or when pricing does not accurately reflect individual risk. 9

When an insurer uses a zip code-average rate, they effectively subsidize high-risk properties with premiums collected from low-risk properties. Sophisticated commercial buyers or property owners with local knowledge (e.g., knowing their basement floods during heavy rains) will aggressively purchase undervalued insurance, while owners of truly low-risk properties (e.g., those on a slight rise within the same zip code) will opt out, perceiving the average price as too high. 9 This dynamic destabilizes the risk pool.

Recent empirical studies demonstrate that firms utilizing coarser measures of risk are at a distinct disadvantage compared to competitors utilizing granular, pixel-level models. Lenders and insurers who fail to screen for property-specific flood risk underestimate the probability of default and loss severity, leading to a concentration of bad risks on their balance sheets. 9 Conversely, insurers leveraging granular data can engage in "cream skimming"—identifying and insuring the low-risk properties within "high-risk" zones that competitors avoid, or accurately pricing the high-risk properties that competitors undercharge. 8

Table 1: Structural Deficiencies of Legacy vs. Deep AI Underwriting

| Dimension | Legacy Underwriting (Zip Code/Zone) | Deep AI Underwriting (Pixel/Parcel) |
| --- | --- | --- |
| Spatial Resolution | Regional averages (Zip Code, Census Block). | Exact building footprint (pixel-level). |
| Temporal Accuracy | Static maps, updated every 5-10 years. | Dynamic, real-time updates via satellite/IoT. |
| Hazard Scope | Primarily Fluvial and Coastal Surge. | Fluvial, Coastal, and Pluvial (Rainfall). |
| Risk Gradient | Binary (In/Out of SFHA). | Continuous probabilistic score (1-100). |
| Pricing Efficiency | Cross-subsidization; prone to adverse selection. | Risk-based pricing; minimizes leakage. |
| Data Latency | Historical claims data (lagging indicator). | Real-time sensor/SAR data (leading indicator). |

2. Hyper-Local Computer Vision: The Structural Truth

To transcend the limitations of hazard maps, modern underwriting must interrogate the receptor of the hazard: the building itself. Deep AI solutions leverage Hyper-Local Computer Vision (CV) to extract structural attributes from aerial, satellite, and street-level imagery, creating a "digital twin" of the property that informs vulnerability assessments with unprecedented precision.

2.1 The First Floor Elevation (FFE) Imperative

In the physics of flood damage, the most critical variable is not merely the depth of water on the ground, but the First Floor Elevation (FFE)—the vertical distance between the adjacent ground grade and the lowest habitable floor. 12

The quantitative impact of FFE on loss severity is exponential. Research indicates that raising a property's FFE by just one foot above the 100-year flood elevation (E100) can reduce the Average Annual Loss (AAL) by approximately 90%. 14 Conversely, a property with a sunken living room or a basement is essentially a collection basin for losses.
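A minimal sketch of how FFE feeds an Average Annual Loss estimate: integrate a depth-damage curve over annual exceedance probabilities, before and after elevating the structure. The return-period depths and the vulnerability curve below are entirely hypothetical illustration values, not the research figures cited above.

```python
# Illustrative AAL sketch. All curve values are assumptions for illustration.
RETURN_PERIODS = [10, 25, 50, 100, 250, 500]      # years
GRADE_DEPTHS_M = [0.0, 0.2, 0.5, 1.0, 1.6, 2.2]   # flood depth at grade per event

def damage_fraction(depth_above_ffe_m: float) -> float:
    """Hypothetical piecewise-linear vulnerability curve (fraction of value)."""
    return max(0.0, min(depth_above_ffe_m / 2.0, 1.0))

def aal(ffe_m: float, insured_value: float) -> float:
    probs = [1.0 / t for t in RETURN_PERIODS]      # descending exceedance probs
    losses = [damage_fraction(d - ffe_m) * insured_value for d in GRADE_DEPTHS_M]
    # Trapezoidal integration of loss over exceedance probability.
    return sum(0.5 * (losses[i] + losses[i + 1]) * (probs[i] - probs[i + 1])
               for i in range(len(probs) - 1))

base = aal(0.0, 300_000)     # slab-on-grade
raised = aal(0.3, 300_000)   # FFE raised roughly one foot
```

Even with this toy curve, a modest elevation materially cuts the integrated loss; a calibrated vulnerability curve would reproduce the much steeper reductions reported in the literature.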

Despite its criticality, FFE is notoriously absent from standard datasets. Public tax records rarely capture it, and Elevation Certificates (ECs) are expensive, manual documents usually obtained only when mandatory. 12 Legacy models often resort to default assumptions—for example, assuming all homes in a certain region have a standard 1-foot crawlspace—which leads to massive error margins in risk calculation. 15

2.2 Street View Imagery and Trigonometric Extraction

Deep AI fills this data void by utilizing street-level imagery (from sources like Google Street View, Mapillary, or proprietary fleets) to remotely calculate FFE. This process, often termed "cross-view matching," integrates ground-level photography with georeferenced satellite building footprints. 16

The Computer Vision Workflow:

1.​ Semantic Segmentation: Convolutional Neural Networks (CNNs), such as those based on the YOLO (You Only Look Once) or Mask R-CNN architectures, analyze the street view image to identify and segment key architectural features: the ground line, the foundation, the front door, windows, and stairs. 18

2.​ Depth Estimation: Because street view images are 2D representations of 3D space, the model must calculate depth. Deep learning models trained on monocular depth cues (or stereo pairs if available) generate a depth map, estimating the distance from the camera lens to the building facade. 20

3.​ Trigonometric Calculation: The algorithm applies geometric principles. Knowing the camera's height (typically fixed on the mapping vehicle, e.g., ~2.5 meters) and the pitch angle of the pixel representing the door threshold relative to the horizon, the system calculates the physical height of the entrance above the street level. 18

4.​ Stair Counting: As a robust secondary validation method, CV models are trained to detect and count the number of steps leading to the entryway. With standard building codes dictating a riser height of approximately 7 inches (18 cm), a simple count provides a highly accurate proxy for FFE. A home with six steps, for instance, has an FFE of roughly 42 inches. 15

Empirical studies validate the efficacy of this approach. Neural networks trained for lowest floor elevation estimation have demonstrated average errors as low as 0.218 meters (approx. 8.5 inches). 18 This level of precision, scalable across millions of properties without a single site visit, fundamentally alters the economics of portfolio underwriting.
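The trigonometric and stair-counting steps described above can be sketched in a few lines. The camera height, pitch-angle convention, and riser height below are assumptions for illustration, not parameters of any specific production system.

```python
import math

CAMERA_HEIGHT_M = 2.5   # typical mapping-vehicle camera height (assumption)
RISER_HEIGHT_M = 0.18   # ~7-inch standard stair riser (building-code norm)

def ffe_from_pitch(pitch_deg: float, distance_m: float) -> float:
    """Height of the door threshold above street level, from the pitch angle
    of its pixel relative to the horizon (negative = below horizon) and the
    camera-to-facade distance estimated by the depth model."""
    return CAMERA_HEIGHT_M + distance_m * math.tan(math.radians(pitch_deg))

def ffe_from_stairs(step_count: int) -> float:
    """Secondary validation proxy: riser height times detected step count."""
    return step_count * RISER_HEIGHT_M

# Six detected steps -> roughly 1.08 m (~42 inches), matching the example above.
```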

2.3 Aerial Intelligence and Roofscape Analysis

While street view captures verticality, high-resolution aerial imagery (ortho-rectified and oblique) provides critical context regarding the property's horizontal vulnerability and condition. Companies like Zesty.ai and Cape Analytics have pioneered the use of CV to derive "property intelligence" that transcends tabular data. 21

Key Attributes Derived via Aerial CV:

●​ Impervious Surface Ratio: Flood risk is locally amplified by the inability of the ground to absorb water. CV algorithms analyze the parcel to calculate the exact ratio of concrete/asphalt to permeable vegetation. High imperviousness in a micro-watershed correlates directly with increased surface runoff and pluvial flood potential. 24

●​ Basement and Window Well Detection: Aerial imagery can detect the presence of window wells or basement walkouts, which confirm the existence of sub-grade living spaces. Basements significantly increase Total Insurable Value (TIV) at risk, particularly for mechanical, electrical, and plumbing (MEP) systems often housed there. 25

●​ Roof Condition as a Maintenance Proxy: While primarily used for wind and hail underwriting, the condition of the roof serves as a powerful proxy for the homeowner's overall maintenance philosophy. A "Deferred Maintenance" score, derived from detecting staining, patching, or degradation on the roof, correlates with higher claims severity across all perils, including water damage. A well-maintained home is less likely to suffer from secondary leakages or drainage failures. 23

●​ Secondary Characteristics: CV models can instantly identify the presence of solar panels, swimming pools (which can affect drainage), and the distance to local water bodies or storm drains with greater accuracy than outdated municipal maps. 21
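The impervious-surface attribute, for example, reduces to a per-pixel ratio once a land-cover classifier has labeled the parcel; the class names and counts below are illustrative.

```python
# Impervious-surface ratio from a per-pixel land-cover classification of a
# parcel. Class labels are assumptions for illustration.
IMPERVIOUS = {"roof", "asphalt", "concrete"}

def impervious_ratio(landcover_pixels) -> float:
    """Fraction of parcel pixels classified as impervious cover."""
    hits = sum(1 for c in landcover_pixels if c in IMPERVIOUS)
    return hits / len(landcover_pixels)

parcel = ["roof"] * 40 + ["asphalt"] * 20 + ["grass"] * 40
# impervious_ratio(parcel) -> 0.6
```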

2.4 Mitigation-Aware Scoring

One of the most powerful applications of Deep AI is the ability to recognize and reward mitigation. Legacy models rarely account for homeowner improvements because verifying them is cost-prohibitive. CV systems can detect flood vents, elevated HVAC units, or defensible space clearing (for wildfire, but indicative of care). 23 By incorporating these observable mitigations into the pricing model, insurers can offer "Mitigation-Aware Scoring," incentivizing policyholders to invest in resilience and reducing the overall risk of the portfolio. 27

3. Orbital Sentinel: Synthetic Aperture Radar (SAR) and the All-Weather Eye

While Computer Vision assesses the vulnerability of the structure, Synthetic Aperture Radar (SAR) provides the authoritative "ground truth" regarding the hazard itself. In the context of flood underwriting and claims response, the ability to "see" water is paramount. However, traditional optical satellites (e.g., Landsat, Sentinel-2) suffer from a critical flaw: floods are invariably accompanied by clouds and rain, rendering optical sensors blind during the peak of the event. 28

3.1 The Physics of Backscatter

SAR satellites (such as the Sentinel-1 constellation or commercial providers like ICEYE) overcome atmospheric opacity by transmitting microwave pulses that penetrate clouds, smoke, and heavy rainfall. The sensor measures the "backscatter"—the energy reflected back to the satellite from the Earth's surface. 28

The detection of floodwater relies on the physics of specular reflection.

●​ Open Water: A calm water surface acts like a mirror to radar waves. When the microwave pulse strikes the water, it reflects away from the sensor (specular reflection), resulting in very low backscatter intensity. In the resulting image, water bodies appear as dark, almost black pixels. 28

●​ The Urban Double-Bounce Effect: Detecting floodwater in urban environments is far more complex due to the "double bounce" phenomenon. Radar signals often strike the smooth surface of the floodwater, bounce off the vertical face of a building, and then reflect back to the sensor with high intensity. This creates bright pixels that can be mistaken for non-flooded land. Advanced Deep AI models are specifically trained to recognize these high-intensity interference patterns as signatures of urban inundation, a task that traditional threshold-based algorithms fail to perform. 31
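A minimal sketch of the naive dark-pixel rule for open water, which, as noted, fails on urban double-bounce returns; the -18 dB threshold is an illustrative value, not a calibrated one.

```python
# Threshold-based open-water detection on calibrated SAR backscatter (dB).
# Calm water reflects the pulse away from the sensor, so flooded pixels are
# dark. The threshold is illustrative; production systems learn richer cues.
WATER_THRESHOLD_DB = -18.0

def water_mask(sigma0_db):
    """Boolean mask: True where backscatter is low enough to suggest water."""
    return [[v < WATER_THRESHOLD_DB for v in row] for row in sigma0_db]

scene = [[-22.5, -8.1], [-19.3, -4.7]]  # 2x2 toy scene, sigma-nought in dB
# water_mask(scene) -> [[True, False], [True, False]]
```

Note that a bright double-bounce pixel over urban floodwater would be missed entirely by this rule, which is precisely why deep models are trained on those interference signatures.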

3.2 Deep Learning in SAR Processing Pipelines

Raw SAR data is inherently noisy, characterized by a granular texture known as "speckle." To render this data usable for pixel-level underwriting, a rigorous preprocessing and analysis pipeline is required, increasingly powered by Deep Learning. 28

The Deep SAR Workflow:

1.​ Orbit Correction & Calibration: Precise orbital files are applied to correct for the satellite's position, and radiometric calibration converts raw digital numbers into physical backscatter values (Sigma Nought). 28

2.​ Deep Despeckling: Instead of simple spatial filters (like the Lee filter) which can blur edges, Deep Convolutional Neural Networks (CNNs) are employed to remove speckle noise while preserving the sharp boundaries of flood extents. 28

3.​ Terrain Flattening: Using a Digital Elevation Model (DEM), the system corrects for slope-induced geometric distortions, ensuring that a shadow from a hill is not mistaken for a body of water. 28

4.​ Change Detection: The most robust method for flood mapping is Change Detection. The system compares the "Event" image (during the flood) against a "Reference" image (archived during dry conditions). Deep learning models (such as U-Nets) analyze the difference in texture and intensity to isolate newly inundated areas from permanent water bodies. 29
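The change-detection step can be sketched with a simple pairwise rule (a production U-Net learns far richer texture cues; the threshold here is illustrative):

```python
# Change-detection sketch: compare event backscatter against a dry reference
# to separate new inundation from permanent water bodies. Threshold is an
# illustrative value in dB.
def new_inundation(event_db, reference_db, water_db=-18.0):
    """True where the event image is water-dark but the reference was not."""
    return [[e < water_db and r >= water_db
             for e, r in zip(event_row, ref_row)]
            for event_row, ref_row in zip(event_db, reference_db)]

event     = [[-21.0, -20.5], [-7.9, -19.8]]
reference = [[-20.8,  -6.2], [-8.1,  -7.5]]
# Pixel (0,0) is permanent water (dark in both images); pixels (0,1) and
# (1,1) are newly inundated.
```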

3.3 Fusion of Optical and Radar Data

The cutting edge of Deep AI lies in Sensor Fusion. While SAR provides the all-weather extent, optical imagery (when available) provides superior spectral information regarding land cover and water turbidity. By combining these modalities, insurers can eliminate false positives. For example, a "dark" area in SAR might be a smooth tarmac parking lot or a shadow; by cross-referencing with optical indices like the Normalized Difference Water Index (NDWI), the model can confirm whether the pixel is truly water. 31

Research demonstrates that fusing SAR with optical data using machine learning classifiers (like Random Forest or CNNs) significantly improves classification accuracy, achieving rates exceeding 92% even in complex landscapes. 30
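The fusion logic for a single pixel can be sketched as follows, assuming calibrated green and near-infrared reflectances are available; the 0.2 NDWI cutoff and the band values are illustrative.

```python
# Optical cross-check for a SAR-dark pixel. NDWI = (Green - NIR)/(Green + NIR)
# is strongly positive over water, so a SAR-dark pixel with low NDWI (e.g. a
# smooth tarmac lot) is demoted to "not water". Cutoff is illustrative.
def ndwi(green: float, nir: float) -> float:
    return (green - nir) / (green + nir)

def confirmed_water(sar_dark: bool, green: float, nir: float,
                    ndwi_cutoff: float = 0.2) -> bool:
    return sar_dark and ndwi(green, nir) > ndwi_cutoff

# SAR-dark pixel over turbid floodwater vs. over a dark parking lot:
# confirmed_water(True, green=0.30, nir=0.05) -> True
# confirmed_water(True, green=0.12, nir=0.11) -> False
```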

3.4 Operationalizing SAR for Claims Triage

For the insurer, the value of SAR lies in speed and scale. Commercial constellations like ICEYE provide near real-time flood depth and extent data, often within 24 hours of an event's peak. 34 This allows for immediate claims triage :

●​ Total Insurable Value (TIV) at Risk: Insurers can overlay the SAR flood footprint on their portfolio to instantly estimate potential losses.

●​ Resource Allocation: Adjusters can be deployed specifically to properties where the satellite confirms inundation, avoiding wasted trips to dry areas.

●​ Fraud Detection: Historical SAR data serves as an immutable record of the event. If a claimant alleges flood damage for a property that SAR data confirms was dry, the claim can be flagged for special investigation. 35

4. The Simulation Layer: Physics-Informed Machine Learning (PINNs)

While CV and SAR provide observational data, they describe the present or the past. To underwrite the future, insurers must rely on simulation. Traditional hydrodynamic models (solving the Saint-Venant equations) are physically accurate but computationally exorbitant, often taking hours or days to model a single catchment at high resolution. 36 Conversely, purely data-driven Deep Learning models are fast but can hallucinate physically impossible scenarios, such as water appearing without a source or violating mass conservation. 37

The solution is Physics-Informed Machine Learning (PINNs)—a hybrid architecture that embeds physical laws directly into the neural network.

4.1 Embedding Conservation Laws

PINNs represent a fundamental advance in modeling. In a PINN, the neural network is trained not only to minimize the error between its predictions and the training data (Data Loss) but also to minimize the "residuals" of the governing physical equations (Physics Loss). 36

The Loss Function Architecture:

Loss_Total = Loss_Data + λ · Loss_Physics

Where Loss_Physics penalizes the network if its output violates the partial differential equations (PDEs) of fluid dynamics, specifically:

●​ Conservation of Mass (Continuity Equation): Ensures that water does not spontaneously appear or disappear.

●​ Conservation of Momentum: Ensures that water flow velocity respects gravity, friction, and pressure gradients. 37
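A minimal sketch of the composite loss, using finite differences on the 1-D continuity equation as a stand-in for the automatic differentiation a real PINN framework would use; the grid shapes and λ are illustrative.

```python
import numpy as np

# Composite PINN-style loss sketch. `h_pred` is the network's water-depth
# field on a (time, space) grid; `q_pred` is the discharge field (h * u).
# Finite differences stand in for autodiff purely for illustration.
def physics_residual(h_pred, q_pred, dt, dx):
    """Residual of the 1-D continuity equation: dh/dt + dq/dx = 0."""
    dh_dt = (h_pred[1:, :-1] - h_pred[:-1, :-1]) / dt
    dq_dx = (q_pred[:-1, 1:] - q_pred[:-1, :-1]) / dx
    return dh_dt + dq_dx

def total_loss(h_pred, h_obs, q_pred, dt, dx, lam=1.0):
    """Loss_Total = Loss_Data + lambda * Loss_Physics."""
    data_loss = np.mean((h_pred - h_obs) ** 2)
    phys_loss = np.mean(physics_residual(h_pred, q_pred, dt, dx) ** 2)
    return data_loss + lam * phys_loss

# A uniform, steady field satisfies continuity exactly, so with perfect
# observations the total loss collapses to zero.
h = np.ones((4, 4))
q = np.ones((4, 4))
```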

By constraining the search space of the neural network with these physical laws, PINNs achieve two critical advantages:

1.​ Data Efficiency: They require significantly less training data to reach convergence because the "rules of the game" are already known.

2.​ Generalizability: Unlike standard AI, which fails when encountering scenarios outside its training distribution (e.g., an unprecedented 500-year storm), PINNs remain robust because the underlying physics do not change. 39

4.2 Graph Neural Networks (GNNs) for Hydrological Routing

Floodwater does not exist in isolation; it flows through a connected network of rivers, streets, and pipes. Graph Neural Networks (GNNs) are uniquely suited to model this topology. In a GNN flood model, the landscape is represented as a graph where nodes correspond to spatial locations (pixels or parcels) and edges represent the pathways for water flow. 40

Recent architectures, such as HydroGraphNet, utilize GNNs to perform autoregressive forecasting. These models learn the spatial relationships of the watershed—how rainfall in the upper basin propagates to the urban center hours later. By encoding physical constraints (like slope and roughness) into the edges of the graph, these models can predict water depth and velocity across thousands of nodes in milliseconds, serving as ultra-fast surrogates for traditional hydraulic solvers. 42
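One message-passing step of such a routing surrogate can be sketched as follows. In a real GNN (e.g. a HydroGraphNet-style model) the per-edge update is learned; the node names and conductances below are hand-set purely for illustration.

```python
# One message-passing step of a toy hydrological routing surrogate: each
# directed edge moves water downhill at a rate scaled by an edge weight
# standing in for slope/roughness. Mass is conserved by construction.
def routing_step(depth, edges, dt=1.0):
    """depth: {node: metres of water}; edges: [(src, dst, conductance)]."""
    new = dict(depth)
    for src, dst, k in edges:
        flow = k * depth[src] * dt   # linear-reservoir style transfer
        new[src] -= flow
        new[dst] += flow
    return new

basin = {"upper": 2.0, "mid": 0.5, "urban": 0.1}
edges = [("upper", "mid", 0.3), ("mid", "urban", 0.2)]
state = routing_step(basin, edges)
# Water propagates downstream: "upper" loses depth, "urban" gains it, and
# the basin total is unchanged.
```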

4.3 Surrogate Modeling for Real-Time Pricing

The primary application of PINNs in underwriting is Surrogate Modeling. Instead of running a computationally heavy HEC-RAS simulation for every policy quote, an insurer can train a PINN to approximate the HEC-RAS output. Once trained, the PINN can run thousands of stochastic climate scenarios for a specific property in real-time. 36

This capability allows for dynamic, probabilistic pricing. Rather than relying on a static "Zone AE" rate, the underwriting engine can simulate the property's response to a spectrum of potential storm events—from a common afternoon downpour to a Category 5 hurricane—and generate a premium that reflects the true, integrated risk profile of that specific location. 42

5. Actuarial Transformation: From Probabilistic Averages to Deterministic Reality

The adoption of Deep AI necessitates a fundamental restructuring of actuarial models. Moving from zip code aggregates to pixel-level physics alters the core metrics of insurance profitability.

5.1 Refining the Combined Ratio

The Combined Ratio (the sum of incurred losses and expenses divided by earned premiums) is the definitive metric of insurer health. A ratio above 100% indicates an underwriting loss. In recent years, the homeowners' insurance combined ratio has deteriorated, averaging 101.5% and peaking at 110.5% in 2023. 43 Deep AI addresses every component of this ratio:

●​ Loss Ratio Reduction: By eliminating adverse selection and accurately identifying high-risk properties hidden within "safe" zones, insurers can avoid catastrophic claims. Furthermore, mitigation-aware scoring encourages risk-reducing behaviors in the insured population. 43

●​ Expense Ratio Reduction: Automated FFE extraction and SAR-based claims triage significantly reduce the need for manual inspections and field adjusting. The automation of "straight-through processing" for low-complexity quotes lowers customer acquisition costs. 8

●​ Premium Optimization: Deep AI allows insurers to identify "good risks" in "bad zones"—for example, a home in a flood zone that is elevated 4 feet above the Base Flood Elevation. Legacy insurers might reject this risk or overprice it; a Deep AI-enabled insurer can write it at a competitive, profitable rate, gaining market share. 8
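The arithmetic of the metric itself is simple; the book figures below are purely illustrative, not industry data.

```python
# Combined ratio = (incurred losses + expenses) / earned premium.
# Above 1.0 (100%) the book is an underwriting loss.
def combined_ratio(losses: float, expenses: float, earned_premium: float) -> float:
    return (losses + expenses) / earned_premium

# Illustrative book: $70M losses + $33M expenses on $100M earned premium
# is a 103% combined ratio; trimming losses (better selection) and expenses
# (automated triage) pushes it below 100%.
before = combined_ratio(70e6, 33e6, 100e6)   # 1.03
after  = combined_ratio(62e6, 30e6, 100e6)   # 0.92
```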

5.2 Closing the Protection Gap

The "Protection Gap"—the difference between total economic losses and insured losses—is widening. In the US, average flood claims are nearly $34,000, yet only a fraction of properties are insured, often due to the high cost or unavailability of NFIP policies. 1

Deep AI enables the creation of private flood products that can compete with or supplement the NFIP. By understanding risk at a granular level, private carriers can offer coverage to properties that fall outside the rigid NFIP guidelines. Furthermore, this granularity supports Parametric Insurance. In a parametric model, a payout is triggered automatically if a specific physical parameter is met—for example, if SAR data confirms a flood depth of 30 cm at the property's coordinates. This eliminates the lengthy claims adjustment process and provides immediate liquidity to the policyholder, fostering greater resilience. 46
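A parametric trigger reduces to a threshold check against the satellite-derived depth at the insured coordinates; the contract terms below are illustrative, not a real product.

```python
# Parametric payout sketch: pay a fixed amount when the observed flood depth
# at the insured location meets the contracted trigger. Terms are illustrative.
TRIGGER_DEPTH_M = 0.30
PAYOUT_USD = 25_000

def parametric_payout(observed_depth_m: float) -> int:
    """Binary payout: full contracted amount if the trigger is met, else zero."""
    return PAYOUT_USD if observed_depth_m >= TRIGGER_DEPTH_M else 0

# parametric_payout(0.42) -> 25000
# parametric_payout(0.12) -> 0
```

Because the trigger is an objective, satellite-measured quantity, no loss adjustment is needed before funds are released.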

5.3 The Role of Reinsurance and Capital Allocation

Reinsurers are the ultimate backstops of risk. They are increasingly demanding transparency into the underlying portfolios of primary carriers. A carrier that can demonstrate a portfolio underwritten with pixel-level FFE data and monitored via SAR offers a "higher quality" risk pool. This transparency can lead to more favorable reinsurance treaties and optimized capital allocation. Instead of holding capital reserves based on broad, uncertain probabilistic curves, insurers can hold reserves based on deterministic, physics-based vulnerability assessments. 23

6. Strategic Implementation and Future Trajectories

For Veriprajna's clientele, the transition to Deep AI is not a matter of if, but when. The following roadmap outlines the strategic steps for enterprise adoption.

6.1 Data Pipeline Integration

Insurers must move from static data ingestion to dynamic pipelines.

●​ Ingest: Establish API connections with orbital data providers (e.g., ICEYE, Planet) and aerial intelligence firms (e.g., Zesty.ai, Cape Analytics).

●​ Process: Deploy cloud-native environments capable of handling massive geospatial datasets. Implementation of GNNs requires graph-structured databases rather than traditional tabular SQL rows. 42

●​ Act: Integrate model outputs directly into the policy administration system (Guidewire, Duck Creek) to enable real-time rating and automated underwriting decisions. 23

6.2 Regulatory Compliance and "Glass Box" AI

As AI becomes central to pricing, regulatory scrutiny regarding bias and explainability will intensify. The "Black Box" nature of traditional deep learning is a liability. Veriprajna must advocate for "Glass Box" AI —specifically PINNs. Because PINNs are grounded in explicit physical equations (Saint-Venant), their outputs are interpretable and defensible to state departments of insurance. An insurer can demonstrate that a premium increase is due to a physically modeled hydraulic risk, rather than an opaque algorithmic correlation. 40

6.3 The "Living" Risk Model

The concept of a static "renewal" is becoming obsolete. Risk models must be "living." If a SAR satellite detects land subsidence in a region, or if a neighbor paves over a permeable lawn, the risk score of the adjacent properties should update dynamically. This continuous underwriting model allows for mid-term adjustments and proactive risk consulting—transforming the insurer from a payer of claims to a partner in resilience. 24

6.4 Conclusion

The era of underwriting flood risk based on 1980s paper maps and zip code averages is effectively over. The convergence of Hyper-Local Computer Vision, Synthetic Aperture Radar, and Physics-Informed Machine Learning allows for a transition to Pixel-Level Precision. This shift is not merely technical; it is economic and ethical. It enables the fair pricing of risk, the incentivization of mitigation, and the long-term solvency of the insurance mechanism in a climate-altered world. For Veriprajna, the opportunity is to guide the industry through this transformation, replacing the uncertainty of the past with the calculated precision of Deep AI.

Report produced by the Office of AI Strategy & Risk Architecture. Date: December 10, 2025. Client: Veriprajna.

Works cited

  1. Are Outdated Flood Zone Maps Leaving Insureds at Risk?, accessed December 10, 2025, https://sagesure.com/insurance-insights/are-outdated-flood-zone-maps-leaving-insureds-at-risk/

  2. Moving Beyond the Essentials - Page 2 of 5 - Flood Science Center, accessed December 10, 2025, https://floodsciencecenter.org/products/elected-officials-flood-risk-guide/moving-beyond-the-essentials/2/

  3. Flood Risk Information | Realtor.com®, accessed December 10, 2025, https://www.realtor.com/flood-risk/

  4. Understanding FEMA Flood Maps and Limitations | First Street, accessed December 10, 2025, https://firststreet.org/research-library/understanding-fema-flood-maps-and-limitations

  5. Many Americans Lack Flood Insurance Despite Rising Risks — Here's Why, accessed December 10, 2025, https://cnr.ncsu.edu/news/2022/11/flood-maps/

  6. Understand the differences between FEMA flood zones - First Street™, accessed December 10, 2025, https://help.firststreet.org/hc/en-us/articles/360048256493-Understand-the-difefrences-between-FEMA-flood-zones

  7. Full article: Recent advances and future challenges in urban pluvial flood modelling, accessed December 10, 2025, https://www.tandfonline.com/doi/full/10.1080/1573062X.2024.2446528

  8. Parcel-Level Intelligence | Federato, accessed December 10, 2025, https://www.federato.ai/library/post/parcel-level-intelligence-the-missing-element-in-traditional-underwriting

  9. NBER WORKING PAPER SERIES HOW ARE INSURANCE MARKETS ADAPTING TO CLIMATE CHANGE? RISK CLASSIFICATION AND PRICING IN THE MARKET FO, accessed December 10, 2025, https://www.nber.org/system/files/working_papers/w32625/w32625.pdf

  10. The effect of information about climate risk on property values - PNAS, accessed December 10, 2025, https://www.pnas.org/doi/10.1073/pnas.2003374118

  11. Who Bears Flood Risk? Evidence from Mortgage Markets in Florida - Bank of England, accessed December 10, 2025, https://www.bankofengland.co.uk/-/media/boe/files/events/2023/june/p-sastry-paper.pdf

  12. FEMA Understanding Elevation Certificates Fact Sheet - National Flood Insurance Program, accessed December 10, 2025, https://agents.floodsmart.gov/sites/default/files/media/document/2025-07/fema-nfip-understanding-elevation-certificates-fact-sheet-03-2023.pdf

  13. Federal Flood Damage Estimation Guidelines for Buildings and Infrastructure, accessed December 10, 2025, https://natural-resources.canada.ca/science-data/science-research/natural-hazards/flood-mapping/federal-flood-damage-estimation-guidelines-buildings-infrastructure

  14. Frontiers | Flood risk assessment for residences at the neighborhood ..., accessed December 10, 2025, https://repository.library.noaa.gov/view/noaa/50047/noaa_50047_DS2.htm

  15. Methodology for Virtual Damage Assessment and First-Floor ..., accessed December 10, 2025, https://ascelibrary.org/doi/10.1061/NHREFO.NHENG-2310

  16. A Novel Method for Estimating Building Height from Baidu Panoramic Street View Images, accessed December 10, 2025, https://www.mdpi.com/2220-9964/14/8/297

  17. Robust Building Identification from Street Views Using Deep Convolutional Neural Networks, accessed December 10, 2025, https://www.mdpi.com/2075-5309/14/3/578

  18. Exploring the vertical dimension of street view image: lowest floor elevation estimation, accessed December 10, 2025, https://giscience.psu.edu/exploring-the-vertical-dimension-of-street-view-image-lowest-floor-elevation-estimation/

  19. An open-source system for building-height estimation using street-view images, deep learning, and building footprints - Statistique Canada, accessed December 10, 2025, https://www150.statcan.gc.ca/n1/pub/18-001-x/18-001-x2020002-eng.htm

  20. Height/Elevation of a pixel from ground in Google Street View - Stack Overflow, accessed December 10, 2025, https://stackoverflow.com/questions/48219613/height-elevation-of-a-pixel-from-ground-in-google-street-view

  21. Real Estate Property Intelligence - CAPE Analytics, accessed December 10, 2025, https://capeanalytics.com/real-estate-property-intelligence/

  22. CAPE Analytics: Homepage, accessed December 10, 2025, https://capeanalytics.com/

  23. Resources - ZestyAI, accessed December 10, 2025, https://zesty.ai/resources

  24. An Urban Density-Based Runoff Simulation Framework to Envisage Flood Resilience of Cities - MDPI, accessed December 10, 2025, https://www.mdpi.com/2413-8851/7/1/17

  25. What Property Underwriters Can Learn From Life Insurers Part II, accessed December 10, 2025, https://www.bhspecialty.com/wp-content/uploads/2021/01/BHSI-What-Property-Underwriters-Can-Learn-From-Life-Insurers-PART2-March-2018.pdf

  26. Reports & Research | ZestyAI Insights for Insurers, accessed December 10, 2025, https://zesty.ai/resources/research

  27. DUAL Strengthens Storm Risk Underwriting and Rating With ZestyAI, accessed December 10, 2025, https://zesty.ai/resource/dual

  28. Comprehensive Guide to Floodplain Mapping Using SAR ... - FlyPix AI, accessed December 10, 2025, https://flypix.ai/floodplain-mapping/

  29. Map floods with SAR data and deep learning | Documentation, accessed December 10, 2025, https://learn.arcgis.com/en/projects/map-floods-with-sar-data-and-deep-learning/

  30. Combining SAR images with land cover products for rapid urban flood mapping - Frontiers, accessed December 10, 2025, https://www.frontiersin.org/journals/environmental-science/articles/10.3389/fenvs.2022.973192/full

  31. Generating Flood Probability Map Based on Combined Use of Synthetic Aperture Radar and Optical Imagery - NASA Technical Reports Server (NTRS), accessed December 10, 2025, https://ntrs.nasa.gov/api/citations/20210016467/downloads/Generating%20Flood%20Probability%20Map%20Based%20on%20Combined%20Use%20of%20Synthetic%20Aperture%20Radar%20and%20Optical%20Imagery.pdf

  32. Automatic Flood Monitoring Method with SAR and Optical Data Using Google Earth Engine, accessed December 10, 2025, https://www.mdpi.com/2073-4441/17/2/177

  33. Flood detection and mapping through multi-resolution sensor fusion: integrating UAV optical imagery and satellite SAR data - Taylor & Francis Online, accessed December 10, 2025, https://www.tandfonline.com/doi/full/10.1080/19475705.2025.2493225?af=R

  34. Flood briefings | Resources - ICEYE, accessed December 10, 2025, https://www.iceye.com/resources/flood-briefings

  35. Flood Insights - ICEYE, accessed December 10, 2025, https://www.iceye.com/hubfs/_SOLUTIONS/2-Pager%20Flood%20Insights%20INS%20US_2024.pdf?utm_campaign=Reuters%3A%20Connected%20Claims%20USA%202024&utm_source=linkedin&utm_medium=paidsocial&utm_content=Image%20Ad%20%28Lead%20form%29%20%7C%20v1.1%20Flood%20Insights%20US%202-pager%20%7C%20Connected%20Claims%202024

  36. Physics-Informed Neural Network Surrogate Models for River Stage Prediction - arXiv, accessed December 10, 2025, https://arxiv.org/html/2503.16850v1

  37. Physics Informed Neural Networks for 1D Flood Routing - ResearchGate, accessed December 10, 2025, https://www.researchgate.net/publication/364587823_Physics_Informed_Neural_Networks_for_1D_Flood_Routing

  38. Physics-informed neural networks for flood modelling: A conceptual framework - ResearchGate, accessed December 10, 2025, https://www.researchgate.net/publication/393897896_Physics-informed_neural_networks_for_flood_modelling_A_conceptual_framework

  39. Enhancing Urban Flood Prediction Accuracy with Physics-Informed Neural Networks: A Case Study in Real-Time Rainfall Data Integration - University of Hertfordshire (Research Profiles), accessed December 10, 2025, https://researchprofiles.herts.ac.uk/en/publications/enhancing-urban-flood-prediction-accuracy-with-physics-informed-n/

  40. Field-theory inspired physics-informed graph neural network for reliable traffic flow prediction under urban flooding - IDEAS/RePEc, accessed December 10, 2025, https://ideas.repec.org/a/eee/reensy/v265y2026ipas0951832025006878.html

  41. Physics Informed Machine Learning for Flood Prediction, accessed December 10, 2025, https://agu.confex.com/agu/fm23/meetingapp.cgi/Paper/1382505

  42. HydroGraphNet: Interpretable Physics-Informed Graph Neural Networks for Flood Forecasting — NVIDIA PhysicsNeMo Framework, accessed December 10, 2025, https://docs.nvidia.com/physicsnemo/25.08/physicsnemo/examples/weather/flood_modeling/hydrographnet/README.html

  43. De-risking Risky Business: Climate and Insurance Markets - R Street Institute, accessed December 10, 2025, https://www.rstreet.org/commentary/de-risking-risky-business-climate-and-insurance-markets/

  44. Addressing Climate and Other Risks: The ESG and EVA Performance of Insurance Companies - ISS Insights, accessed December 10, 2025, https://insights.issgovernance.com/posts/addressing-climate-and-other-risks-the-esg-and-eva-performance-of-insurance-companies/

  45. 3 Ways AI is Transforming Property Insurance - CAPE Analytics, accessed December 10, 2025, https://capeanalytics.com/blog/ai-property-insurance/

  46. Technical Report | Allianz.com, accessed December 10, 2025, https://www.allianz.com/content/dam/onemarketing/azcom/Allianz_com/sustainability/Project-C5.23-ISF-Ghana-Parametric-Flood-Solution-Technical-Report-ForPublication.pdf

  47. How ZestyAI Models Work: A Deep Dive into Property-Level Risk, accessed December 10, 2025, https://zesty.ai/news/model-deep-dive

  48. Human-centered flood mapping and intelligent routing through augmenting flood gauge data with crowdsourced street photos - the NOAA Institutional Repository, accessed December 10, 2025, https://repository.library.noaa.gov/view/noaa/57328/noaa_57328_DS1.pdf
