Enterprise Resource Planning (ERP) refers to complex software systems that help companies optimize their business decisions. The foundation of good business decisions is the right information, which raises the questions of where to find this valuable information and how to integrate it into business processes. Companies collect and process all kinds of data in order to extract the information they need for effective business decisions.

Earth observation data is becoming more and more important for business analysts all over the world. Its value is not measured only by the resolution and visual spectrum of the imagery. Satellite images contain much more information in the non-visual spectrum, stored in multiple bands, which requires domain-specific knowledge but enables new, disruptive business models. Furthermore, satellites continuously monitor our planet, which ensures full coverage and makes it possible to collect and archive historical data. Especially the temporal aspect of geo-data is extremely useful for business analytics such as time series and change detection queries.

Nevertheless, earth observation data analysis is still a very scientific and complicated topic, and most business users are not able to consume raw imagery, mainly because of infrastructure and complexity challenges. To simplify the information flow, an abstraction layer needs to handle this complexity transparently. It contains solutions for pre-filtering relevant images, such as cloud-free imagery, and built-in algorithms created with expert knowledge, in order to provide valuable and easy-to-understand information for the end user. This information can be delivered through microservices to all kinds of clients such as GIS systems, business applications or custom apps. Developers can easily consume this information and embed it into reports and analytic applications if they understand its value and meaning and the interface follows current standards. Such an abstract interface opens the door for earth observation data to a broader audience: the business and developer community. Finally, users may not even know that specific KPIs, values or other indicators are coming from a complex temporal image processing chain.
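To illustrate how such an abstraction could look from a developer's perspective, the following sketch consumes a hypothetical microservice endpoint that returns a ready-to-use indicator instead of raw imagery. The URL, path, parameter and field names are assumptions for illustration, not a documented product API.

```python
import requests

# Hypothetical earth observation microservice; URL and parameters are
# illustrative assumptions, not an actual product interface.
BASE_URL = "https://example.com/eo-analysis/v1"

def fetch_vegetation_kpi(aoi_wkt: str, date: str) -> dict:
    """Request an aggregated, easy-to-understand indicator (e.g. mean NDVI)
    for an area of interest instead of downloading raw satellite imagery."""
    response = requests.get(
        f"{BASE_URL}/indicators/ndvi",
        params={"aoi": aoi_wkt, "date": date, "aggregation": "mean"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"indicator": "ndvi_mean", "value": 0.63}

if __name__ == "__main__":
    aoi = "POLYGON((-120 35, -119 35, -119 36, -120 36, -120 35))"
    kpi = fetch_vegetation_kpi(aoi, "2020-08-01")
    print(f"Vegetation KPI for the report: {kpi['value']}")
```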

Industries such as the public sector, utilities, security and insurance have already been leveraging earth observation data for a long time. They have some knowledge about satellite data but still face many challenges, such as managing IT infrastructure, service level agreements, missing standards and expert knowledge. Other industries, such as retail, are completely new to imagery, and most of them do not use it because of insufficient experience and knowledge in conjunction with high complexity.

But now, by using the information provided by the cloud service SAP HANA Earth Observation Analysis, customers can focus on their business and still integrate earth observation data without having to care about its complex processing. In the scenario below, we describe how an insurance company can use a simple REST API for analyzing natural disasters. We focus on wildfires, but the approach could easily be extended to floods, storms and other events.

Wildfire events have certain important characteristics such as the time of first detection, the duration of the active fire, its radius, temperature, burned area (called the footprint) and so on. Since all of this information can be extracted from raw satellite imagery, non-stop and for the whole planet, satellite imagery is an ideal source for establishing a historical database of past fire and wildfire events. Very often, the first step for GIS experts is analyzing such past events of natural disasters (see Fig. 1).
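As a minimal sketch of what one record in such a historical database could contain, the following Python dataclass models the characteristics listed above. The field names, types and units are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WildfireEvent:
    """One historical wildfire event derived from satellite imagery.
    Field names and units are illustrative assumptions."""
    event_id: str
    first_detected: datetime      # first time the fire was observed
    duration_days: float          # how long the fire was active
    radius_km: float              # approximate radius of the fire
    max_temperature_k: float      # brightness temperature from thermal bands
    footprint_wkt: str            # burned area as a Well-Known Text polygon
    burned_area_km2: float        # size of the footprint

# Example record as it might be stored in the historical database
event = WildfireEvent(
    event_id="US-2020-001",
    first_detected=datetime(2020, 8, 16, 14, 30),
    duration_days=42.0,
    radius_km=18.5,
    max_temperature_k=420.0,
    footprint_wkt="POLYGON((-121.9 37.1, -121.5 37.1, -121.5 37.4, "
                  "-121.9 37.4, -121.9 37.1))",
    burned_area_km2=390.0,
)
```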

Fig. 1: Historical fire events in the USA

Importantly, experts then also have the chance to learn from this historical information. They need to understand what characterizes large fire events, what makes an area critical, under which circumstances natural disasters occur, and how these factors change over time and space. Such knowledge can subsequently be used for risk mitigation, a crucial task for an insurance company.
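A hedged sketch of such an analysis, assuming the historical events are available as a table with year, region, duration and burned area columns (the column names and values are illustrative), could look like this:

```python
import pandas as pd

# Assumed historical fire-event table; in practice this would be loaded
# from the historical database described above.
events = pd.DataFrame({
    "year":            [2017, 2017, 2018, 2019, 2019, 2020],
    "region":          ["CA", "OR", "CA", "CA", "WA", "CA"],
    "duration_days":   [12, 30, 25, 8, 40, 42],
    "burned_area_km2": [50, 310, 180, 20, 500, 390],
})

# How does the total burned area change over time?
burned_per_year = events.groupby("year")["burned_area_km2"].sum()

# Which regions are critical, i.e. accumulate the largest burned area?
critical_regions = (
    events.groupby("region")["burned_area_km2"]
    .sum()
    .sort_values(ascending=False)
)

print(burned_per_year)
print(critical_regions)
```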

Each time a new fire event is detected, earth observation data can be used to assess the area and inform the insurance company about the wildfire footprint, so that it can apply its knowledge and business data to determine affected customers and simulate the maximum exposure (see Fig. 2).
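The following sketch illustrates this step with the shapely library, assuming the footprint arrives as a polygon and the insurer's customer records carry a location and an insured value. All names, coordinates and amounts are made up for illustration.

```python
from shapely import wkt
from shapely.geometry import Point

# Wildfire footprint as delivered by the earth observation service (assumed WKT)
footprint = wkt.loads(
    "POLYGON((-121.9 37.1, -121.5 37.1, -121.5 37.4, -121.9 37.4, -121.9 37.1))"
)

# Simplified customer records from the insurer's business data (illustrative)
customers = [
    {"id": "C1", "location": Point(-121.7, 37.2), "insured_value": 350_000},
    {"id": "C2", "location": Point(-121.6, 37.3), "insured_value": 500_000},
    {"id": "C3", "location": Point(-120.9, 37.8), "insured_value": 420_000},
]

# Affected customers: those whose location falls inside the footprint
affected = [c for c in customers if footprint.contains(c["location"])]

# Maximum exposure: the total insured value within the footprint
max_exposure = sum(c["insured_value"] for c in affected)
print(f"{len(affected)} affected customers, maximum exposure {max_exposure}")
```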

Fig. 2: Change detection based on NDVI

To calculate the area affected by a wildfire, the footprint, the insurance company can use a simple REST call to a service as mentioned above, with parameters for the area of interest, two timestamps (before and during the event), a threshold parameter and the return type. This call then returns a representation, as specified, of how the area of interest has changed between the two provided timestamps (see Fig. 2). By regularly checking the earth observation catalogue of the service for new images of the affected area, the insurance company can analyze how much the wildfire has changed and store each footprint in GeoJSON, Well-Known Text (WKT) or another vector data representation. This leads to a continuously updated, accurate footprint of the wildfire, which the insurance company can use to calculate intersections with its business data and run buffer queries to find customers near the risk area.
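A minimal sketch of such a call, assuming a hypothetical change-detection endpoint, could look like the following. The URL, parameter names and response format are assumptions; the actual service interface may differ.

```python
import json
import requests

# Hypothetical change-detection endpoint; URL and parameter names are assumptions.
SERVICE_URL = "https://example.com/eo-analysis/v1/change-detection"

params = {
    "aoi": "POLYGON((-121.9 37.1, -121.5 37.1, -121.5 37.4, "
           "-121.9 37.4, -121.9 37.1))",
    "timestamp_before": "2020-08-01T00:00:00Z",  # before the event
    "timestamp_during": "2020-08-20T00:00:00Z",  # during the event
    "threshold": 0.3,                            # sensitivity of the NDVI change detection
    "return_type": "geojson",                    # requested footprint representation
}

response = requests.get(SERVICE_URL, params=params, timeout=60)
response.raise_for_status()
footprint_geojson = response.json()

# Store the footprint so the time series of footprints can be analyzed later
with open("footprint_2020-08-20.geojson", "w") as f:
    json.dump(footprint_geojson, f)
```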

With the help of historical data, it is even possible to analyze the current risk status of an area of interest. To achieve this, neural networks can be trained with data sets from the past 30 to 60 days, for example. The network calculates a prediction score between 0.0 and 1.0 for the next 15 days, which describes the probability of a fire in that area during that time.
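One way to prepare such training data, sketched here under the assumption that daily per-cell feature values and a daily fire label are already available as arrays, is a simple sliding-window construction. The window lengths follow the text; everything else is illustrative.

```python
import numpy as np

def build_training_windows(features, fire_label, history_days=60, horizon_days=15):
    """Turn daily observations for one grid cell into (window, target) pairs.

    features:   array of shape (num_days, num_features), e.g. NDVI, LAI, ...
    fire_label: array of shape (num_days,), 1 if a fire was observed that day
    Returns windows covering the past `history_days` and a binary target that
    is 1 if any fire occurs within the following `horizon_days`.
    """
    X, y = [], []
    for end in range(history_days, len(features) - horizon_days):
        X.append(features[end - history_days:end])
        y.append(float(fire_label[end:end + horizon_days].any()))
    return np.array(X), np.array(y)

# Illustrative dummy data: 365 days, 5 features per day for one grid cell
rng = np.random.default_rng(0)
daily_features = rng.random((365, 5))
daily_fires = (rng.random(365) > 0.98).astype(int)

X, y = build_training_windows(daily_features, daily_fires)
print(X.shape, y.shape)  # e.g. (290, 60, 5) windows and their targets
```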

This information can then be displayed as a hazard map (see Fig. 3). Here, the area is divided into grid cells, and each cell is categorized somewhere between very low and very high according to the probability of a fire event occurring in this area.
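A small sketch of this categorization, assuming five equally spaced probability bands (the concrete thresholds are an assumption; a real hazard map may use calibrated breaks):

```python
def risk_category(score: float) -> str:
    """Map a fire probability score (0.0 to 1.0) to a hazard-map category.
    The equal-width bands are an illustrative assumption."""
    if score < 0.2:
        return "very low"
    elif score < 0.4:
        return "low"
    elif score < 0.6:
        return "medium"
    elif score < 0.8:
        return "high"
    return "very high"

# Example: scores predicted for three grid cells
for cell, score in {"cell_a": 0.05, "cell_b": 0.55, "cell_c": 0.91}.items():
    print(cell, risk_category(score))
```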

Fig. 3: Wildfire risk map

Different kinds of neural network technologies can be used, such as Long Short-Term Memory (LSTM) or Convolutional Neural Networks (CNN). Most important is the training data, which can all be derived from satellite data. For wildfires, companies can use surface information about vegetation and fires such as the Normalized Difference Vegetation Index (NDVI), Leaf Area Index (LAI), Fraction of Absorbed Photosynthetically Active Radiation (FAPAR), Dry Matter Productivity (DMP) and Burned Area (BA).
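As a hedged sketch of how such a model could be wired up with Keras, an LSTM can consume the feature windows built above, one time step per day and one input per index. The architecture and hyperparameters are assumptions for illustration, not the service's actual model.

```python
import tensorflow as tf

# Five daily indices per time step: NDVI, LAI, FAPAR, DMP, BA (assumed ordering)
history_days, num_features = 60, 5

# Minimal LSTM that maps a 60-day feature window to a fire probability
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(history_days, num_features)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # score between 0.0 and 1.0
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# X, y as produced by the sliding-window sketch above:
# model.fit(X, y, epochs=10, batch_size=32)
# risk_score = model.predict(X[-1:])[0, 0]  # probability of fire in the next 15 days
```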

For the insurance company, this kind of risk analysis is important for better loss prediction, reduced accumulation losses, optimized portfolio steering, improved claims management and risk mitigation.