Web scraping

Web scraping is the process of using programming tools to extract data from websites. Here, we use Python to download, save, unzip, clean and export weather events data.
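The download-save-unzip-clean-export pipeline described above can be sketched as follows. This is a minimal illustration using only the Python standard library; the URL and the column names (`EVENT_TYPE`, `STATE`) are assumptions for the sake of the example, not the project's actual source.

```python
import csv
import io
import zipfile
from urllib.request import urlretrieve

# Hypothetical URL -- substitute the real weather events archive.
URL = "https://example.com/StormEvents_2020.zip"

def clean_rows(rows):
    """Keep rows with a non-empty event type and strip whitespace."""
    cleaned = []
    for row in rows:
        event = row.get("EVENT_TYPE", "").strip()
        if event:
            cleaned.append({"event": event, "state": row.get("STATE", "").strip()})
    return cleaned

def run(url=URL, archive="events.zip", out="events_clean.csv"):
    urlretrieve(url, archive)               # download and save
    with zipfile.ZipFile(archive) as zf:    # unzip
        name = zf.namelist()[0]
        with zf.open(name) as fh:
            text = io.TextIOWrapper(fh, encoding="utf-8")
            rows = clean_rows(csv.DictReader(text))  # clean
    with open(out, "w", newline="") as fh:  # export
        writer = csv.DictWriter(fh, fieldnames=["event", "state"])
        writer.writeheader()
        writer.writerows(rows)
```

In practice a library such as pandas would replace the `csv` handling, but the steps are the same.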

Artificial Intelligence

Artificial intelligence is the ability of machines to learn from a repetitive process, adjust for new inputs and perform human-like tasks. Here, we apply a convolutional neural network (CNN) with the U-Net architecture to model weather events. In this project, we generate heatmaps, provide them as inputs and use them to predict weather events based on the images.
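The core U-Net idea is an encoder that contracts the image, a decoder that expands it back, and skip connections that carry fine detail across. The toy NumPy sketch below illustrates one encoder/decoder level on a single-channel image (assuming even dimensions); it is a schematic of the mechanism only, not the deep-learning model used in the project.

```python
import numpy as np

def conv2d(x, k):
    """'Same' 2-D convolution via zero padding (single channel)."""
    kh, kw = k.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def down(x):
    """2x2 max pooling (encoder contraction)."""
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def up(x):
    """Nearest-neighbour upsampling (decoder expansion)."""
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

def tiny_unet(x, k_enc, k_dec):
    """One encoder level, one decoder level, one skip connection."""
    e = np.maximum(conv2d(x, k_enc), 0)  # conv + ReLU
    d = up(down(e))                      # contract then expand
    return conv2d(d + e, k_dec)          # skip connection merges detail back in
```

A real implementation stacks several such levels with learned kernels in a framework like PyTorch or Keras.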

Machine Learning

Machine learning is a branch of artificial intelligence that automates analytical model building, where systems can learn from data, identify patterns and make decisions with minimal human intervention. Here, we apply regularized gradient boosting using XGBoost, a decision-tree-based ensemble machine learning algorithm.
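Gradient boosting builds an ensemble by repeatedly fitting a small tree to the current residuals and adding a damped copy of its predictions. The sketch below shows this loop with one-feature decision stumps and squared error; it is a toy illustration of the mechanism, not XGBoost itself, which adds regularized objectives, second-order gradients and many engineering optimisations.

```python
import numpy as np

def fit_stump(x, residual):
    """Best single-split regression stump on one feature."""
    best = (np.inf, None, 0.0, 0.0)
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred_l, pred_r = left.mean(), right.mean()
        err = ((left - pred_l) ** 2).sum() + ((right - pred_r) ** 2).sum()
        if err < best[0]:
            best = (err, t, pred_l, pred_r)
    return best[1:]

def boost(x, y, rounds=20, lr=0.3):
    """Gradient boosting on squared error: each stump fits the residuals."""
    pred = np.full_like(y, y.mean(), dtype=float)
    stumps = []
    for _ in range(rounds):
        t, pl, pr = fit_stump(x, y - pred)
        pred += lr * np.where(x <= t, pl, pr)   # shrunken additive update
        stumps.append((t, pl, pr))
    return pred, stumps
```

With XGBoost the equivalent model would be a few lines with `xgboost.XGBRegressor`, but the additive-residual loop above is what happens under the hood.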

Poisson Regression Model

The Poisson regression model is a Generalised Linear Model (GLM) used to model count response variables. Here, we apply Poisson regression models to model the number of weather events that occur within a given period of time.
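In a Poisson GLM the expected count is linked to the covariates through a log link, mu = exp(X @ beta), and the coefficients are found by maximising the Poisson log-likelihood. A minimal NumPy sketch of that fit by Newton-Raphson is below; in the project a library such as statsmodels would normally be used instead.

```python
import numpy as np

def fit_poisson(X, y, iters=50):
    """Poisson GLM with log link, fitted by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)            # mean = exp(linear predictor)
        grad = X.T @ (y - mu)            # score of the log-likelihood
        hess = X.T @ (X * mu[:, None])   # Fisher information
        beta = beta + np.linalg.solve(hess, grad)
    return beta
```

A useful sanity check: with an intercept-only design, the fitted rate is the sample mean of the counts, so the coefficient equals log(mean(y)).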

Heatmaps

Heatmaps are an effective tool for rapid visual impact, especially in insurance, where they serve as an important risk management aid - a case of "a picture is worth a thousand words". Here, we create heatmaps using Python to visualise the occurrence of weather events in the USA over time.
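Behind each heatmap is a matrix of counts, here events per state per year. The sketch below builds that matrix with NumPy; the plotting step is shown as a separate function so the aggregation runs without matplotlib installed. The state/year inputs are illustrative assumptions.

```python
import numpy as np

def event_grid(years, states, year_order, state_order):
    """Count events per (state, year) cell -- the matrix behind the heatmap."""
    grid = np.zeros((len(state_order), len(year_order)), dtype=int)
    yi = {y: i for i, y in enumerate(year_order)}
    si = {s: i for i, s in enumerate(state_order)}
    for y, s in zip(years, states):
        grid[si[s], yi[y]] += 1
    return grid

def plot_grid(grid, year_order, state_order, path="heatmap.png"):
    """Render the count matrix as a heatmap (requires matplotlib)."""
    import matplotlib.pyplot as plt
    fig, ax = plt.subplots()
    im = ax.imshow(grid, cmap="Reds")
    ax.set_xticks(range(len(year_order)), labels=year_order)
    ax.set_yticks(range(len(state_order)), labels=state_order)
    fig.colorbar(im, label="event count")
    fig.savefig(path)
```

The same grid can equally be plotted with seaborn's `heatmap` for annotated cells.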