If you have been keeping an eye on our blog over the past couple of weeks, you might have already seen some technical insights into our day-to-day work. This time I’m going to tell you about one of the industries we work with: the maritime industry. It poses numerous data science challenges, one of which is identifying the biggest bottlenecks in ports.
Our adventure began at the World Port Hackathon 2016 in Rotterdam, where our team won one of the main prizes. We presented an IoT (Internet of Things) based solution for real-time tracking of general port activities. By analysing data from various sensors, we are able to provide instant alerts about unexpected situations, such as bollard failures, as well as predict future failures and analyse the overall efficiency of processes in the port. Decision makers can use such information to improve their judgment of the current situation.
We’ve gathered a lot of valuable and positive feedback since then and have spent some time building a demo app, which can be found here:
Using our maritime demo
In this demo you can choose between different European ports to explore insights from AIS data analysis for each port. For instance, we are able to identify currently moored ships and group them by their type and deadweight tonnage, or DWT (https://en.wikipedia.org/wiki/Deadweight_tonnage).
With this information we can compare cargo throughput efficiency. For example, on average there are twice as many vessels in St. Petersburg as in Gdańsk, but ships in Gdańsk can, on average, carry more goods. Taking this into account, you could conclude that the port of Gdańsk is better suited for bigger ships.
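As a rough illustration of the kind of aggregation behind such a comparison, here is a minimal sketch in Python using pandas. The column names and sample values are hypothetical; in practice the records would come from AIS messages for currently moored vessels, joined with a ship register to obtain DWT.

```python
import pandas as pd

# Hypothetical AIS-derived records of moored vessels (illustrative values only).
moored = pd.DataFrame({
    "port":      ["St. Petersburg"] * 4 + ["Gdansk"] * 2,
    "ship_type": ["Tanker", "Cargo", "Tanker", "Fishing", "Cargo", "Cargo"],
    "dwt":       [60_000, 30_000, 45_000, 2_000, 90_000, 110_000],
})

# Vessel count and average carrying capacity (DWT) per port.
summary = moored.groupby("port")["dwt"].agg(vessels="count", mean_dwt="mean")
print(summary)
```

With real data, a higher mean DWT alongside a lower vessel count is exactly the pattern described above: fewer but larger ships calling at Gdańsk.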
NOTE! All widgets below are interactive. For instance, you can select/unselect legend labels to show/hide specific groups.
You can filter insights by different ship types: Tug, Tankers, Pleasure & Passenger, Fishing and Cargo. You can also see that there is proportionally much more fishing going on in Gdańsk. In St. Petersburg there are many more tankers; we all know that Russia has one of the largest petroleum industries in the world, and this difference shows up in raw traffic data as well.
If you combine data analysis with domain experts’ knowledge, such as information about fishing periods or the type of cargo being delivered from/to a specific country, you can get valuable insights when benchmarking and comparing different ports.
Our solution is fully customisable and can easily be extended to other ports around the world. It’s also possible to monitor the efficiency of specific terminals in a given port.
We are going to expand our demo and add broader, more detailed marine traffic analysis in the future. If you would like to get insights about other ports or different periods of time, please contact us at: firstname.lastname@example.org
Why should ports use data science?
Ports, especially those in Europe, have very limited growth possibilities due to geographical constraints and dense city infrastructure. In order to scale up, they have to become more effective at what they do. They have to improve their processes with technological innovations, such as IoT and data science.
In case you are still wondering how data science can be successfully applied in this industry: in 2015 Maersk created a data science team that converted historical Excel sheets about container repairs into an interactive application. Maersk can now monitor global repair activities in real time and delve into detailed historical analysis of costs by region, city or provider. Thanks to that knowledge, they were able to optimise repair strategies and save USD 2 million a year.
The data proved that locally sourced floorboard materials were not leading to a greater frequency of repeat repairs as suspected, but that the cargo being loaded there – marble and concrete – is a factor. This knowledge alone will save Maersk Line USD 2 million a year.
Source - Maersk Blog
This is just the beginning of innovation for the maritime industry. Much more can be done in operational optimisation, for instance bunker optimisation and empty container forecasting. These are multi-billion dollar cost areas for data science to address.