The AI model developed by Igor Krashenyi, a senior Ciklum Researcher, and his teammate Denis Sakva for the TGS Salt Identification Challenge on Kaggle has earned a gold medal, placing in the top 10 out of more than 3,000 participating teams.
Kaggle is a platform for predictive modelling and analytics competitions in which data scientists compete to develop the best models for predicting and describing the datasets uploaded by companies and users.
TGS, the world’s largest geoscience data company, launched the competition to identify salt regions in seismic images with the help of machine learning. Several areas of Earth with large accumulations of oil and gas also have huge deposits of salt below the surface. Finding these salt deposits is difficult: seismic images must be interpreted by human experts, a process that is highly variable and subjective, and misinterpretation can create dangerous situations for oil and gas drillers.
TGS Salt Identification Challenge:
Build an algorithm that automatically and accurately recognizes subsurface salt deposits.
All participants in the Kaggle competition were given the same set of seismic images, 101×101 pixels in size, and each pixel had to be classified as either salt or sediment. In addition to the seismic images, the depth of the location was provided for each image. Here are some examples:
Igor stated that the task posed several challenges:
- The image size (101×101 pixels) couldn’t be directly used to train the segmentation network.
- The data was not raw: the images had been cropped from a large mosaic, so small local patches often lacked the global context needed to interpret them.
- To better organize the data, images could be assembled back into connected puzzles with missing pieces:
"This is not the first time we have participated in a Kaggle competition, but this time we are extremely happy to have built a model that performed so well. Such competitions allow you to go deeper and apply all the latest research coming out of the deep learning and data science community, as well as to see how you compare with scientists and engineers from around the world. You can't hide anything from the leaderboard!"
Igor Krashenyi, senior Researcher at Ciklum
The biggest influence on the training pipeline and model architecture came from findings in the following deep learning papers:
- Squeeze-and-Excitation Networks
- The Lovász-Softmax loss: A tractable surrogate for the optimization of the intersection-over-union measure in neural networks
- CBAM: Convolutional Block Attention Module
- Concurrent Spatial and Channel Squeeze & Excitation in Fully Convolutional Networks
- Hypercolumns for Object Segmentation and Fine-grained Localization
- An Intriguing Failing of Convolutional Neural Networks and the CoordConv Solution
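Several of these papers (Squeeze-and-Excitation, CBAM, concurrent spatial/channel squeeze-and-excitation) share the same core idea: learn a small gating network that reweights feature channels. As a flavor of how this works, here is a minimal NumPy sketch of the forward pass of a Squeeze-and-Excitation block from the first paper; the weight names are illustrative, not taken from the authors' code.

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-Excitation forward pass (Hu et al.), NumPy sketch.
    x: feature maps of shape (C, H, W);
    w1/b1 and w2/b2: the two excitation MLP layers."""
    s = x.mean(axis=(1, 2))                     # squeeze: global avg pool -> (C,)
    h = np.maximum(0.0, w1 @ s + b1)            # reduce C -> C//r with ReLU
    z = 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))    # expand back, sigmoid gate in (0, 1)
    return x * z[:, None, None]                 # rescale each channel of x

C, r = 64, 16                                   # r=16 is the paper's default ratio
rng = np.random.default_rng(0)
x = rng.normal(size=(C, 8, 8))
w1 = rng.normal(size=(C // r, C)); b1 = np.zeros(C // r)
w2 = rng.normal(size=(C, C // r)); b2 = np.zeros(C)
print(se_block(x, w1, b1, w2, b2).shape)  # (64, 8, 8)
```

The output has the same shape as the input, so such a block can be dropped into an existing segmentation network after any convolutional stage.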
Over three months, Igor and his teammate developed a model that achieved a Mean Intersection-over-Union score of 0.892384. Here is Igor's explanation of some technical details of his approach to designing the model.
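For reference, Intersection-over-Union compares the predicted salt mask with the ground-truth mask: the area where they agree divided by the area covered by either. The sketch below shows the basic computation for one binary mask pair; the actual Kaggle metric is a refinement that averages precision over a sweep of IoU thresholds.

```python
import numpy as np

def iou(pred, target):
    """Intersection-over-Union for a pair of binary masks.
    Simplified sketch; Kaggle's competition metric additionally
    averages precision over IoU thresholds from 0.5 to 0.95."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return 1.0 if union == 0 else inter / union  # empty masks count as a match

# Two overlapping horizontal bands on a 101x101 image
a = np.zeros((101, 101), dtype=int); a[:50] = 1
b = np.zeros((101, 101), dtype=int); b[25:75] = 1
print(round(iou(a, b), 3))  # 0.333
```

A score near 0.89 therefore means the predicted salt regions overlapped the true ones almost completely across the test set.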
Also learn how Ciklum’s R&D team built a deep learning model for SeeTree, a leading agritech company: