AI to detect ripe fruits with human-level accuracy

SeeTree is a leading company in the agritech field, looking to bring technology into the world of farming. The primary focus of SeeTree is to monitor the fruiting status of trees in order to increase their productivity. This requires a system for the automatic detection of ripe fruits using machine learning techniques.

SeeTree partnered with Ciklum to approach this task through deep learning methods considered the state-of-the-art in tackling image recognition tasks.

The prototype with near human-level performance was built in only two months.

  1. Build an AI prototype for analyzing fruits with near human-level performance
  2. Make the system re-trainable for different kinds of fruits
  3. Serve as a starting point for future ripeness identification systems

Ciklum’s team of skilled R&D experts had to overcome the following challenges:

  1. Small dataset (~500 annotated photos). It was impossible to collect fresh data because oranges were not in season.
  2. High level of noise in the dataset annotations. Some images lacked required annotation data, and some bounding boxes were not precise enough.
  3. The dataset contained duplicate photos with different labelling.
  4. A high level of occlusion in some pictures made it difficult to separate individual fruit instances.

Data Preparation

Duplicate photos and images with unlabelled oranges, invalid polygons, or other annotation defects were filtered out of the dataset. The resulting dataset was divided into train/validation/test sets, stratified by the number of oranges per image.
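A stratified split of this kind can be sketched in a few lines of pure Python. The bin edges, split ratios, and sample data below are illustrative assumptions, not the values used in the project:

```python
import random
from collections import defaultdict

def stratified_split(items, strata, ratios=(0.7, 0.15, 0.15), seed=42):
    """Split items into train/val/test, preserving each stratum's proportions.

    items  -- list of sample identifiers (e.g. image paths)
    strata -- parallel list of stratum keys (e.g. binned orange counts)
    """
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for item, key in zip(items, strata):
        by_stratum[key].append(item)

    train, val, test = [], [], []
    for group in by_stratum.values():
        rng.shuffle(group)
        n_train = round(len(group) * ratios[0])
        n_val = round(len(group) * ratios[1])
        train += group[:n_train]
        val += group[n_train:n_train + n_val]
        test += group[n_train + n_val:]
    return train, val, test

# Bin the per-image orange counts so that strata are not too sparse.
counts = [3, 12, 7, 0, 25, 9, 14, 2, 31, 6, 11, 4]
images = [f"img_{i}.jpg" for i in range(len(counts))]
bins = ["low" if c < 5 else "mid" if c < 15 else "high" for c in counts]
train, val, test = stratified_split(images, bins)
```

Binning the raw counts first matters with a dataset this small: stratifying on exact counts would create many single-image strata that cannot be split.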

It is worth noting that the fruits in the photos had a wide range of colours, mostly different shades of yellow, orange and green. This actually complicated the task, as it is harder to distinguish between different shades of one colour under varying lighting conditions than between distinct colours.
Igor Krashenyi, PhD
Senior Research Engineer, Ciklum
Model architecture

The best result was obtained with the Faster R-CNN architecture family. This model was a state-of-the-art approach for solving object detection tasks at the time. Faster R-CNN is a two-stage object detection system in which the first stage generates a sparse set of candidate object locations and the second stage classifies each candidate location as one of the foreground classes or as background using a convolutional neural network.

Ciklum’s R&D team’s research and experiments showed that Faster R-CNN with an FPN Inception-ResNetV2 backbone was the most suitable architecture for the problem and delivered the best performance.
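One building block of such two-stage detectors that is easy to illustrate is non-maximum suppression (NMS), which prunes overlapping candidate boxes so each fruit is reported once. The sketch below is a minimal pure-Python version; the boxes, scores, and IoU threshold are made-up illustrative values, not the project's configuration:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring box, drop boxes that overlap it, repeat."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep

# Two heavily overlapping detections of one orange, plus a separate one.
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # → [0, 2]
```

Production implementations (e.g. in detection frameworks) vectorize this loop, but the greedy keep-and-suppress logic is the same.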

Faster R-CNN architecture

With the FPN Inception-ResNetV2 backbone, performance measured by the mAP metric was about 85%. The figure below shows example detections (red boxes – ground-truth labels, green boxes – predicted bounding boxes).
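As a rough illustration of how detections are scored against labels for metrics in the mAP family, the sketch below greedily matches predicted boxes to ground-truth boxes at an assumed IoU threshold of 0.5 and reports precision and recall. The boxes and threshold are illustrative, not the project's data (full mAP additionally averages precision over confidence thresholds):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union

def precision_recall(predictions, labels, iou_threshold=0.5):
    """Greedily match each prediction to an unused label at IoU >= threshold."""
    matched = set()
    tp = 0
    for pred in predictions:
        best, best_iou = None, iou_threshold
        for j, gt in enumerate(labels):
            if j not in matched and iou(pred, gt) >= best_iou:
                best, best_iou = j, iou(pred, gt)
        if best is not None:
            matched.add(best)
            tp += 1
    precision = tp / len(predictions) if predictions else 0.0
    recall = tp / len(labels) if labels else 0.0
    return precision, recall

# One prediction overlaps a label well; the other misses entirely.
labels = [(10, 10, 50, 50), (100, 100, 140, 140)]
predictions = [(12, 12, 52, 52), (200, 200, 240, 240)]
print(precision_recall(predictions, labels))  # → (0.5, 0.5)
```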

Ciklum’s prototype had the following advantages:

  • able to detect ripe fruits with high accuracy, even if occluded;
  • although built to detect oranges, it can be easily re-trained to detect other kinds of fruits;
  • outperforms humans in most cases.
We had a pretty good idea of how to make fruit tree farming more efficient and more profitable. [...] Despite all the challenges, in a short span of time Ciklum provided us with a model that identifies fruit in an image as well as a human can
Ori Shachar
CTO, SeeTree