Detecting diseases in sugarbeet using an RGB camera and Deep Learning

In the one-year project “RGB sensor based measurements of blight in sugarbeets” (in Danish: “Sensor RGB baseret måling af bladsvampeangreb i sukkerroer”), funded by Sukkerroeafgiftsfonden (“The Sugarbeet Tax Fund”) and carried out in collaboration with the Department of Agroecology, Aarhus University, a deep neural network was trained to segment images of sugarbeet. The network was trained on images collected from four strip field experiments during the 2022 growing season. Image collection was carried out using a field robot with a high-quality, nadir-mounted RGB camera. Images were collected for 9 weeks, and a subset of the images was subsequently manually annotated and used as a training set for fine-tuning the pre-trained network.
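The report does not state which architecture or framework was used. As a minimal sketch of the fine-tuning step described above, the code below adapts a generic pre-trained semantic segmentation model (torchvision’s DeepLabV3 is used here purely as a stand-in) to a set of assumed classes; the class list, learning rate, and tensor shapes are all hypothetical.

```python
# Hypothetical sketch: fine-tuning a pre-trained segmentation network on
# manually annotated sugarbeet images. The actual architecture, classes,
# and hyperparameters used in the project are not stated in the report.
import torch
from torch import nn
from torchvision.models.segmentation import deeplabv3_resnet50

# Assumed classes: 0 = background/soil, 1 = green leaf, 2 = rust, 3 = mildew.
NUM_CLASSES = 4

# Load a model pre-trained on a generic dataset and replace the final
# 1x1 convolution of its head so it predicts our classes instead.
model = deeplabv3_resnet50(weights="DEFAULT")
model.classifier[4] = nn.Conv2d(256, NUM_CLASSES, kernel_size=1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images, masks):
    """One fine-tuning step.

    images: float tensor of shape (B, 3, H, W), normalized RGB.
    masks:  long tensor of shape (B, H, W) with per-pixel class ids
            taken from the manual annotations.
    """
    model.train()
    optimizer.zero_grad()
    logits = model(images)["out"]  # (B, NUM_CLASSES, H, W)
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()
```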

Results

The preliminary results show good segmentation of green sugarbeet leaves (89% accuracy) and of the sugarbeet disease rust (80%), with a slight tendency to overestimate the area of the latter. The network has some trouble distinguishing mildew from reflections of the built-in flash; the reflections, however, were labelled as green leaf in the training set rather than being given their own class.

Applying the network to all the collected images shows a clear pattern: the rust initially develops at a few hotspots, which then continue to develop and spread throughout the season. The infection seems to move along the strips and, to a lesser extent, across them.
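A sketch of how such a spread pattern could be quantified, assuming the model and class ids from the fine-tuning sketch above: run inference over every collected image and record the fraction of pixels predicted as rust, which can then be aggregated per strip and per week. The function below is illustrative only and is not the project’s actual analysis pipeline.

```python
# Hypothetical sketch: per-image rust coverage, computed from the trained
# segmentation network. Class ids follow the assumption above (rust = 2).
import torch

RUST_CLASS = 2  # assumed class id for rust

@torch.no_grad()
def rust_fraction(model, image):
    """Fraction of pixels in one image predicted as rust.

    image: float tensor of shape (1, 3, H, W), normalized RGB.
    """
    model.eval()
    logits = model(image)["out"]   # (1, NUM_CLASSES, H, W)
    pred = logits.argmax(dim=1)    # (1, H, W) per-pixel class ids
    return (pred == RUST_CLASS).float().mean().item()
```

Plotting such per-image fractions by strip position and week would reproduce the kind of hotspot-and-spread pattern described above.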

These preliminary results were published in “NBR Faglig beretning 2022” (non-peer-reviewed). For more details, see “Kamera og kunstig intelligens til vurdering af sygdomstryk i sukkerroer” (“Camera and artificial intelligence for assessing disease pressure in sugarbeets”) in the same report.

For more information, contact Anders Krogh Mortensen or René Gislum.

Data availability

The collected images are available here.

ESHRE 2019 Presentation – Automatic morphological grading of human blastocysts with time-lapse imaging and artificial intelligence

Mikkel Fly Kragh presented his work on automating human embryo grading at this year’s European Society of Human Reproduction and Embryology (ESHRE) conference in front of an audience of around 2,000. The abstract was submitted in February and selected for an oral presentation on the 25th of June in Vienna, Austria. The full abstract is available here.

PhD defence: Anders Krogh Mortensen

Anders’ PhD defence presentation can be seen in the video below.

PhD defence of Anders Krogh Mortensen for the thesis “Estimation of Above-Ground Biomass and Nitrogen-Content of Agricultural Field Crops using Computer Vision”.

The thesis investigates yield estimation derived from RGB images and coloured 3D point clouds. Image segmentation methods based on image processing, handcrafted feature extraction, and deep learning are investigated, and a novel method for segmenting lettuce in coloured 3D point clouds is proposed. Several yield models based on the segmented crops are evaluated.

The research findings show that recent advances in deep learning can be transferred to the segmentation of (mixed) crops. It was further shown that (simple) growth models can be improved by using crop coverage to explain the local variations in the crop.

PhD student:
Anders Krogh Mortensen: http://pure.au.dk/portal/en/anmo@agro…

PhD Supervisors:
Associate Professor René Gislum: http://pure.au.dk/portal/en/rg@agro.a…
Professor Henrik Karstoft: http://pure.au.dk/portal/en/hka@eng.a…

This PhD project is part of the VIRKN project, supported by a grant from the Green Development and Demonstration Program (GUDP) under the Danish Ministry of Environment and Food, and of the Future Cropping project, supported by a grant from Innovation Fund Denmark.
– VIRKN: http://mst.dk/erhverv/groen-virksomhe…
– Future Cropping: https://futurecropping.dk/
– GUDP: http://mst.dk/erhverv/groen-virksomhe…
– Innovation Fund: https://innovationsfonden.dk

PhD defence: Mikkel Fly Kragh

Today, Friday, April 20th, 2018, Mikkel Fly Kragh successfully defended his PhD thesis, titled “Lidar-Based Obstacle Detection and Recognition for Autonomous Agricultural Vehicles”.
Mikkel’s PhD study was part of the now-completed SAFE project, which sought “to develop autonomous agricultural machinery that will be able to harvest green biomass and cultivate row crops – without animals or humans being exposed to any type of safety risk”. Mikkel’s work focused on detecting and recognizing obstacles using lidar sensing and sensor fusion with cameras. He achieved this both by adapting state-of-the-art methods from computer vision to his specific domain and by developing his own novel methods, such as a conditional random field for lidar-camera fusion.

Mikkel will continue to work part time in the department, as he is starting an industrial postdoc position at the company Vitrolife A/S.

Mikkel’s PhD defence presentation can be seen in the video below.

PhD defence: Peter Christiansen

Today, Friday, November 3rd, 2017, Peter Christiansen successfully defended his PhD thesis, titled “TractorEYE: Vision-based Real-time Detection for Autonomous Vehicles in Agriculture”.
Peter’s PhD study was part of the now-completed SAFE project, which sought “to develop autonomous agricultural machinery that will be able to harvest green biomass and cultivate row crops – without animals or humans being exposed to any type of safety risk”. Peter’s work focused on detecting obstacles and traversable areas using RGB and thermal cameras. He achieved this both by adapting state-of-the-art methods from computer vision to his specific domain and by developing his own novel methods, such as DeepAnomaly.

Congratulations to Peter from everyone in the Vision Group. Thank you for the collaboration, inspiration, and discussions throughout the years. We hope to see you back in the group soon.

Peter’s presentation at his defence can be seen in the video below.

Innovative afternoon at SEGES

Today, Rasmus, Søren, and Mads presented image-processing-related results from the SmartGrass, CloverSense, and RoboWeedMaps projects. Thanks to everyone who showed up, and for the discussions afterwards.

Note: All videos are in Danish.

Metadiscussion of Deep Learning in Agriculture

Automatic recognition of weeds

Automatic clover-grass estimation

SAFE project demonstration day

The SAFE project officially ended on September 21st, 2017, with a demonstration day at HCA Airport in Odense, Denmark.

All project participants presented their achievements from the 4-year project, and several semi-autonomous robots were showcased both indoors and outdoors.

From Aarhus University, we contributed an indoor dataset demonstration video and an outdoor demonstration of online, real-time obstacle detection and avoidance on a robot, using a combination of a stereo camera and a lidar. In the image below, our perception system is mounted on top of the small robot on the left.
