In the one-year project “RGB sensor based measurements of blight in sugarbeets” (in Danish: “Sensor RGB baseret måling af bladsvampeangreb i sukkerroer”), funded by Sukkerroeafgiftsfonden (“The Sugarbeet Tax Fund”) and carried out in collaboration with the Department of Agroecology, Aarhus University, a deep neural network was trained to segment images of sugarbeets. The network was trained on images collected from four strip field experiments in the 2022 growing season. Image collection was carried out using a field robot with a high-quality RGB camera mounted in a nadir orientation. Images were collected for nine weeks, and a subset of images was subsequently manually annotated and used as a training set for fine-tuning the pre-trained network.
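To illustrate what such a fine-tuning step can look like, here is a minimal sketch that adapts a COCO-pretrained DeepLabV3 segmentation model to new classes in PyTorch. The architecture, class set, and training details are assumptions chosen for illustration; the post does not state which network or framework was used in the project.

```python
# Minimal fine-tuning sketch (PyTorch / torchvision); the actual project
# architecture and training setup are not described in the post.
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

NUM_CLASSES = 4  # hypothetical: background, green leaf, rust, mildew

# Start from pretrained weights and replace the final classification layer.
model = deeplabv3_resnet50(weights="DEFAULT")
model.classifier[4] = torch.nn.Conv2d(256, NUM_CLASSES, kernel_size=1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

def train_step(images, masks):
    """One gradient step on a batch of images (N,3,H,W) and
    integer label masks (N,H,W)."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)["out"]  # (N, NUM_CLASSES, H, W)
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test with random tensors in place of the annotated field images.
loss = train_step(torch.randn(2, 3, 256, 256),
                  torch.randint(0, NUM_CLASSES, (2, 256, 256)))
print(f"loss: {loss:.3f}")
```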
The preliminary results show good segmentation of green sugarbeet leaves (89% accuracy) and of rust (80%), with a slight tendency to overestimate the area of the latter. The network has some trouble distinguishing mildew from reflections from the built-in flash; the reflections, however, were labelled as green leaf in the training set rather than being given their own class.
Applying the network to all the collected images shows a clear pattern: the rust initially develops at a few hotspots, which continue to grow and spread throughout the season. The infection seems to move along the strips and, to a lesser extent, across them.
Mikkel Fly Kragh presented his work on automating human embryo grading at this year’s European Society of Human Reproduction and Embryology (ESHRE) conference in front of an audience of around 2,000. The abstract was submitted in February and selected for oral presentation on the 25th of June in Vienna, Austria. The full abstract is available here.
Anders’ PhD defence presentation can be seen in the video below.
PhD defence of Anders Krogh Mortensen for the thesis “Estimation of Above-Ground Biomass and Nitrogen-Content of Agricultural Field Crops using Computer Vision”.
The thesis investigates yield estimation derived from RGB images and coloured 3D point clouds. Image segmentation methods based on classical image processing, handcrafted feature extraction, and deep learning are investigated. Furthermore, a novel method for segmenting lettuce in coloured 3D point clouds is proposed. Several yield models based on the segmented crops are evaluated.
The research findings show that recent advances in deep learning can be transferred to segmentation of (mixed) crops. It was further shown that (simple) growth models can be improved by using crop coverage to explain local variations in the crop.
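As a toy illustration of the latter idea, the sketch below fits a logistic growth curve scaled by image-derived crop coverage. The functional form, data, and parameters are hypothetical and much simpler than the models studied in the thesis.

```python
# Toy sketch of coupling a simple growth curve with image-derived crop
# coverage; the actual models in the thesis are more elaborate.
import numpy as np
from scipy.optimize import curve_fit

def biomass(X, w_max, k, t0):
    """Logistic growth in time t, scaled by local crop coverage c (0..1)."""
    t, c = X
    return c * w_max / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical observations: days after sowing, coverage from segmented
# images, and measured biomass (t/ha).
t = np.array([20, 30, 40, 50, 60, 70], dtype=float)
c = np.array([0.15, 0.35, 0.60, 0.80, 0.90, 0.95])
w = np.array([0.3, 1.1, 2.8, 4.9, 6.4, 7.2])

params, _ = curve_fit(biomass, (t, c), w, p0=(8.0, 0.1, 45.0))
print("fitted w_max, k, t0:", params)
```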
The plant seedlings dataset, which was re-published last week, is now the subject of a Kaggle image-classification competition. We encourage everyone to take a look at the dataset and submit a solution to the competition.
Today, Friday, November 3rd 2017, Peter Christiansen successfully defended his PhD thesis titled TractorEYE: Vision-based Real-time Detection for Autonomous Vehicles in Agriculture.
Peter’s PhD study was part of the now completed SAFE project, which sought “to develop autonomous agricultural machinery that will be able to harvest green biomass and cultivate row crops – without animals or humans being exposed to any type of safety risk”. Peter’s work focused on detecting obstacles and traversable areas using RGB and thermal cameras. He achieved this both by adapting state-of-the-art methods from computer vision to his specific domain and by developing his own novel methods, such as DeepAnomaly.
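To give a flavour of the general idea behind feature-based anomaly detection, the sketch below models “normal” content with per-channel statistics over CNN-style feature maps and flags cells that deviate. This is an illustrative toy in the spirit of such methods, not Peter’s DeepAnomaly implementation; the feature source, statistics, and threshold are assumptions.

```python
# Sketch of anomaly detection on CNN feature maps: model "normal" field
# content with per-channel feature statistics and flag cells that deviate.
# An illustrative toy, not the published DeepAnomaly code.
import numpy as np

def fit_background(feats):
    """feats: (N, C, H, W) feature maps from normal (obstacle-free) frames.
    Returns per-channel mean and std over all frames and spatial cells."""
    x = feats.transpose(0, 2, 3, 1).reshape(-1, feats.shape[1])
    return x.mean(axis=0), x.std(axis=0) + 1e-6

def anomaly_map(feat, mean, std, thresh=3.0):
    """feat: (C, H, W) feature map of a new frame. Cells whose deviation
    score exceeds `thresh` (a hypothetical value) are flagged."""
    z = (feat - mean[:, None, None]) / std[:, None, None]
    score = np.sqrt((z ** 2).mean(axis=0))  # (H, W) deviation map
    return score > thresh, score

# Smoke test with random "features"; real use would take activations
# from an intermediate layer of a pretrained CNN.
rng = np.random.default_rng(0)
normal = rng.normal(0, 1, (50, 64, 16, 20))
mean, std = fit_background(normal)
frame = rng.normal(0, 1, (64, 16, 20))
frame[:, 8, 10] += 5.0  # inject an "obstacle" at one cell
mask, score = anomaly_map(frame, mean, std)
print("flagged cells:", np.argwhere(mask))
```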
Congratulations to Peter from everyone in the Vision Group. Thank you for the collaboration, inspiration, and discussions throughout the years. We hope to see you back in the group soon.
Peter’s presentation at his defence can be seen in the video below.
Today Rasmus, Søren, and Mads presented image-processing-related results from the SmartGrass, CloverSense, and RoboWeedMaps projects. Thanks to everyone who showed up and for the discussions afterwards.
The SAFE project officially ended on 21 September 2017 with a demonstration day at HCA Airport in Odense, Denmark.
All project participants presented their achievements from the four-year project, and several semi-autonomous robots were showcased both indoors and outdoors.
From Aarhus University, we contributed an indoor demonstration video of our dataset and online real-time obstacle detection and avoidance on an outdoor robot, using a combination of a stereo camera and a lidar. In the image below, our perception system is mounted on top of the small robot on the left.
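As a minimal sketch of the kind of sensor fusion involved, the snippet below marks grid cells as obstacles when either lidar or stereo-derived 3D points rise above a height threshold. Grid size, resolution, threshold, and the synthetic points are all hypothetical; the actual SAFE perception stack is considerably more sophisticated.

```python
# Minimal sketch of fusing lidar points and stereo-derived points into a
# common 2D obstacle grid; not the actual SAFE perception system.
import numpy as np

RES = 0.2          # cell size in metres (hypothetical)
GRID = 100         # 20 m x 20 m grid, robot at the centre
HEIGHT_MIN = 0.3   # points above this height (m) count as obstacles

def to_grid(points):
    """points: (N, 3) x/y/z in the robot frame -> integer cell indices
    for the elevated points that fall inside the grid."""
    ij = np.floor(points[:, :2] / RES).astype(int) + GRID // 2
    keep = (points[:, 2] > HEIGHT_MIN) & np.all((ij >= 0) & (ij < GRID), axis=1)
    return ij[keep]

def fuse(lidar_pts, stereo_pts):
    """Mark a cell occupied if either sensor puts an elevated point in it."""
    grid = np.zeros((GRID, GRID), dtype=bool)
    for pts in (lidar_pts, stereo_pts):
        ij = to_grid(pts)
        grid[ij[:, 0], ij[:, 1]] = True
    return grid

# Smoke test with synthetic points: a ground plane plus one obstacle.
rng = np.random.default_rng(1)
ground = np.c_[rng.uniform(-10, 10, (500, 2)), rng.uniform(0, 0.1, 500)]
obstacle = np.c_[rng.normal([4.0, 1.0], 0.1, (50, 2)), rng.uniform(0.5, 1.5, 50)]
grid = fuse(np.vstack([ground, obstacle]), obstacle)
print("occupied cells:", grid.sum())
```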