The following samples show some promising results from an automated weed instance detector that works in cereal fields and is able to distinguish monocot (red) from dicot (blue) weeds. It uses the focal loss introduced by Facebook AI Research (FAIR), which balances each detection's contribution to the error based on how easily it is detected.
Even though the final aim is to classify each weed instance, this is sometimes not possible due to the small size of the weeds. In such cases, a monocot/dicot discrimination is still valuable for determining a suitable weed control strategy.
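The idea behind the focal loss is to down-weight easy, confidently-classified examples so that training concentrates on hard ones. A minimal NumPy sketch of the binary form is shown below, using the default γ = 2 and α = 0.25 from the FAIR paper; the function name, array shapes, and example probabilities are purely illustrative and not part of the detector described above:

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss, FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t).

    p: predicted probabilities for the positive class, shape (N,)
    y: binary ground-truth labels (0 or 1), shape (N,)

    Easy examples (p_t close to 1) are down-weighted by the
    (1 - p_t)**gamma factor, so hard, misclassified examples
    dominate the total loss.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)                # numerical safety
    p_t = np.where(y == 1, p, 1 - p)              # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)  # class-balancing weight
    return -alpha_t * (1 - p_t) ** gamma * np.log(p_t)

# A confident, correct prediction (p = 0.95) contributes far less
# loss than an uncertain one (p = 0.55) for the same positive label:
losses = focal_loss(np.array([0.95, 0.55]), np.array([1, 1]))
```

With γ = 0 the modulating factor disappears and the expression reduces to ordinary (α-weighted) cross-entropy, which is why the easy/hard balancing is controlled entirely by γ.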
Yara Rizk from the Department of Electrical and Computer Engineering at the American University of Beirut (AUB) visits Aarhus University to present some of the work done in the Human Machine Interaction and Machine Learning lab at AUB.
Anders’ PhD defence presentation can be seen in the video below.
PhD defense of Anders Krogh Mortensen for the thesis “Estimation of Above-Ground Biomass and Nitrogen-Content of Agricultural Field Crops using Computer Vision”.
The thesis investigates yield estimation derived from RGB images and coloured 3D point clouds. Image segmentation methods based on image processing, handcrafted feature extraction, and deep learning are investigated. Furthermore, a novel method for segmenting lettuce in coloured 3D point clouds is proposed. Several yield models based on the segmented crops are investigated.
The research findings have shown that recent advances in deep learning can be transferred to the segmentation of (mixed) crops. It was further shown that (simple) growth models can be improved by using crop coverage to explain local variations in the crop.
“You start by getting an overview of where the weeds are in your cereal field, and then you spray afterwards, but only in the spots where there are weeds. That is the beguilingly simple recipe that a team of researchers from Aarhus University is currently working on in collaboration with several technology companies.”
The plant seedlings dataset, which was re-published last week, is now the subject of a Kaggle image-classification competition. We encourage everyone to take a look at the dataset and submit their solutions to the competition.
Inspired by yesterday’s visit to Agritechnica and the booths by Bayer/Xarvio/Bosch demonstrating precision weed localization and spraying, I decided to make a small video of our current weed-discrimination capabilities:
The plant seedlings dataset, made in collaboration with the University of Southern Denmark and Aarhus University in Flakkebjerg, has now been moved to this site.
The dataset contains images of approximately 960 unique plants belonging to 12 species at several growth stages. Take a look at it here.
Today, Friday, November 3rd 2017, Peter Christiansen successfully defended his PhD thesis titled “TractorEYE: Vision-based Real-time Detection for Autonomous Vehicles in Agriculture”.
Peter’s PhD study was part of the now completed SAFE project, which sought “to develop autonomous agricultural machinery that will be able to harvest green biomass and cultivate row crops – without animals or humans being exposed to any type of safety risk”. Peter’s work focused on detecting obstacles and traversable areas using RGB and thermal cameras. He achieved this both by adapting state-of-the-art methods from computer vision to his specific domain and by developing his own novel methods, such as DeepAnomaly.
Congratulations to Peter from everyone in the Vision Group. Thank you for your collaboration, your inspiration, and the discussions throughout the years. We hope to see you back in the group soon.
Peter’s presentation at his defence can be seen in the video below.