Back in June 2017, I presented my work on detecting thistles using CNNs. Here is a recap of what I presented. The paper is still working its way through the publishing process, so while I wait for the outcome, the conference proceedings are all that is available for now: http://www.efita2017.org/wp-content/uploads/2017/09/EFITA_WCCA_2017_proceedings.pdf#page=187
We recently presented our preliminary results in large-scale mapping of grass-clover leys at “Græsland 2018”, the largest grass-clover event in northern Europe, hosted by DLF.
Using an ATV, numerous georeferenced images were collected a couple of days before the first harvest of the 2018 season. By automatically segmenting each image into grass (blue), clover (red), and soil (green), the spatial distribution of the three classes is visualized qualitatively across the field. The three sample images illustrate the corresponding distributions at three locations in the field.
In the future, by mounting the camera directly on the front of the harvester, the farmer can monitor his entire fields while harvesting, without the need for driving an ATV. This allows the farmer to optimize his fertilization strategy based on the condition and spatial clover/grass distribution of his fields.
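The per-image processing described above can be sketched roughly as follows. This is a minimal illustration, not our actual pipeline: the class indices, color assignments, and function names are assumptions, and the CNN producing the class map is omitted.

```python
import numpy as np

# Hypothetical class indices and the blue/red/green color coding from the figure.
GRASS, CLOVER, SOIL = 0, 1, 2
COLORS = {GRASS: (0, 0, 255), CLOVER: (255, 0, 0), SOIL: (0, 255, 0)}

def colorize(class_map):
    """Map an HxW array of per-pixel class indices to an HxWx3 RGB overlay."""
    overlay = np.zeros((*class_map.shape, 3), dtype=np.uint8)
    for cls, rgb in COLORS.items():
        overlay[class_map == cls] = rgb
    return overlay

def class_fractions(class_map):
    """Per-image pixel fractions of grass, clover, and soil, which can then be
    plotted at each image's GPS position to form the field map."""
    total = class_map.size
    return {name: np.count_nonzero(class_map == cls) / total
            for name, cls in (("grass", GRASS), ("clover", CLOVER), ("soil", SOIL))}
```

Aggregating `class_fractions` over all georeferenced images gives the qualitative spatial distribution shown across the field.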
For more information, contact Søren Skovsen.
Our previous work on pixel-wise classification of high-resolution RGB images of grass-clover leys into clover, grass, and weeds demonstrated state-of-the-art accuracy.
Extending this work into a two-step classification scheme with corresponding hierarchical labels, we have demonstrated an extended segmentation of clover into the two species present in the dataset: red clover (Trifolium pratense) and white clover (Trifolium repens).
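The second refinement step can be sketched as below. This is a hedged illustration only: the class ids and function names are made up, and the two per-stage models (the coarse clover/grass/weed segmenter and the clover-species classifier) are assumed to already have produced their label maps.

```python
import numpy as np

# Hypothetical label ids, not the paper's actual label scheme.
CLOVER = 1            # coarse first-stage label
RED_CLOVER = 10       # second-stage species labels
WHITE_CLOVER = 11

def hierarchical_segment(coarse_map, clover_species_map):
    """Refine a coarse segmentation: wherever the first-stage model predicts
    'clover', substitute the second-stage species prediction; all other
    classes (grass, weeds, soil) pass through unchanged."""
    refined = coarse_map.copy()
    mask = coarse_map == CLOVER
    refined[mask] = clover_species_map[mask]
    return refined
```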
The quantitative results of the segmentations on the hand-annotated test set are reported in the paper in the ICPA proceedings.
Qualitative results of the hierarchical segmentation on two images are shown below.
For more information, contact Søren Skovsen.
The following samples show some promising results from an automated weed instance detector that works in cereal fields and is able to distinguish monocot (red) from dicot (blue) weeds. It utilizes the focal loss introduced by Facebook AI Research (FAIR), which balances each example's contribution to the error based on how easily it is detected.
Even though the final aim is to classify each weed instance, this is sometimes not possible due to the small size of the weeds. In such cases, a monocot/dicot discrimination is still valuable for determining a suitable weed control strategy.
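The idea behind the focal loss is compact enough to show directly. Below is a minimal NumPy sketch of the binary form from Lin et al. (FAIR), FL(p_t) = -alpha_t (1 - p_t)^gamma log(p_t); the function name and the toy probabilities are our own, not the detector's actual training code.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss per example.
    p: predicted probability of the positive class, y: 0/1 ground-truth label.
    p_t is the probability assigned to the true class; easy examples
    (p_t near 1) are down-weighted by the modulating factor (1 - p_t)**gamma,
    so hard, poorly-detected examples dominate the gradient."""
    p_t = np.where(y == 1, p, 1.0 - p)
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

With gamma = 0 and alpha = 1 this reduces to plain cross-entropy; increasing gamma suppresses the loss of easily detected weeds so that small, hard instances contribute relatively more.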
For more information, contact Mads Dyrmann
Watch PhD student Yara Rizk (ECE, AUB) present for our group at ECE, AU
Presented by Ph.D. Student Yara Rizk, American University of Beirut (AUB) at Aarhus University, Department of Engineering
Yara Rizk from the Department of Electrical and Computer Engineering at the American University of Beirut (AUB) visits Aarhus University to present some of the work done in the Human Machine Interaction and Machine Learning lab at AUB.
Anders’ PhD defence presentation can be seen in the video below.
PhD defense of Anders Krogh Mortensen for the thesis “Estimation of Above-Ground Biomass and Nitrogen-Content of Agricultural Field Crops using Computer Vision”.
The thesis investigates yield estimation derived from RGB images and coloured 3D point clouds. Image segmentation methods based on image processing, handcrafted feature extraction, and deep learning are investigated. Furthermore, a novel method for segmenting lettuce in 3D coloured point clouds is proposed. Several yield models based on the segmented crops are evaluated.
The research findings show that recent advances in deep learning can be transferred to segmentation of (mixed) crops. It was further shown that (simple) growth models can be improved by using crop coverage to explain the local variations in the crop.
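The last point can be illustrated with a toy regression. This is only a sketch of the idea, not the thesis' actual models: the data below is made up, and the thesis' growth models and covariates differ.

```python
import numpy as np

# Made-up plot-level observations: days after sowing, segmented crop-pixel
# fraction ("coverage"), and measured above-ground biomass (t/ha).
days = np.array([30.0, 30.0, 45.0, 45.0, 60.0, 60.0])
coverage = np.array([0.20, 0.35, 0.40, 0.60, 0.55, 0.80])
biomass = np.array([0.50, 0.90, 1.10, 1.80, 1.60, 2.60])

# Base growth model: biomass explained by time alone.
X_base = np.column_stack([np.ones_like(days), days])
_, ssr_base, *_ = np.linalg.lstsq(X_base, biomass, rcond=None)

# Extended model: crop coverage added to explain local (within-date) variation.
X_ext = np.column_stack([np.ones_like(days), days, coverage])
_, ssr_ext, *_ = np.linalg.lstsq(X_ext, biomass, rcond=None)

# ssr_ext[0] < ssr_base[0]: coverage absorbs the spread between plots
# sampled on the same day, which time alone cannot explain.
```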
Anders Krogh Mortensen: http://pure.au.dk/portal/en/anmo@agro…
This PhD project is part of the VIRKN project, supported by a grant from the Green Development and Demonstration Program (GUDP) under the Danish Ministry of Environment and Food, and the Future Cropping project, supported by a grant from Innovation Fund Denmark.
– VIRKN: http://mst.dk/erhverv/groen-virksomhe…
– Future Cropping: https://futurecropping.dk/
– GUDP: http://mst.dk/erhverv/groen-virksomhe…
– Innovation Fund: https://innovationsfonden.dk
“You start by getting an overview of where the weeds are in your cereal field, and then you spray afterwards, but only in the spots where there are weeds. That is the beguilingly simple recipe that a team of researchers from Aarhus University is currently working on in collaboration with several technology companies.”
Today, Friday, April 20th 2018, Mikkel Fly Kragh successfully defended his PhD thesis, “Lidar-Based Obstacle Detection and Recognition for Autonomous Agricultural Vehicles”.
Mikkel's PhD study was part of the now-completed SAFE project, which sought “to develop autonomous agricultural machinery that will be able to harvest green biomass and cultivate row crops – without animals or humans being exposed to any type of safety risk”. Mikkel's work focused on detecting and recognizing obstacles using lidar sensing and sensor fusion with cameras. He achieved this both by adapting state-of-the-art computer vision methods to his specific domain and by developing his own novel methods, such as a conditional random field for lidar-camera fusion.
Mikkel will continue to work part-time in the department, as he is starting an industrial postdoc position at the company Vitrolife A/S.
Mikkel's PhD defence presentation can be seen in the video below.
A drone video from a day when we collected ~16,000 images across 140 ha of winter cereals at Ørtoft in Northern Jutland: