A drone video from a day on which we collected ~16,000 images across 140 ha of winter cereals at Ørtoft in Northern Jutland:
Inspired by yesterday’s visit to Agritechnica and the booths by Bayer/Xarvio/Bosch demonstrating precision weed localization and spraying, I decided to make a small video of our current weed-discrimination capabilities:
/ Mads Dyrmann
The plant seedlings dataset, made in collaboration with the University of Southern Denmark and Aarhus University in Flakkebjerg, has now been moved to this site.
The dataset contains images of approximately 960 unique plants belonging to 12 species at several growth stages. Take a look at it here.
Today, Friday, November 3rd, 2017, Peter Christiansen successfully defended his PhD thesis, titled TractorEYE: Vision-based Real-time Detection for Autonomous Vehicles in Agriculture.
Peter’s PhD study was part of the now completed SAFE project, which sought “to develop autonomous agricultural machinery that will be able to harvest green biomass and cultivate row crops – without animals or humans being exposed to any type of safety risk”. Peter’s work focused on detecting obstacles and traversable areas using RGB and thermal cameras. He achieved this both by adapting state-of-the-art computer vision methods to his specific domain and by developing novel methods of his own, such as DeepAnomaly.
Congratulations to Peter from everyone in the Vision Group. Thank you for the collaboration, inspiration, and discussions throughout the years. We hope to see you back in the group soon.
Peter’s presentation from his defence can be seen in the video below.
Today, Rasmus, Søren, and Mads presented image-processing-related results from the SmartGrass, CloverSense, and RoboWeedMaps projects. Thanks to everyone who showed up and for the discussions afterwards.
Note: All videos are in Danish.
Metadiscussion of Deep Learning in Agriculture
Automatic recognition of weeds
Automatic clover-grass estimation
The SAFE project officially ended on 21st September 2017 with a demonstration day at HCA Airport in Odense, Denmark.
All project participants presented their achievements from the four-year project, and several semi-autonomous robots were showcased both indoors and outdoors.
From Aarhus University, we contributed an indoor dataset demonstration video, as well as online, real-time obstacle detection and avoidance on an outdoor robot using a combination of a stereo camera and lidar. In the image below, our perception system is mounted on top of the small robot on the left.