RoboWeedMaPS

To combat weeds, farmers have been spraying their fields with herbicides for decades. Until now, however, they have typically used a standard herbicide mixture, which is spread over the entire field. This is not particularly smart from a financial, environmental, or biological perspective: herbicides are expensive, can affect water quality in streams and groundwater, and lead to resistance in the weeds. In addition, a standard mixture may work well against some weed species but far less effectively against others.

Today, several Decision Support Systems exist that can recommend optimal herbicide dosages for a given weed population in a field, thereby reducing the farmer's herbicide expenditure by at least 40%. However, to use such a system, it is necessary to inspect the field and assess its weed population.

In RoboWeedMaPS, the aim is to bridge the gap between the potential herbicide savings and the required weed inspections by using image analysis to automate the inspection itself. It will then only be necessary to collect images from representative parts of the field, after which the computer determines which weeds are present and which actions to take to control them. By combining automatic, camera-based weed recognition with a decision support tool, the farmer gets advice on the optimal use of their equipment while saving herbicide.
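
As a rough illustration of this flow (a minimal Python sketch; the function names and the hard-coded detections are invented for illustration and are not the project's actual interface), the chain from sampled field images to a field-level weed population could look like this:

    # Conceptual flow: sampled field images -> field-level weed population.
    # All names here are illustrative placeholders, not the RoboWeedMaPS API.
    from collections import Counter

    def detect_weeds(image_path):
        """Stand-in for the cloud-based recogniser; a real system would run a
        trained model on the image instead of returning a fixed answer."""
        return ["Chenopodium album", "Poa annua"]   # made-up detections

    def field_weed_population(image_paths):
        """Aggregate per-image detections from representative spots in the field
        into one weed count that a decision support tool can work with."""
        population = Counter()
        for path in image_paths:
            population.update(detect_weeds(path))
        return population

    # Example: three sample images taken at representative spots in a field.
    print(field_weed_population(["img_001.jpg", "img_002.jpg", "img_003.jpg"]))
    # Counter({'Chenopodium album': 3, 'Poa annua': 3})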

The images are automatically uploaded to the cloud, where they are analysed by several specialised algorithms that determine the composition of the weeds and weigh it against the competitive ability of the crop. This is where Big Data comes into the picture, because the whole system relies on an enormous weed database, which the researchers use together with deep learning to teach the computer to recognise different weed species.
Although it might sound simple, automating the process is actually quite difficult, particularly because weeds in the field hardly ever resemble those in botanical reference books.
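
As a sketch of the kind of training this involves (illustrative only: it assumes PyTorch/torchvision, an off-the-shelf pretrained network, and a folder of images sorted by species; it is not the project's actual model or code):

    # Fine-tune a pretrained CNN to classify weed species from a labelled image folder.
    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms

    NUM_SPECIES = 27  # species currently recognised well, per the text below

    transform = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])
    # Assumed layout: weed_database/train/<species name>/<image>.jpg
    train_set = datasets.ImageFolder("weed_database/train", transform=transform)
    loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

    # Replace the final layer of an ImageNet-pretrained network with a weed classifier.
    model = models.resnet18(weights="IMAGENET1K_V1")
    model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)

    optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for images, labels in loader:   # one epoch shown; real training runs many
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()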

The project has now reached the stage where the cameras can capture high-resolution images with a resolution of 4 pixels per millimetre, even when driving at 50 km/h in the field, and where the computer has satisfactorily learned to recognise 27 weed species from a database with thousands of images. The computer is being trained on far more species, but for these the number of training images is still too small to achieve reliable recognition.
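
To put those numbers in perspective (a back-of-the-envelope calculation based only on the figures above, not on project specifications):

    # Rough implications of imaging at 4 px/mm while driving 50 km/h.
    speed_m_per_s = 50 / 3.6        # 50 km/h  ->  about 13.9 m/s over the ground
    pixel_size_m = 0.001 / 4        # 4 px/mm  ->  0.25 mm per pixel
    # Longest exposure that keeps forward motion blur below one pixel:
    max_exposure_s = pixel_size_m / speed_m_per_s
    print(f"max exposure ~ {max_exposure_s * 1e6:.0f} microseconds")   # ~18 µs

In other words, the exposure has to be extremely short at that speed, which hints at why image acquisition at 50 km/h is a notable achievement.
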
The aim is to make everyday life easier for farmers, so that the computer itself finds where the weeds are located, what species they are, and which herbicide should be used at precisely that spot in the field. The computer will thus control the dosage while the farmer is spraying, and can even switch between herbicides and dosages depending on the weed species – an important part of future smart farming.
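
A minimal sketch of what such spot-specific control could look like in software (the species-to-treatment table, doses, and field grid below are invented for illustration; they are not agronomic advice or project data):

    # Turn per-cell weed detections into a variable-rate spray prescription.
    # Products and doses are made-up placeholders.
    TREATMENTS = {
        "Chenopodium album": ("herbicide_A", 0.5),   # (product, litres per hectare)
        "Cirsium arvense":   ("herbicide_B", 1.0),
    }

    def prescription_for_cell(detected_species):
        """Return {product: dose} for one grid cell; an empty cell means no spraying."""
        doses = {}
        for species in detected_species:
            product, dose = TREATMENTS.get(species, (None, 0.0))
            if product:
                doses[product] = max(doses.get(product, 0.0), dose)
        return doses

    # Example: a strip of three grid cells, only the middle one contains weeds.
    field_cells = [[], ["Chenopodium album", "Cirsium arvense"], []]
    print([prescription_for_cell(cell) for cell in field_cells])
    # [{}, {'herbicide_A': 0.5, 'herbicide_B': 1.0}, {}]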

[Figure: Automatic weed detection with bounding boxes]

Partners

Aarhus University (AU), AgroIntelli, IPMwise, Datalogisk A/S, Danfoil, I-GIS

Publications

2018

Nima Teimouri, Mads Dyrmann, Per Rydahl Nielsen, Solvejg Kopp Mathiassen, Gayle J. Somerville, and Rasmus Nyholm Jørgensen. (2018). Weed Growth Stage Estimator Using Deep Convolutional Neural Networks. Sensors (Paper)

2017

Mads Dyrmann, Rasmus Nyholm Jørgensen, Henrik Skov Midtiby. (2017). Detection of Weed Locations in Leaf-occluded Cereal Crops using a Fully-Convolutional Neural Network. Advances in Animal Biosciences, Volume 8, Issue 2 (papers presented at the 11th European Conference on Precision Agriculture, ECPA 2017) (Paper)
Mads Dyrmann. (2017). Automatic Detection and Classification of Weed Seedlings under Natural Light Conditions. University of Southern Denmark (Paper)
Morten Stigaard Laursen, Rasmus Nyholm Jørgensen, Mads Dyrmann, Robert Poulsen. (2017). RoboWeedSupport – Sub Millimeter Weed Image Acquisition in Cereal Crops with Speeds up till 50 Km/h. International Journal of Biological, Biomolecular, Agricultural, Food and Biotechnological Engineering (ICSWTS 2017) (Paper)
Mads Dyrmann, Peter Christiansen. (2017). Estimation of plant species by classifying plants and leaves in combination. Journal of Field Robotics (Paper)
Thomas Mosgaard Giselsson, Rasmus Nyholm Jørgensen, Peter Kryger Jensen, Mads Dyrmann, Henrik Skov Midtiby. (2017). A Public Image Database for Benchmark of Plant Seedling Classification Algorithms. arXiv (Paper)
Rasmus Nyholm Jørgensen, Mads Dyrmann. (2017). Automatisk ukrudtsgenkendelse er ikke længere science fiction [Automatic weed recognition is no longer science fiction]. Plantekongres '17 (Paper)

2016

Mads Dyrmann, Henrik Skov Midtiby, Rasmus Nyholm Jørgensen. (2016). Evaluation of intra variability between annotators of weed species in color images. 4th International Conference on Agricultural and Biosystems Engineering (Paper)
Mads Dyrmann, Anders Krogh Mortensen, Henrik Skov Midtiby, Rasmus Nyholm Jørgensen. (2016). Pixel-wise classification of weeds and crops in images by using a Fully Convolutional neural network. 4th International Conference on Agricultural and Biosystems Engineering (Paper) (Presentation, video)
Mads Dyrmann, Henrik Karstoft, Henrik Skov Midtiby. (2016). Plant species classification using deep convolutional neural network. Biosystems Engineering (Paper)
M. S. Laursen, R. N. Jørgensen, H. S. Midtiby, K. Jensen, M. P. Christiansen, T. M. Giselsson, A. K. Mortensen and P. K. Jensen. (2016). Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops. Sensors (Paper)

2015

Mads Dyrmann. (2015). Fuzzy C-means based plant segmentation with distance dependent threshold. Proceedings of the Computer Vision Problems in Plant Phenotyping (CVPPP) (Paper)

2014

Mads Dyrmann, Peter Christiansen. (2014). Automated Classification of Seedlings Using Computer Vision: Pattern Recognition of Seedlings Combining Features of Plants and Leaves for Improved Discrimination. Aarhus University (Paper)