In the one-year project “RGB sensor based measurements of blight in sugarbeets” (in Danish: “Sensor RGB baseret måling af bladsvampeangreb i sukkerroer”), funded by Sukkerroeafgiftsfonden (“The Sugarbeet Tax Fund”) and in collaboration with the Department of Agroecology, Aarhus University, a deep neural network was trained to segment images of sugarbeet. The network was trained on images collected from four strip field experiments in the 2022 growing season. Image collection was carried out using a field robot with a high-quality RGB camera mounted at nadir. Images were collected over nine weeks, and a subset of images was subsequently manually annotated and used as a training set for fine-tuning the pre-trained network.
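The post does not specify the architecture or training framework, so the sketch below is only a rough illustration of the fine-tuning step, using PyTorch with torchvision's DeepLabV3 as a stand-in for “a pre-trained network”; the class list is an assumption based on the labels mentioned below.

```python
# Minimal fine-tuning sketch. DeepLabV3/ResNet-50 is a stand-in, not the
# project's actual network, and the class list is assumed from the post.
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

NUM_CLASSES = 4  # e.g. background, green leaf, rust, mildew (assumed)

model = deeplabv3_resnet50(weights="DEFAULT")
# Replace the classifier head so the output matches our annotation classes.
model.classifier[4] = torch.nn.Conv2d(256, NUM_CLASSES, kernel_size=1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

def train_step(images, masks):
    """One fine-tuning step on a batch of annotated sugarbeet images."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)["out"]    # (B, NUM_CLASSES, H, W)
    loss = criterion(logits, masks)  # masks: (B, H, W) with class ids
    loss.backward()
    optimizer.step()
    return loss.item()
```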
The preliminary results show good segmentation of green sugarbeet leaves (89% accuracy) and of the sugarbeet disease rust (80%), with a slight tendency to overestimate the area of the latter. The network has some trouble distinguishing mildew from reflections from the built-in flash; the reflections, however, were labelled as green leaf in the training set rather than being given their own class.
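How the per-class figures were computed is not stated; a plausible reading is per-class pixel recall against the manual annotations, which a minimal sketch might compute like this (the function name and class layout are ours):

```python
# Per-class pixel recall from a confusion matrix. This is an assumed
# evaluation, not the project's documented procedure.
import numpy as np

def per_class_recall(pred, target, num_classes):
    """Correctly predicted pixels divided by all annotated pixels of
    each class. pred and target are integer (H, W) class-id arrays."""
    conf = np.bincount(
        num_classes * target.ravel() + pred.ravel(),
        minlength=num_classes ** 2,
    ).reshape(num_classes, num_classes)
    # Diagonal over row sums: recall for each annotated class.
    return np.diag(conf) / conf.sum(axis=1).clip(min=1)
```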
Applying the network to all the collected images shows a clear pattern: the rust initially develops at a few hotspots, which continue to grow and spread throughout the season. The infection seems to move along the strips and, to a lesser extent, across them.
In collaboration with North Carolina State University, Texas A&M, and the United States Department of Agriculture, our recent work on real-time weed recognition and quantification was well received in the OpenCV Spatial AI Competition. You can read the details of our work here.
Our latest work in the Smartgrass project was recently published in the Sensors special issue Sensing Technologies for Agricultural Automation and Robotics. With an optimized model and an extensive set of high-quality images, new state-of-the-art prediction results in grass-clover mixtures were presented. Using a specially developed ATV-mounted camera, the method was applied to 29,848 in-field images to sparsely map the local legume content across 225 hectares.
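The paper's exact mapping procedure is not reproduced here; as a hedged sketch of what sparsely mapping per-image predictions onto a field can look like, the snippet below interpolates legume fractions onto a regular grid with inverse-distance weighting (all names and the interpolation choice are ours, not the paper's):

```python
# Sketch: sparse georeferenced predictions -> dense field map via
# inverse-distance weighting. Assumed technique, not the published one.
import numpy as np

def idw_map(xy, values, grid_xy, power=2.0, eps=1e-9):
    """Interpolate sparse per-image legume fractions onto a grid.

    xy:      (N, 2) easting/northing of the N images
    values:  (N,)   predicted legume fraction per image
    grid_xy: (M, 2) grid cell centres to interpolate onto
    """
    d = np.linalg.norm(grid_xy[:, None, :] - xy[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)          # closer images weigh more
    return (w @ values) / w.sum(axis=1)   # (M,) interpolated fractions
```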
This is the presentation of the paper “Initial evaluation of enriching satellite imagery using sparse proximal sensing in precision farming”. The paper was presented at SPIE Remote Sensing (RS) 2020 online by Sadaf Farkhani.
Mikkel Fly Kragh presented his work on automating human embryo grading at the European Society of Human Reproduction and Embryology (ESHRE) conference this year in front of an audience of around 2,000. The abstract was submitted in February and selected for oral presentation on the 25th of June in Vienna, Austria. The full abstract is available here.
We recently presented our preliminary results on large-scale mapping of grass-clover leys at “Græsland 2018”, the largest grass-clover event in northern Europe, hosted by DLF.
Using an ATV, we collected numerous georeferenced images a couple of days before the first harvest of the 2018 season. By automatically segmenting each image into grass (blue), clover (red), and soil (green), the spatial distribution of the three classes is visualized qualitatively across the field. The three sample images exemplify the corresponding distributions at three points in the field.
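A minimal sketch of this visualization, assuming an (H, W) class-id mask per image and one aggregated clover fraction per georeferenced position; the colour coding follows the post, everything else is an assumption:

```python
# Colour segmentation masks per class and scatter per-image clover
# fractions at their GPS positions. Function names are ours.
import numpy as np
import matplotlib.pyplot as plt

CLASS_COLOURS = np.array([[0, 0, 255],    # 0: grass  -> blue
                          [255, 0, 0],    # 1: clover -> red
                          [0, 255, 0]],   # 2: soil   -> green
                         dtype=np.uint8)

def colourize(mask):
    """Turn an (H, W) class-id mask into an (H, W, 3) RGB overlay."""
    return CLASS_COLOURS[mask]

def plot_field(eastings, northings, clover_fractions):
    """Scatter the per-image clover fraction at its georeferenced position."""
    sc = plt.scatter(eastings, northings, c=clover_fractions, cmap="RdYlGn")
    plt.colorbar(sc, label="clover fraction")
    plt.xlabel("easting [m]")
    plt.ylabel("northing [m]")
    plt.show()
```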
In the future, by mounting the camera directly on the front of the harvester, the farmer can monitor his entire field while harvesting, without the need to drive an ATV. This allows the farmer to optimize his fertilization strategy based on the condition and the spatial clover/grass distribution of his fields.
Our previous work on pixel-wise classification of high-resolution RGB images of grass-clover leys into clover, grass, and weeds demonstrated state-of-the-art accuracy.
Extending this work into a two-step classification scheme with corresponding hierarchical labels, we have demonstrated an extended segmentation of clover into the two species present in the dataset: red clover (Trifolium pratense) and white clover (Trifolium repens).
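A schematic of the two-step scheme, assuming a coarse model that outputs clover/grass/weed ids and a second clover-specialist model for the species split; the models and names here are placeholders, not the published implementation:

```python
# Two-step hierarchical segmentation sketch: stage one separates coarse
# classes, stage two refines only the clover pixels into species.
import numpy as np

COARSE = {0: "grass", 1: "clover", 2: "weeds"}
FINE = {0: "red clover (Trifolium pratense)",
        1: "white clover (Trifolium repens)"}

def hierarchical_segment(image, coarse_model, clover_model):
    """Return (coarse labels, species labels) for one image."""
    coarse = coarse_model(image)   # (H, W) ids from COARSE
    fine = clover_model(image)     # (H, W) ids from FINE
    # Only pixels the first stage calls "clover" receive a species label;
    # everything else is marked -1 (no species).
    species = np.where(coarse == 1, fine, -1)
    return coarse, species
```

Keeping the species split in a separate second stage means the coarse model is never penalized for confusing the two clover species, which is the point of the hierarchical labels.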