July 22, 2024
If you’ve ever taught a child to ride a bike, you’ve probably told them, “Watch out for that rock!”
Guess what happens.
Yep, they ride straight into the rock.
“Look at the gap” is a better instruction.
Most green-on-green camera sprayers use algorithms to identify the weeds, one species at a time, and turn on a nozzle to spray that weed.
Amazing technology.
But what if we flipped it on its head? What if we taught the cameras to identify the crop, and to spray any green plant that isn’t the crop?
This theory was put to the test recently by University of Western Australia researcher Michael Mckay with help from the AHRI team of Mike Ashworth, Roberto Lujan Rocha and Monica Danilevicz. They tested a number of models after just 24 hours of training, and the best one achieved 84% precision.
This approach improves how the weed identification models learn and makes it easier to develop algorithms that cover many weed species, without the need to build a separate model for each one.
Weed identification is a fast-moving space, and research like this makes it move even faster.
If you’re not canola, you’re a weed
This research involved testing three different deep learning models: ResNet-18, ResNet-34, and VGG-16. If you’ve heard of these before, chances are you’re right into this stuff.
For the rest of us, essentially these models were trained to do two things (there’s a rough sketch of the setup after this list):
- Identify canola plants in the field, and
- Assign all other plants in the field as weeds.
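For those who like to peek under the hood, here’s a minimal sketch of what a crop-first segmentation model can look like in code. The library (segmentation_models_pytorch), the U-Net decoder and the training details are our illustrative choices, not necessarily what the researchers used:

```python
# A rough sketch of a "crop-first" segmentation model.
# Assumptions for illustration: the segmentation_models_pytorch library
# and a U-Net decoder; the study's exact architecture may differ.
import torch
import segmentation_models_pytorch as smp
from segmentation_models_pytorch.losses import DiceLoss

# ResNet-34 backbone with ImageNet weights; one output channel means
# every pixel gets a score for "canola" vs "anything else".
model = smp.Unet(
    encoder_name="resnet34",   # try "resnet18" or "vgg16" to compare
    encoder_weights="imagenet",
    in_channels=3,             # RGB field images
    classes=1,                 # binary: canola or not
)

loss_fn = DiceLoss(mode="binary")
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, canola_masks):
    """One training step. images: (B, 3, H, W) float tensor;
    canola_masks: (B, 1, H, W) with 1 = canola, 0 = everything else."""
    optimizer.zero_grad()
    logits = model(images)              # per-pixel canola scores
    loss = loss_fn(logits, canola_masks)
    loss.backward()
    optimizer.step()
    return loss.item()
```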
The image below shows what the models see. Using images from the crop, the models identified crop plants, coloured orange, and treated every other plant as a weed, coloured blue.
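Recreating that colouring from a model’s output takes only a few lines. Here’s a sketch (the helper and the exact colour values are our own, not from the study):

```python
import numpy as np

# Hypothetical helper to recreate the figure's colouring; the exact
# colour values are illustrative choices.
def colour_prediction(image, plant_mask, canola_mask):
    """image: (H, W, 3) uint8 RGB photo.
    plant_mask / canola_mask: boolean (H, W) arrays from the model."""
    overlay = image.copy()
    overlay[plant_mask & canola_mask] = [255, 140, 0]  # crop -> orange
    overlay[plant_mask & ~canola_mask] = [0, 0, 255]   # other plants -> blue (weed)
    return overlay
```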
Three measures
The models were tested for three measures (there’s a short sketch of how they’re calculated after this list):
- Precision – as the name suggests, how accurate the model is when it labels an object. Of all the pixels it labels as canola, how many really are canola? It measures how well the model avoids labelling a weed as canola.
- Recall – how well the model finds all of the objects in the image. The model’s ability to find all the canola plants in the field, even if it means mislabelling some weeds as canola.
- IoU (Intersection over Union) – the overlap between the pixels the model labels as canola and the true canola pixels, divided by the combined area of both. 100% would be a pixel-perfect match.
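In code, all three measures boil down to counting pixels. A minimal sketch for binary canola masks:

```python
import numpy as np

def segmentation_scores(pred, truth):
    """Precision, recall and IoU for binary masks.
    pred, truth: boolean (H, W) arrays where True = canola.
    Assumes the masks aren't empty (no division-by-zero guard)."""
    tp = np.sum(pred & truth)     # canola correctly labelled canola
    fp = np.sum(pred & ~truth)    # weed wrongly labelled canola
    fn = np.sum(~pred & truth)    # canola wrongly labelled weed
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    iou = tp / (tp + fp + fn)     # overlap / combined area
    return precision, recall, iou
```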
Table 1: The performance of the models at identifying canola averaged across three field sites in Western Australia.
| Measure | ResNet-18 | ResNet-34 | VGG-16 |
| --- | --- | --- | --- |
| Precision | 84% | 84% | 83% |
| Recall | 87% | 87% | 82% |
| IoU | 76% | 77% | 71% |
The ResNet-34 model performed the best, slightly ahead of ResNet-18 and well ahead of VGG-16.
An important aspect to note is that light conditions have a big impact on how accurate these models can be, so capturing images in a range of field conditions to train these models is important. A simple solution may be to use lights, day and night, while precision spot spraying.
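One common way to toughen a model against changing light, which may or may not be what was done here, is to randomly jitter brightness and contrast during training, for example with torchvision:

```python
from torchvision import transforms

# Randomly vary brightness, contrast and saturation so the model sees
# "dawn", "midday" and "dusk" versions of every training image.
# The jitter ranges are illustrative, not taken from the study.
light_augment = transforms.Compose([
    transforms.ColorJitter(brightness=0.5, contrast=0.4, saturation=0.3),
    transforms.ToTensor(),
])
```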
Summary
This research used an innovative approach for precision spot spraying: identify the crop plants and assign everything else as a weed. It’s both an alternative way of building these clever machines and a faster route to weed-detection algorithms, because a single crop model can stand in for a separate model for every weed species. Watch this fast-moving space.