A drone spots a problem in your field. Now what?

Drone-mounted cameras measuring NDVI reflectance can identify crop problems, but not necessarily the cause. Researchers are looking for other tools that might help fill that gap.

Matt Johnson of M3 Aerial Productions says canopy reflectance tells you where you need to take a closer look.

It’s been more than a few years since drones made their debut in agriculture, but questions about whether they’re worth producers’ time aren’t going away.

Most agricultural drones are set up to measure canopy reflectance using the normalized difference vegetation index (NDVI), a graphical indicator of crop health. Drones carrying GoPro cameras can take NDVI readings that can be analyzed in a matter of hours, offering producers a bird’s-eye view of crop performance.

But is canopy reflectance a practical tool for in-crop management?

The answer is complicated, says University of Manitoba soil science department head Paul Bullock, who delivered a presentation on his NDVI research at a recent seminar.

NDVI measures how much red and near-infrared light a crop reflects. Because a healthy, growing crop absorbs red light to use in photosynthesis, it reflects near-infrared light but not much red, while a struggling crop absorbs less red light and reflects more of it.

NDVI is calculated by subtracting red reflectance from near-infrared reflectance and dividing by their sum, which yields a value between -1 and 1. “The NDVI value gets larger as you get more biomass and green vegetation,” explains Bullock.
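In code, that normalized-difference formula, (NIR - red) / (NIR + red), is a one-liner. A minimal sketch follows; the reflectance values used here are illustrative numbers, not measurements from the article:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Healthy canopy: strong near-infrared reflectance, little red reflected.
print(round(ndvi(0.50, 0.08).item(), 2))  # 0.72
# Struggling canopy or bare soil: red and near-infrared much closer together.
print(round(ndvi(0.30, 0.20).item(), 2))  # 0.2
```

The normalization by the sum is what keeps the index comparable across lighting conditions, since both bands scale together under brighter or dimmer illumination.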

An NDVI map of a barley crop at the University of Tasmania in Australia. Black/purple/dark blue indicate bare ground. Green indicates normal to stressed barley with minimal fertilization, and yellow/red indicates lush/healthy/dense barley with high fertilization. photo: University of Tasmania

Theoretically, NDVI readings could help producers assess which areas of the crop need a “boost,” such as a nitrogen fertilizer application. But Bullock says NDVI represents only a single dimension and can’t offer a specific prescription for a troubled area.

“Maybe that area has herbicide burn, a pest or salinity issues,” said Bullock in his presentation. “At any particular scale, the sensor is pretty reliable at picking up the differences (in crop health), but it’s dumb. It has no clue why the differences are occurring. That’s what’s missing in some of the precision agriculture tools — stripping down through all the reasons why this area isn’t growing as well as everywhere else.

“The canopy reflectance is not measuring N in the leaves, just reflectance.”
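The “where but not why” behaviour Bullock describes can be illustrated with a toy example: a low-NDVI zone stands out against the rest of the field, but nothing in the numbers distinguishes salinity from herbicide burn or pests. The grid values and the flagging rule below are purely illustrative assumptions, not anything from Bullock’s research:

```python
import numpy as np

# Hypothetical 4x4 NDVI grid for a field; values are illustrative only.
ndvi_map = np.array([
    [0.81, 0.78, 0.75, 0.80],
    [0.79, 0.42, 0.38, 0.77],
    [0.80, 0.45, 0.40, 0.79],
    [0.82, 0.78, 0.76, 0.81],
])

# Flag cells well below the field average as "take a closer look" zones.
threshold = ndvi_map.mean() - ndvi_map.std()
flagged = ndvi_map < threshold

print(int(flagged.sum()))  # 4
```

The sensor reliably picks out the four struggling cells in the middle of the grid, but diagnosing the cause still requires ground-truthing.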

Field trials

Bullock is working to address the “dumb sensor” issue by matching field data with canopy reflectance data and looking at whether specific information about the crop can be drawn from canopy reflectance imagery. NDVI is a commonly used vegetation index, but it isn’t the only one — Bullock’s team has plans to work with at least 70 others — and he says it has limitations. His goal is to find out which indices are most robust in representing crop performance, such as biomass and vegetation water content.

In 2016 and 2017 he worked with grad students to gather data on a high-yielding wheat plot, which they then compared with canopy reflectance data gathered via several instruments including a drone-mounted multispectral camera, to see which indices were best at predicting yield and biomass.

“On the ground you have an enthusiastic grad student who is collecting a multitude of crop information on hundreds of plots, and you look at the correlations between the ground measurements and the canopy reflectance from the drone images and say, ‘If we want to know the total above-ground biomass, which index works the best? Then, if you want to look at yield, what would work the best?’”

This summer they hope to perform similar experiments in corn, and eventually canola and a legume, to see if there’s consistency in how canopy reflectance works in different crop types.

“The idea is that if it’s really reliable, you can apply this over larger areas with some confidence that those measurements are useful,” explains Bullock. “As long as you can isolate cropland from satellite images, if you have indices that work, and you can get nitrogen and biomass information, the more value that satellite then has. That’s work not many people have done.”

Ground-truthing still key

Matt Johnson is president of M3 Aerial Productions, a Winnipeg-based company that specializes in NDVI, digital elevation mapping and drone crop scouting.

He believes NDVI data is valuable, but it can’t yet stand alone in helping producers make in-crop management decisions. “There are a lot of articles saying that drone data is not enough. Obviously it isn’t enough,” he says. “You need an understanding of your field, your soil. Canopy reflectance tells you where you need to take a closer look.”

NDVI differs from plain photographic imagery of a field, says Johnson: while an RGB colour camera on a drone gives you an aerial perspective on the crop, NDVI can highlight areas of stress before they show up in crop colour.

“The drone sees the differences. It’s not telling you that there’s a water problem here and a high spot over here so it’s dry, or there are bugs over here, it just shows you that clusters of plants are struggling and from there you can go and take a closer look. You can’t tell from a colour picture, because the plant’s colour may not have changed yet,” he says.

Johnson’s company owns six drones, but also provides training services and can subcontract pilots from a large network across the country. They work with farmers directly, or, more often, are contracted by agronomists to fly for them and then perform analyses by the side of the field. M3 Aerial uses a platform called Senterra to gather NDVI imagery, and their sensor is a single near-infrared band. While not as sophisticated as the multi-spectral imagery that Bullock’s program uses, Johnson says it’s the difference between 90 per cent and 95 to 98 per cent accuracy, and that’s more than what farmers need.

Once aerial data is gathered on a field at 400 feet, drones can be sent in lower, three to four feet above the field, to look for signs of, say, sclerotinia, or overly dry or wet soil, or agronomists can go into the field the old-fashioned way — on foot — to take soil or plant samples.

“We have the ability to make the decision to determine the rate you want to spray if you have variable-rate technology,” Johnson says. “You can make those decisions on-site if you can follow up your flight with further analysis, based on an agronomic approach.”

At $800 to $2,000 per day, hiring a company like M3 Aerial is expensive, but Johnson notes that mapping even a quarter-section takes between 20 minutes and an hour, so it’s time-consuming work. Added to this, non-licensed drone operators aren’t legally allowed to fly above 300 feet, and mapping a field with a quadcopter takes twice as long at 300 feet as at 400 feet. “When you start to look at it you realize there’s a reason why service providers charge the rates they do,” he says.

The next step is for the industry to learn how to use canopy reflectance to identify specific problems and prescribe solutions.

For the time being, though, ground-truthing — whether that means flying low or walking into the field to get a closer look at crop and soil conditions — is still a requirement because the imagery isn’t there yet, Johnson says. “But it’s down the road for sure.”

About the author


Julienne Isaacs

Julienne Isaacs is a Winnipeg-based freelance writer and editor. Contact her at [email protected]
