The next revolution in farming isn’t about chemistry or genetics. It’s about scouting, and the good news is that the enabling technology has come several giant steps closer in just the last two years, thanks to unmanned aerial vehicles (popularly called UAVs or drones) which offer a visual platform for scientific crop monitoring.
Drones are becoming a big piece of the precision ag puzzle, and Dr. Kevin Price of Des Moines, Iowa, says they will completely change how we scout.
“What this technology does is help build a map to look for problem areas in the field,” Price explains. “The scout gets the co-ordinates and walks out into the field to get a first-hand look at what’s going on.”
Drones will also add huge new efficiencies, says agronomist Greg Adelman of Southey, Sask. “While a person out walking the field with a metre stick or GPS can assess 160 acres in an hour, I’m assuming the per-day peak efficiency would be something like 6,000 acres with a fixed-wing UAV.”
Adelman is a Canadian affiliate of Price’s company RoboFlight, a scouting data management company that takes images from UAVs and stitches them together into a composite image called an orthomosaic. Information gleaned from these images will help farmers make snap decisions about managing their crops, while the precision lends them tremendous potential to reduce costly inputs.
This really goes back to the dawn of the aircraft age at the beginning of the First World War. Initially, the primitive biplanes were used for reconnaissance, with camera crews dispatched to monitor enemy lines from the air. They took thousands of pictures and watched for troop movements, changes in the lines or massing of equipment. Command officers used this incoming data to plan both their offence and defence.
Farmers are no strangers to this idea either. Some have used aerial photography and satellite imagery to monitor their crops, although getting this data has often been expensive and inconvenient.
Now, what these drones offer is an inexpensive alternative that can fly whenever the farmer needs it. With the advent of the Global Positioning System, navigation software and smaller, high-resolution digital cameras, the stage is set to incorporate drones into farming.
“Companies build implements designed to do precision applications of chemicals and seeds and, as you’re driving through the field, the tractor is adjusting the rate of fertilizer or herbicide based on geographic co-ordinates that are fed into the sprayers by an on-board computer,” Price says. “If you have a map that tells the tractor where it is and what’s there, then the software can decide whether to turn on the sprayers or not.”
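In software terms, the on-board decision Price describes is essentially a lookup: map the sprayer’s current co-ordinates to a zone in a prescription map and read off the rate. Here is a stripped-down sketch of that idea — the grid cells, cell size and rates are invented for illustration, not taken from any real sprayer controller:

```python
# Prescription map: grid cell -> herbicide rate (litres per acre).
# A rate of 0 means "keep the nozzles off over this cell".
prescription = {
    (0, 0): 0.0,
    (0, 1): 1.2,
    (1, 0): 0.0,
    (1, 1): 2.0,
}

CELL_SIZE_M = 100  # each management zone is a 100 m grid square

def rate_at(easting_m, northing_m):
    """Look up the prescribed rate for the sprayer's position."""
    cell = (int(easting_m // CELL_SIZE_M), int(northing_m // CELL_SIZE_M))
    return prescription.get(cell, 0.0)

def sprayer_on(easting_m, northing_m):
    """Decide whether to open the nozzles at this position."""
    return rate_at(easting_m, northing_m) > 0

print(sprayer_on(150, 150))  # inside cell (1, 1), rate 2.0 -> True
print(sprayer_on(50, 50))    # inside cell (0, 0), rate 0.0 -> False
```

Real controllers work from much finer maps and interpolate between zones, but the core logic — position in, rate out — is the same.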
This kind of precision depends on highly detailed mapping, which is what the drones do. The first drones in agriculture were model airplanes with a camera mounted in a jury-rigged box on the wing. The plane flew along a programmed flight path transmitted from a computer to an on-board GPS sensor. The data directed the plane through a series of points and instructed the camera to snap images along the way.
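The programmed flight path those early model airplanes followed is typically a back-and-forth “lawnmower” pattern of waypoints. This toy sketch shows the idea — the field dimensions and camera swath width are made-up numbers, and a real planner would also account for image overlap and wind:

```python
def lawnmower_waypoints(width_m, length_m, swath_m):
    """Back-and-forth survey pattern covering a rectangular field.

    Returns (x, y) waypoints in metres; the UAV flies each pass
    along the field's length, stepping sideways one camera swath
    per pass and alternating direction.
    """
    waypoints = []
    x, heading_up = 0.0, True
    while x <= width_m:
        start, end = (0.0, float(length_m)) if heading_up else (float(length_m), 0.0)
        waypoints.append((x, start))
        waypoints.append((x, end))
        x += swath_m
        heading_up = not heading_up
    return waypoints

# A 400 m x 800 m strip of a quarter section with a 50 m swath:
route = lawnmower_waypoints(400, 800, 50)
print(len(route), "waypoints, starting at", route[0])
```

The on-board GPS steers the plane through each point in order while the camera fires at set intervals along the way.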
Today’s navigation system is still the same, but the cameras have greater resolution and the current aircraft are a lot more suitable to the task. RoboFlight, for instance, uses the electric-powered RF70 airframe.
“This aircraft is amazing,” Price says. “It’s got multiple bays for mapping units and it will cruise for 45 minutes to an hour and 20 minutes depending on the load. It’s made of high-density EPP so it’s not like beer cooler foam. We’ve taken that plane and crashed it from 200 feet in the air, picked it up and put it back in the air again. It’s highly durable. We’ve flown it in 50-mile-an-hour winds and we’ve had it up to 100 miles an hour with a tailwind and it was still flying stable.”
So that’s what the airplane does. The next part of the package is the camera equipment that it carries in any of the payload bays. This is the farmer’s eye in the sky and can see things we’ve never seen before. In digital imagery, the picture you see is actually made up of thousands of points called pixels. The quality of the image (i.e. the resolution) is a direct result of how small an area of ground is represented in one pixel. In a satellite image each pixel represents about one square metre on the ground at best.
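Those resolution figures come down to simple camera geometry: the ground area covered by one pixel grows with altitude and shrinks with focal length. A rough sketch of the calculation — the camera specs here are illustrative assumptions, not those of any particular RoboFlight sensor:

```python
def ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um):
    """Ground distance covered by one pixel, in metres.

    By similar triangles, ground distance per pixel scales as
    altitude divided by focal length, times the physical size
    of a pixel on the sensor.
    """
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# A hypothetical small-UAV camera: 4.5-micron pixels, 16 mm lens,
# flying at 70 m above the crop.
gsd = ground_sample_distance(altitude_m=70, focal_length_mm=16, pixel_pitch_um=4.5)
print(f"{gsd * 100:.1f} cm per pixel")
```

At these assumed numbers the result lands right around the two-centimetre mark Price cites; fly higher or use a shorter lens and each pixel covers proportionally more ground.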
“Now we’re talking two centimetres,” Price says. “We’re looking at individual plant leaves and we’re able to assess the pigmentation of the plants. I can tell you the geometry of the leaves in three-dimensional space to see which way they’re oriented.”
Not only is the resolution much finer but we’re now able to see different wavelengths of light too. This gives us even more useful information as to what’s going on in that field.
Price recalls one farm client who had Canada thistle in a field, and who spent $4,000 to spray the entire 120 acres to knock the weed out.
“Well, once we got through flying the field a day or two after he sprayed, we could still see Canada thistle,” Price says. “We found that he had only needed to spray 0.6 acre but he had sprayed 120. We were able to map the location of all the plants, and flying the field and processing the data cost $506. He could have easily gone in and spot sprayed and saved himself a tremendous amount of money.”
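The arithmetic behind that anecdote is straightforward. A quick sketch using the figures Price gives — the per-acre cost is simply derived from his totals, everything else comes from the story:

```python
field_acres = 120
blanket_cost = 4000.00      # what the farmer actually spent
infested_acres = 0.6        # thistle area the imagery revealed
imaging_cost = 506.00       # flying the field plus data processing

cost_per_acre = blanket_cost / field_acres        # about $33.33/acre
spot_spray_cost = infested_acres * cost_per_acre  # about $20
spot_total = imaging_cost + spot_spray_cost       # about $526
savings = blanket_cost - spot_total

print(f"${savings:,.2f} saved by spot spraying")
```

Even after paying for the flight and the data processing, spot spraying would have cost roughly an eighth of the blanket application.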
The first UAV cameras were digital units that you could get from any camera store. They were small, they required no film magazine and no motor drive so they were extremely light compared to the old film cameras. Additionally, because the images were digital, they were easily uploaded to a computer, where several images could be stitched together into a composite.
Now we’re sending specialized colour infrared cameras up there to give us eyes that can filter out certain wavelengths and see the world in terms of visible and infrared radiation.
“By putting the two of them together, you compute an index called the Normalized Difference Vegetation Index or NDVI that’s highly sensitive to chlorophyll concentration,” Price says. “Anything that affects the plant, anything that changes the concentration of chlorophyll on the ground, the NDVI will pick it up.”
The NDVI is much more sensitive than our own eyes in detecting some of the subtle differences in the way light is reflected back from the canopy. Changes in the plant’s pigmentation are sometimes the result of stress on the plant, such as nitrogen or water deficiency, disease or insects. The resulting images, Price says, can help farmers make much better management decisions.
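The index Price describes has a simple, well-established form: the difference between near-infrared and red reflectance, divided by their sum. A minimal sketch of the per-pixel calculation — the reflectance values below are invented for illustration, not real sensor data:

```python
# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1.
# Healthy, chlorophyll-rich canopy reflects strongly in the
# near-infrared and absorbs red light, pushing NDVI toward +1.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    if nir + red == 0:
        return 0.0  # avoid dividing by zero over dark pixels
    return (nir - red) / (nir + red)

# Illustrative reflectance values on a 0-to-1 scale:
healthy = ndvi(nir=0.50, red=0.08)    # vigorous canopy
stressed = ndvi(nir=0.30, red=0.15)   # lower chlorophyll
bare_soil = ndvi(nir=0.25, red=0.20)

print(round(healthy, 2), round(stressed, 2), round(bare_soil, 2))
```

In practice the calculation runs over every pixel of the colour-infrared imagery, so a whole field becomes a map of relative plant vigour.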
“It can measure your biomass, it can measure the photosynthesis and it can measure the stress level of the plant,” adds Adelman. “You can actually see where the plant is stressed up to two weeks before you see symptoms. I’ve seen RoboFlight data where they could see nitrogen deficiency two weeks before symptoms showed up, so if you can see it that quickly you can address the problem before it’s showing symptoms and reduce yield loss in that field.”
The third part of the system is the computer power to take the data and quickly put it together into a ready-to-read package.
“That’s what our company is really all about,” Price says. “What we’re doing right now is working with the portals for allowing people to get the data to us in a very efficient manner. Basically you pull the SD card out of the camera, plug it into your computer and your computer will automatically download it to our shop. We process it and have it back to you.”
As the technology matures the data will get better and better and the computer capabilities will improve in step. What this means is that farmers and agronomists will become even better tuned to the behaviour of land on a section-by-section basis. If we can see plants are under stress, in time we hope to develop the algorithms that will tell us why the plants are stressed. We’ll be able to see different types of weeds, different insect pests at work as well as be able to identify specific diseases before they become a major problem. Precision agriculture will become more and more precise.
“This new UAV technology will be like auto steer,” concludes Saskatchewan farmer Brad Hanmer. “Within five years auto steer was mainstream, and I think this is the next step for RTK technology. We can make even more management decisions based on science and less on intuition.”