How technology is spotting our future wheat varieties
Since the seeds were planted last December, Research Assistant Josh Ball has been capturing large and diverse datasets from entire wheat fields within just one hour – a task that would normally take many times longer. His secret is that he focuses not on the crop but on his sharp-eyed drone, and lets the camera watch the wheat.
Achieving greater wheat harvests despite our changing climates requires extensive crop improvement programmes. We do this by monitoring wheat varieties growing in the field, targeting the traits we need and breeding these qualities back into our most productive lines. The challenge is that observing and measuring these visible traits accurately and consistently across large, separate sites is tricky – for a human.
“Together with Prof Simon Griffiths, we have applied machine learning, computer vision, remote sensing, and robotic systems to wheat phenomics to address challenging problems in crop research and crop production,” said Ji Zhou, Group Leader at the Earlham Institute.
Imagine you are looking at one trait – plant height, for example. You’re recording it across hundreds of plants over multiple fields, and you’re doing all of this repeatedly over the growing season.
It would take a huge amount of time, be tough to do accurately by eye, and the occasional human error might creep in. Until recently, however, these visual tests have been the only way researchers could identify important phenotypes – the observable plant traits – causing a bottleneck in the crop improvement process. Now small robot sentinels observe our fields day and night, while keen-eyed drones routinely watch our crops from overhead.
Josh is a drone pilot for the aerial crop analysis project AirSurf. During the growing season, he routinely runs drone flight ‘missions’ over fields to capture thousands of aerial images of the developing crops.
“Normally these things would be done by people walking out into the field and doing it by hand,” said Josh. “We’re helping the biologists save time with high-quality imagery and analysis results quantified through computer vision and machine learning techniques. It’s about accuracy that’s also cost effective.”

What does the drone see?
During its flight, the drone captures two main types of visible features: morphological and colour-based. Morphological features might involve variation in crop size, shape or lodging. Colour-based features, on the other hand, might be standard RGB or infrared spectra. They can also include vegetation indices, which reflect plant transpiration or the effects of drought.
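To make the vegetation index concrete: one widely used example is NDVI, a per-pixel ratio of the near-infrared and red bands. Here is a minimal sketch in Python – the array names and sample values are ours for illustration, not drawn from AirSurf:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index, computed per pixel.

    NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1.
    Healthy, transpiring vegetation scores high; stressed or
    drought-affected plants score noticeably lower.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Guard against division by zero on dark pixels (e.g. shadow).
    return np.where(denom > 0, (nir - red) / denom, 0.0)

# Hypothetical 4x4 patch of a multispectral image over a vigorous canopy.
nir_band = np.full((4, 4), 0.6)
red_band = np.full((4, 4), 0.1)
print(ndvi(nir_band, red_band).mean())  # ~0.71
```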
The drone records all of these features in a series of photographs across the field. These images are taken across the growing season and combined into a detailed orthomosaic – a geometrically corrected aerial map of the fields.
From these maps, a 3D reconstruction is generated, which can be analysed by software from the Zhou lab. The resulting models can digitally determine informative traits such as height or crop cover.
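As an illustration of how such traits can fall out of a 3D reconstruction (a generic sketch under our own assumptions, not necessarily the Zhou lab’s exact pipeline): subtracting a bare-ground terrain model from the crop surface model yields a per-pixel canopy height, which can then be summarised per plot and thresholded to estimate crop cover.

```python
import numpy as np

def plot_traits(dsm: np.ndarray, dtm: np.ndarray,
                cover_threshold_m: float = 0.1) -> dict:
    """Derive plot-level traits from elevation models of one field plot.

    dsm: digital surface model (top of canopy), in metres.
    dtm: digital terrain model (bare ground), in metres.
    """
    canopy_height = dsm - dtm  # per-pixel crop height
    # The 95th percentile is more robust than the maximum to stray points.
    height = np.percentile(canopy_height, 95)
    # Crop cover: fraction of pixels taller than a small threshold.
    cover = float(np.mean(canopy_height > cover_threshold_m))
    return {"height_m": float(height), "cover_fraction": cover}

# Hypothetical plot: a 0.7 m canopy with a bare strip between rows.
dtm = np.zeros((50, 50))
dsm = np.full((50, 50), 0.7)
dsm[:10, :] = 0.02
print(plot_traits(dsm, dtm))  # height ~0.7 m, cover 0.8
```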
Faster recording, more reliable data
The convenience of drone technology is that it can rapidly and accurately capture information by flying an automated grid pattern over the field. It takes relatively little time to complete a full scan of an area, and the drone can easily move between separate sites.
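For a sense of what an automated grid ‘mission’ involves, here is a hedged sketch of generating serpentine waypoints over a rectangular field – the dimensions and helper below are hypothetical, not AirSurf’s actual flight planner:

```python
def grid_waypoints(width_m: float, length_m: float,
                   spacing_m: float) -> list[tuple[float, float]]:
    """Serpentine ('lawnmower') waypoints covering a rectangular field.

    Coordinates are metres from one field corner; a real mission would
    convert these to GPS positions and add altitude and camera triggers.
    """
    waypoints = []
    y, going_right = 0.0, True
    while y <= length_m:
        xs = (0.0, width_m) if going_right else (width_m, 0.0)
        waypoints.extend((x, y) for x in xs)
        y += spacing_m  # pass spacing controls image overlap
        going_right = not going_right
    return waypoints

# Hypothetical 100 m x 60 m trial field with a pass every 10 m.
print(len(grid_waypoints(100.0, 60.0, 10.0)))  # 14 waypoints, 7 passes
```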
Unlike the human eye, the drone’s vision-based imagery is calibrated on each mission to improve scoring consistency in changing natural light.
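One common way to achieve this kind of radiometric calibration (our illustrative sketch; the project’s actual method may differ) is to photograph a reference panel of known reflectance at the start of each mission and scale every band accordingly:

```python
import numpy as np

def calibrate(image: np.ndarray, panel_pixels: np.ndarray,
              panel_reflectance: float = 0.5) -> np.ndarray:
    """Scale raw pixel values to reflectance using a reference panel.

    image: H x W x bands raw capture from one mission.
    panel_pixels: pixels covering a grey panel of known reflectance,
    photographed under the same light as the crop.
    """
    # Per-band mean brightness of the panel under today's illumination.
    panel_mean = panel_pixels.reshape(-1, image.shape[-1]).mean(axis=0)
    # A brighter day raises panel_mean, so the same scene is scaled down:
    # scores stay comparable between missions despite changing light.
    return image * (panel_reflectance / panel_mean)
```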
Once the recording is complete, uploading this information directly from the device removes the need for a person to enter the data manually. The information comes in two formats – 2D imagery and a 3D point cloud representing the whole field – ready for analysis by software developed by scientific programmer Alan Bauer.
“Once I’ve flown for a week, I can take the data back, plug it into the computer and start the analysis,” said Josh. “Because of this we can achieve a relatively quick turnaround, as we do not have to manually input large amounts of data.”
All of these steps save time and reduce the chance of human error. Researchers gain more time to focus on other innovative areas of their work, can record more datasets of field samples, and can be confident in the information they’re investigating.
Informed choices for breeders, better crops for farmers
This faster, more scalable and more reliable approach to information capture is not only convenient – it’s urgently needed.
Changing climates and emerging disease pressures are growing threats to farmers’ yields. When a farmer chooses the varieties of crops they will grow for the year ahead, they rely on scoring information about how different varieties perform under these pressures. With increasing uncertainty around growing season conditions, often the best varieties to invest in for the year ahead are those which are resilient as well as high yielding.
Yet breeding resilience into our highest-performing cultivars takes many years. Breeding companies grow and assess thousands of plants each year in search of these desirable traits. This involves capturing a huge amount of phenotypic and environmental data in the hunt for resilient plants.
Technologies such as AirSurf mean that this search for the wheat lines farmers and consumers need is that bit more streamlined.
“The world needs to develop resilient crops,” said Designing Future Wheat Group Leader Simon Griffiths. “Resilience can be expressed at any time in the growing season, and this technology allows us to track these key moments in crop development.”

Other field automation
The AirSurf drones and software are just one approach to robotised observation of our crops. They can be complemented by other in-field environmental monitoring sensors, such as those in the CropQuant project.
CropQuant involves placing a series of small sensors across a field that can detect factors such as temperature, humidity, soil temperature and soil moisture simultaneously throughout the season, via the Internet of Things-based CropSight system.
“This allows us to track the growth patterns of individual wheat varieties and how they differ based on microclimates across the field,” said Josh. “It is often not feasible to send a person out every day to do the same thing.”
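At its simplest, each in-field node bundles its readings into a timestamped record for a central store to aggregate. A minimal sketch follows – the field names, values and node ID are ours for illustration; CropSight’s real schema will differ:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    """One timestamped observation from an in-field sensor node."""
    node_id: str
    timestamp: float
    air_temp_c: float
    humidity_pct: float
    soil_temp_c: float
    soil_moisture_pct: float

def read_sensors(node_id: str) -> SensorReading:
    # Placeholder values; a real node would query its attached probes.
    return SensorReading(node_id, time.time(), 18.2, 64.0, 15.9, 31.5)

# Logged at a fixed interval, such records let researchers relate
# microclimate differences to the growth of nearby wheat plots.
print(json.dumps(asdict(read_sensors("plot-A3"))))
```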
Across Designing Future Wheat
AirSurf and CropQuant are already being used to assess phenotypic traits in trials of wheat varieties from the Designing Future Wheat programme. These scores are being used in the Breeders’ Toolkit.
More technologies similar to AirSurf and CropQuant are already in line for future deployment. With increasing demand from breeders for high-throughput field monitoring systems, combinations of these emerging automated platforms are being used to complement each other.
“There’s a continuing development of all of these projects and the opportunity to tie all of these technologies together,” said Josh, “so the CropQuant devices in the field can tie in with the AirSurf information to increase the accuracy.”
On being asked what the day-to-day life of a drone pilot is like over the summer season, Josh smiles.
“When you tell your friends and family that you’re in a field flying a drone for a job…” says Josh with a shrug. “Standing in the sunshine is certainly an enjoyable part of my day.”
Josh Ball is part of Ji Zhou’s lab at the Earlham Institute. The Zhou lab develops high-throughput approaches to analysing phenotypic traits through robotics and software development. The Earlham Institute is core funded by the Biotechnology and Biological Sciences Research Council (BBSRC) and a partner of the Designing Future Wheat programme.
