How new tech aims to improve soil carbon measurement
Measuring soil organic carbon accurately has always been difficult.
But with growing interest in agricultural carbon trading, and the wider benefits of managing soil carbon stocks, companies are developing alternative methods of measuring soil organic carbon that meet two key criteria: accuracy and affordability.
Robust measurement of soil carbon stocks – total soil carbon, not just soil carbon percentage – and how it changes over time is critical to providing evidence to governments, food businesses and other corporates that soils really are removing carbon dioxide from the atmosphere and storing it.
Techniques being developed to provide that evidence include sensor technology, improving the throughput and accuracy of physical soil sampling, and advanced data analytics.
Historically, soil organic carbon has been measured through physical laboratory analysis of soils, but there are a number of limitations, according to Jacquie McGlade, chief scientist and co-founder of Downforce Technologies.
Cost and in-field variability are key limitations: how do you measure that variability while keeping costs in check?
Traditionally, variability might be measured by taking a number of cores in one location, but they are often put into one sample bag, explains Prof McGlade.
“That’s counterintuitive, because you’re trying to understand variability and by mixing it, you are losing important information.”
Taking individual samples in a W pattern across a field can give some appreciation of variability, but cost often limits the number of samples analysed, she says.
“Sometimes it can be just one sample per hectare. We’ve done a lot of hyperlocal sampling and you can detect as much variability in 10sq m as the whole field.
“That’s a challenge for sampling, as you can sample carefully and spend a lot of money and not be very clear on what it is representative of,” she suggests.
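The point about mixed sample bags can be illustrated with a small sketch (the carbon readings below are invented for illustration, not field data): analysing cores individually preserves both the mean and the spread, whereas compositing them into one bag keeps only something close to the mean and discards the variability entirely.

```python
import statistics

# Hypothetical soil organic carbon readings (%) from five cores
# taken within one small patch -- illustrative values only.
cores = [1.8, 2.4, 3.1, 2.0, 2.7]

# Analysing each core separately preserves mean AND spread.
mean_c = statistics.mean(cores)
stdev_c = statistics.stdev(cores)

# Mixing all cores into one bag yields a single composite reading:
# approximately the mean, with core-to-core variation lost.
composite = sum(cores) / len(cores)

print(f"individual cores: mean = {mean_c:.2f}%, stdev = {stdev_c:.2f}%")
print(f"composite bag:    {composite:.2f}% (variability information lost)")
```

The spread (here the sample standard deviation) is exactly the information a variability study is after, and it cannot be recovered from the single composite number.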
Variation also occurs within the soil profile vertically.
“As you go down the soil profile, you lose more of the effects of the crop and more of the effects of the microbiome come in. What careful researchers do is take a core and not disturb it and then cut it into 5cm layers in the laboratory.”
She likens some current sampling protocols to trying to reconstruct a photographic image from just a few of its several million pixels.
The answer to improving that resolution lies in harnessing big datasets, according to Prof McGlade.
Using big data
Regular satellite data, for example, is readily available.
“You can tell a lot about crops from satellites, and while it is more difficult to say much about soils, it is possible. Soils have lots of features that can help you learn about carbon.”
It’s not just satellite data. Data taken by proximal sensors, measuring electromagnetic conductivity, gamma radiation and spectral emissions, such as visible near infrared, also provide a lot of useful information, which can be linked to global positioning systems to provide highly accurate measurements, she says.
“Each of these data sources has its own strengths – for example, spectral data is good for soil carbon, while gamma radiation tells you about cations such as potassium.
“However, it is very expensive to get these surveys done unless you own the instrument, so at Downforce we’ve created a data fusion process.
“We take the best data from multiple locations and those stored in public libraries to create global datasets. In some countries, such as the UK, we use national datasets to add even more detail.”
From that, a virtual model – a digital twin – is created, which can be updated with real-time data and uses simulation, machine learning and reasoning to predict how, in this case, soil organic carbon has changed or will change in the field, she explains.
“To do that you need to understand the science of why soils change, and bring in other data such as climate data, to bring it all together in the digital twin.”
The result is a powerful tool, which, she says, can look back using satellite and other data to show how soil carbon and other soil characteristics have changed over time in 10sq m patches of fields every 10 days.
Downforce Technologies is primarily using the tool with supply chains to help assess how farms are doing on the journey to net zero.
Working at such a large scale brings the cost down to about £1/ha, although costs to individual farmers would be higher.
“For the first time we can provide an accurate measure of carbon removals,” claims Prof McGlade.
“Farmers are interested in whether they are net positive or negative, and that leads to questions about how to reduce emissions.
“For the retailer it provides an overall picture of all the farms together and the data to prove whether the supply chain is net zero.”
It also gives an indication of resilience for farmers, she adds.
“So, if it is very hot or dry on land that is not very resilient, you can see it loses soil carbon very quickly – that interannual fluctuation is a strong measure of soil health.”
In Australia, that information is being used by farmers to help obtain business loans, while on a management level the analysis can help determine whether interventions are making a difference in a short time frame.
“Sometimes, within three to six months you can see a difference, but it depends on the soils.”
Initial on-farm results have also shown that rotation, and even different varieties of wheat, can affect carbon build-up in soils.
“We can see that if you do certain things in your rotation you can accumulate carbon in the soil. Sometimes that means the last crop in the rotation has a bumper yield because there is so much more carbon built up.
“So we can become field- and crop-specific about your likely outcomes for soil carbon and soil health.”
For the full article, follow this link to Farmers Weekly