Agricultural Intelligence

February 24, 2025

Artificial Intelligence is touted as the most significant technology development since the internet. But given that plants and animals are not recharged via USB, what might the practical reality of AI be for farming businesses? The Compass pulls out the Magic 8 ball and gives it a shake.

Data, data, data. Barely a day goes by without a headline spruiking new tech underpinned by data. Self-landing rockets, drone warfare, customer service bots and large language models all stem from technology’s improved ability to generate and process reams of data into something useful.

GPS steering is a well-established example. How does a tractor stay on course? An onboard computer processes constant streams of data from the GNSS receiver, inertial measurement unit, sensors and cameras, combines them with the operator’s settings, and sends corrections to the steering system. So much data!
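
For the technically curious, here is a stripped-down sketch (in Python, and nothing like any manufacturer’s actual firmware) of the core idea: work out how far the tractor has drifted from its AB line, blend that with the heading error from the IMU, and turn the result into a steering correction. The gains and readings are invented for illustration.

```python
import math

def cross_track_error(pos, line_start, line_end):
    """Signed perpendicular distance (metres) from the tractor to the AB line.

    Positive means the tractor sits to the right of the line direction."""
    (x, y), (x1, y1), (x2, y2) = pos, line_start, line_end
    dx, dy = x2 - x1, y2 - y1
    return ((x - x1) * dy - (y - y1) * dx) / math.hypot(dx, dy)

def steering_correction(error_m, heading_err_rad, k_xte=0.8, k_heading=1.5):
    """Blend position error and heading error into one steering angle (radians)."""
    return -(k_xte * error_m + k_heading * heading_err_rad)

# One pass of the loop with made-up readings: 0.3 m right of the AB line,
# nose pointing 2 degrees to the right of the line direction.
correction = steering_correction(
    cross_track_error((10.3, 50.0), (10.0, 0.0), (10.0, 100.0)),
    math.radians(2.0),
)
print(f"steering correction: {math.degrees(correction):+.1f} degrees (negative = steer left)")
```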

At the same time as drinking from the data firehose, we never seem to have enough of it. Look at any yield map. Interesting data on the face of it, but why did one part of the paddock run 2 tonnes to the hectare when the rest went 4? Was it disease? Suboptimal germination? Patchy finishing rains? Waterlogging? The lime spreader missing bits three years ago? Without the data, educated guesswork prevails.

What about Artificial Intelligence? Seemingly overnight, AI has sprung up. Surely this will be the panacea for all our problems? Just point AI at an Excel spreadsheet and it should be able to run the farm!

If only it were so easy. Although AI has become the buzzword, the principles behind it are quite old.

Take CBH’s recent partnership with Deimos to develop a grain sample visual analysis machine – the latest attempt in a decades-long quest to reach the holy grail of automated quality assessment.

Underlying visual analysis is a core component of AI: machine learning, a set of techniques that allow algorithms to learn patterns from data and make predictions. For example, to recognise a sprouted wheat kernel, the machine needs to ‘see’ sprouted wheat kernels via sensors and learn how they differ from anything else potentially found in a wheat sample – including a live ladybird, a gum leaf and a sprouted barley kernel. Easy for a human who has spent a lifetime recognising such things – not so easy for a machine that has only recently emerged from the technological womb. Accordingly, ‘training’ AI is a significant development cost.
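
To make the idea concrete, here is a toy sketch (in Python with scikit-learn, and nothing like the actual Deimos pipeline) of supervised learning: hand-label some example objects, describe each one with a few visual measurements, and let a model learn the boundaries between the classes. The features and numbers are all invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Made-up training data: each row is [length_mm, width_mm, hue, shoot_length_mm]
# measured from a photographed object; labels are what a human grader called it.
labels = ["sound wheat", "sprouted wheat", "barley", "foreign material"]
X_train = np.vstack([
    rng.normal([6.5, 3.2, 0.12, 0.0], 0.3, size=(50, 4)),   # sound wheat
    rng.normal([6.5, 3.2, 0.12, 1.5], 0.3, size=(50, 4)),   # sprouted wheat
    rng.normal([8.5, 3.0, 0.15, 0.0], 0.3, size=(50, 4)),   # barley
    rng.normal([12.0, 6.0, 0.30, 0.0], 1.0, size=(50, 4)),  # leaves, insects, etc.
])
y_train = np.repeat(labels, 50)

# 'Training' is where the development cost sits: gathering and labelling examples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# A new object off the conveyor: wheat-sized, with a 1.4 mm shoot visible.
print(model.predict([[6.4, 3.1, 0.11, 1.4]])[0])   # -> "sprouted wheat"
```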

If this seems complex, it is, and that’s the point. For a bespoke purpose such as automated grain quality assessment, the investment needed to solve the myriad challenges has stretched over multiple decades.

Which brings us to business models.

In simple commercial terms, data (or at least, useful data) is good for one thing: making better decisions. And there are three ways to do that:

1. Improve speed
2. Increase accuracy (granular, reliable, precise – you get the drift)
3. Improve efficiency (lower cost)

Tech developers generally trade off between the three.

If farmer Bob ran every sheep in his flock over calibrated weighing scales, he would have very accurate data on sheep weight. But that is hard work and difficult to do often. Instead, what if Bob flew the (fictional) Sheep-Weigh 4000 drone, equipped with laser sensors that measure individual sheep dimensions, over the flock and fed that data into a model that estimates weight per animal to within a 10% margin of error? Less accurate (but accurate enough), faster … but probably not cheaper. Although the idea seems simple, the investment required to overcome the technical challenges would be colossal.
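
For the sake of illustration, here is roughly what the maths behind the fictional Sheep-Weigh 4000 might look like: calibrate a simple model against the scales once, then rely on drone measurements thereafter. Every number below is made up.

```python
import numpy as np

rng = np.random.default_rng(7)

# Calibration day: 200 sheep run over the scales while the drone measures them.
length_cm = rng.normal(110, 8, 200)
girth_cm = rng.normal(95, 7, 200)
weight_kg = 0.35 * length_cm + 0.45 * girth_cm - 15 + rng.normal(0, 3, 200)

# Fit a simple linear model: weight ~ length + girth + constant.
X = np.column_stack([length_cm, girth_cm, np.ones(200)])
coef, *_ = np.linalg.lstsq(X, weight_kg, rcond=None)

# Later, in the paddock: drone dimensions only, no scales.
new_len = rng.normal(110, 8, 500)
new_girth = rng.normal(95, 7, 500)
actual = 0.35 * new_len + 0.45 * new_girth - 15 + rng.normal(0, 3, 500)
est = np.column_stack([new_len, new_girth, np.ones(500)]) @ coef

err = np.abs(est - actual) / actual
print(f"median error {np.median(err):.1%}, worst 5% above {np.percentile(err, 95):.1%}")
```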

The way many big tech companies recover their investment costs is through enormous scale (think Spotify, Amazon, Facebook). The many billions in investment are diluted across many millions, if not billions, of consumers. This model does not work for solutions aimed at small niche markets that require expensive hardware (e.g. laser-equipped sheep-weighing drones).

Although we are already seeing AI features in everyday software tools – such as accounting packages scanning invoices and forecasting cashflow – ‘desktop’ level software generally won’t move the farm profitability needle much.

The big-ticket items will come from the porcupine of sensors and cameras sitting on equipment, generating torrents of agronomic data that demand hefty cloud-based processing muscle to help save tens of thousands in input costs while (theoretically) lifting yields. High value, but high cost.

Agricultural tech companies recognise this issue and are playing with pricing models to avoid sticker shock. For example, a well-known ag company’s US pricing for its precision spraying AI technology includes hardware costs, an annual subscription AND $4 for every acre it travels across. A mix of fixed, time-based and variable pricing – introducing complexity to help disguise the true cost, perhaps?
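
A quick back-of-envelope calculation shows why the mix matters. The $4-per-acre figure is from the example above; the hardware price, subscription and acreage below are placeholders only, so plug in real quotes before drawing any conclusions.

```python
# Rough sketch of the fixed + subscription + per-acre pricing mix.
hardware_cost = 250_000        # one-off purchase (hypothetical)
annual_subscription = 5_000    # per year (hypothetical)
per_acre_fee = 4               # USD per acre travelled (per the example above)
acres_per_year = 10_000        # hypothetical cropping program
years = 5

total = hardware_cost + years * (annual_subscription + per_acre_fee * acres_per_year)
print(f"{years}-year cost: ${total:,.0f} "
      f"(${total / (acres_per_year * years):,.2f} per acre per year)")
```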

The other way to make upfront costs more palatable is bolting the AI gear into the latest and greatest machine, which is already so expensive that another $100k or so is hardly noticeable. If a farmer ends up with the hardware whether they wanted it or not, they might as well pay the yearly subscription to make it work.

It is always wise to park the latest tech excitement bus and weigh up two competing forces: what the information is worth to the farmer’s business model versus what it costs under the tech provider’s business model.

That’s one decision that AI shouldn’t make.