
Smart farming with ‘AI at the edge’

Cambridge Consultants has announced technology that brings artificial intelligence (AI) to the edge of the network, using low-cost, low-power devices to perform complex machine learning tasks.

‘AI at the edge’ is set to enable AI to solve many real-world challenges out in the field. The approach is demonstrated by Fafaza, a precision crop spraying technology that performs plant recognition and individual treatment in real time.

Precision agriculture means harnessing technology to optimise production. It relies on precise granular data at the individual plant level, on the scale of large industrial farms, supporting everything from weed identification to crop health and yield estimation. This understanding can inform real-time actions, for example, the application of herbicide to an individual weed. This is the challenge that Fafaza addresses: deploying AI ‘at the edge,’ on the back of a moving tractor and without the need for connectivity.

Fafaza is designed to spot broadleaved weeds amongst the grass and to treat individual target leaves with herbicide. The system identifies, classifies and applies treatment in real time while moving at tractor speed. The Cambridge Consultants team chose this tough ‘green on green’ challenge to demonstrate the potential of state-of-the-art machine vision and AI.

Although AI techniques have been capable of plant recognition for a number of years, the challenge has been moving from powerful specialist platforms that process data with a delay to processing and acting in real time: this is ‘AI at the edge’. To be technically practical, a system must be fast enough to distinguish and identify plants using ambient light and to apply treatment while the plant is still in view. To be commercially viable, a system must be rugged and affordable.
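As a rough illustration of that real-time constraint, the sketch below estimates the processing budget available per plant. The tractor speed and camera field of view are assumed figures chosen purely for the example; only the 20 frames per second camera rate comes from the system described here.

```python
# Illustrative latency-budget estimate for real-time spot spraying.
# The speed and field-of-view numbers are assumptions for the example,
# not Fafaza specifications.

TRACTOR_SPEED_KMH = 10.0     # assumed forward speed
CAMERA_VIEW_LENGTH_M = 1.0   # assumed length of ground visible to the camera
FRAME_RATE_HZ = 20.0         # camera frame rate cited in the article

speed_ms = TRACTOR_SPEED_KMH * 1000.0 / 3600.0    # ~2.8 m/s
time_in_view_s = CAMERA_VIEW_LENGTH_M / speed_ms  # how long a plant stays in view
frame_interval_s = 1.0 / FRAME_RATE_HZ            # 0.05 s between frames

# Capture, inference and actuation must all complete before the plant
# leaves the field of view.
processing_budget_s = time_in_view_s - frame_interval_s

print(f"Plant in view for {time_in_view_s:.2f} s")
print(f"Processing budget after capture: {processing_budget_s:.2f} s")
```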

Fafaza has been developed to run on off-the-shelf components, including a low-cost camera that can capture images at around 20 frames per second and an AI platform that costs less than US$100. Major processor vendors continue to invest heavily in devices that can run AI inference algorithms, bringing costs down further. These developments are opening up new areas for real-time AI processing in the field, without the need to rely on a communications infrastructure or the cloud.
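To make the capture, classify and act loop concrete, the following minimal sketch shows how such an edge pipeline might be structured on off-the-shelf hardware. The detect_weeds() and fire_nozzle() helpers are hypothetical placeholders standing in for the on-device model and the sprayer interface; this is not the Fafaza implementation.

```python
# Minimal sketch of an edge inference loop: capture a frame, run a weed
# detector, and aim the matching spray nozzle, keeping pace with a ~20 fps
# camera. detect_weeds() and fire_nozzle() are hypothetical placeholders.
import time
import cv2  # OpenCV for camera capture


def detect_weeds(frame):
    """Placeholder for an on-device model returning weed bounding boxes."""
    return []  # e.g. [(x, y, w, h), ...] from a quantised CNN


def fire_nozzle(x, y):
    """Placeholder for the actuator interface that targets a nozzle at (x, y)."""
    pass


def main():
    camera = cv2.VideoCapture(0)  # low-cost USB camera
    frame_interval = 1.0 / 20.0   # 50 ms budget per frame at 20 fps
    try:
        while True:
            ok, frame = camera.read()
            if not ok:
                break
            start = time.monotonic()
            for (x, y, w, h) in detect_weeds(frame):
                fire_nozzle(x + w // 2, y + h // 2)  # aim at the leaf centre
            # Inference plus actuation must finish within one frame interval
            # to keep up with the moving tractor.
            elapsed = time.monotonic() - start
            time.sleep(max(0.0, frame_interval - elapsed))
    finally:
        camera.release()


if __name__ == "__main__":
    main()
```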