Blog - Agri-EPI Centre | precision farming & innovation


The Agri-EPI blog brings you news, opinion and innovation in technological advances in agriculture, horticulture and aquaculture.
It explores precision farming, including engineering, technology and innovation in UK agriculture across crops, land management and livestock, with input from our broad sector membership and academic partners the length and breadth of the UK.
Offering ideas and innovation from national and international projects and initiatives, it's not to be missed!

Autonomous and robotics solutions for agriculture and horticulture

By: Duncan Ross

Farming has been embracing evolving autonomous technology for many years. Milking robots are now commonplace and accounted for 30% of all new installations in the UK in 2015 (Heyden, 2015). By that time, robotic milkers had been adopted by 5% of UK farms (ibid.).

Satellite navigation, variable-rate application of fertiliser and seed, and chemical spraying using “green on brown” (spraying only green weeds identified in stubbles) or, more recently, “green on green” (identifying target weed species in grass leys) are developments built on existing machinery platforms that growers are comfortable with because they are familiar.

The leap to the next level of robotics and autonomy is a step most growers have yet to take, as barriers to adoption, including integration, costs and skills, all hamper uptake. Despite this, agri-tech developers are keen to advance their products’ capability, learning from grower experiences and interactions and breaking down those barriers.

One of the major drivers of robotics adoption is limited access to labour, both seasonal and full-time, with rising wage pressures and competition from other sectors of the economy. This is especially apparent in horticulture, where many operations still require large numbers of people.

There is a vast range of robotics solutions being created, which can be grouped into several types:

Large autonomous platforms that perform the same functions as conventional tractor/implement combinations but without a driver, such as those from John Deere and CNH, with a smaller offering from AgXeed. TAFE are developing an autonomous electric-drivetrain tractor, and Hands-Free Farm have been converting conventional Iseki tractors to autonomous operation during research projects at Harper Adams University, both adapting conventional smaller machinery.

Scouting for incidence of stress (heat, pests, disease, or weeds) has led several companies to develop combined or standalone solutions. Companies are also investigating how to mount sensors on robotic platforms to capture more representative data from pest and spore traps that are currently left in one position in a field.

Weeding, such as Small Robot Company’s combined solution using two separate robots: one to map a field and another to treat it with low-powered lasers. Standalone solutions from NAIO and BAKUS use machine vision and AI to identify and cultivate weeds, whilst Earthrover uses light systems for control. FarmDroid works in a different way: it plants the crop and maps each plant’s precise location so that it can return post-establishment and weed around the plant. Nissan has developed a Duckrobot that swims in paddy fields and removes weeds.
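The seed-mapping approach can be sketched in a few lines: record each plant’s position at drilling, then on a later pass treat only spots that lie outside a keep-out radius around any recorded plant. This is an illustrative sketch only; the coordinates, radius and function names are hypothetical, not FarmDroid’s actual implementation.

```python
import math

# Hypothetical sketch of the seed-mapping idea: positions recorded at
# drilling (metres, in a local field frame) and an assumed 5 cm
# keep-out zone around each plant.
SAFE_RADIUS_M = 0.05

planted = [(0.00, 0.00), (0.00, 0.25), (0.00, 0.50)]  # recorded seed positions

def is_weedable(x, y, plants, radius=SAFE_RADIUS_M):
    """True if (x, y) is far enough from every recorded plant to weed."""
    return all(math.hypot(x - px, y - py) > radius for px, py in plants)

print(is_weedable(0.00, 0.12, planted))  # between two plants -> True
print(is_weedable(0.00, 0.02, planted))  # too close to a plant -> False
```

Because the map is geolocated rather than vision-based, the robot can weed between and around plants even before the crop is visible above ground.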

Spraying of orchards with the GUSS robotic platform, a direct replacement for tractor and driver. Robots that can identify pests and disease using artificial intelligence and on-the-edge processing will allow infected areas to be treated rather than the entire crop, saving significant cost in agrochemicals as well as being more environmentally sustainable.

Crop scouting being developed by Antobot to count fruit numbers in orchards and strawberry tunnels, assessing maturity and yield, providing data beneficial for resourcing of staff and accurate prediction of produce to marketing cooperatives and retail supply chains.

Soil sampling on robotic platforms from E-Nano and GMV NSL will give far greater granularity on soil nutrient status and possibly organic matter content. By precision-mapping a field, a robot can return to the same spot several years later and sample again, to ascertain how regenerative management practices may have improved soil health.

Harvesting is probably the hardest area to crack but also the one where growers most need to save labour. This could be picking top fruit, with developments by Tevel, Octinion, Agrobot and RootAI; soft fruit with SAGA, Dogtooth and Field Robotics; asparagus with Muddy Machines; and broccoli with Earthrover. Currently the degree of computer processing power needed to replicate human hand-eye coordination means all of these platforms are slow compared with existing picker rates, and they need further development and refinement before they can reach parity and be considered a viable alternative to an experienced picker.

However, one area already delivering real labour savings is the logistics platforms being developed by BurroAI, Antobot and Fox Robotics, where harvested fruit is moved around the plantation by a robot, delivered to a central point, with the robot returning to the harvesting location with empty trays. This removes the need for harvest staff to carry fruit to the central point and allows them to do what they do best, which is keep on picking, thus maximising the use of available labour.

Automation and robotics will have a wide impact on the agriculture and horticulture sectors in the future, replacing humans in menial tasks while simultaneously creating higher-skilled jobs attractive to different people. Data capture and processing will give growers far more visibility of their growing crops, providing information for better decision-making on targeted interventions of irrigation, fertiliser, agrochemicals and labour resource. This will enhance the financial and environmental performance of farm businesses and assist the drive towards a net-zero agricultural sector.


Get in touch to find out how we can support you: Go to Project Enquiries or email

Hyperspectral UAV

Agri-EPI Centre has invested in the Hyperspectral UAV.

Compared with multispectral imagery, hyperspectral imagery measures energy in narrower and more numerous bands, giving much more information about the target. Hyperspectral image data is a 3D cube in which each pixel holds a full spectrum across the measured range. Since spectra are as unique as ‘fingerprints’ of a target, hyperspectral imagery can reveal features that multispectral imagery may miss.
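The cube structure is easy to picture in code. This is a minimal sketch using a small synthetic NumPy array; the dimensions and the crude five-band subsample are illustrative assumptions, not the specifications of any real sensor.

```python
import numpy as np

# A hyperspectral image as a 3D cube (rows x cols x bands),
# each pixel holding a full spectrum. Dimensions are arbitrary.
rows, cols, bands = 100, 100, 300
cube = np.random.rand(rows, cols, bands)  # synthetic reflectance values

# The full spectral "fingerprint" of a single pixel:
pixel_spectrum = cube[42, 17, :]          # shape (300,)

# A multispectral image, by contrast, keeps only a handful of
# broad bands; here a crude 5-band subsample for comparison:
multispectral = cube[:, :, ::60]          # shape (100, 100, 5)
print(pixel_spectrum.shape, multispectral.shape)
```

Comparing a pixel’s full spectrum against reference spectra is what lets hyperspectral analysis separate targets that look identical in a few broad bands.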

Hyperspectral imaging technology has been under research for decades and has been demonstrated to be very powerful in many application areas, including agriculture. In recent years especially, with more robust and rugged imaging products embedded on UAV platforms, agri-tech has seen revolutionary improvements.

The HySpex turnkey UAV solution with Mjolnir VS-620 and LiDAR includes all the necessary hardware and software for flight planning, data collection, data processing and calibration. The system is provided with a UAV platform, a 3-axis gimbal mount for the hyperspectral unit with LiDAR, and the corresponding spectral, radiometric and geometric calibration. The geometric calibration includes a sensor model for the VNIR and SWIR hyperspectral sensor heads, sub-pixel co-alignment of the two sensor heads, boresight calibration between the two sensor heads and the internal IMU system, and boresight calibration between the LiDAR unit and the internal IMU system.

There’s a broad application potential, including assisting in the development of products in the following application areas:
• Drought/water/nutrient stress monitoring
• Plant pathogens detection
• Analysis of soil properties/Determination of soil types
• Land mapping
• Yield forecasting
• Land management

UAV System (XQ-1400S BFD HySpex Edition):
1. <25 kg MTOW with Mjolnir and gimbal
2. Up to 25 min flight endurance with 8 kg payload
3. Fitted with high performance GNSS/GPS and IMU to enable data to be captured to high geolocation accuracy
4. Fitted with an advanced 3-axis digital gimbal to compensate for pitching

Sensing System (HySpex Mjolnir VS-620, Velodyne VLP-32C):
1. Fully-integrated, co-aligned hyperspectral visible and near-infrared (VNIR) and short-wave infrared (SWIR) (400–2500 nm) and LiDAR sensors, along with an in-flight data capture and storage system
2. Spectral coverage of 400–2500 nm, with spectral resolution of 3 nm in the VNIR and 5.1 nm over the SWIR range; bit resolution of 12-bit in the VNIR and 16-bit in the SWIR
3. Double-resolution data in the VNIR range
4. High-resolution (0.33 degree) LiDAR sensor, providing a 360° surround view with real-time 3D data
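The quoted resolutions give a feel for how many bands such a system records compared with a typical five-to-ten-band multispectral camera. The arithmetic below assumes the VNIR head covers roughly 400–1000 nm and the SWIR head 1000–2500 nm (a common split; the text above only gives the combined 400–2500 nm range), so the counts are rough estimates, not manufacturer figures.

```python
# Rough band-count estimates from the quoted spectral resolutions.
# Assumed split: VNIR 400-1000 nm at 3 nm, SWIR 1000-2500 nm at 5.1 nm.
vnir_bands = (1000 - 400) / 3.0    # ~200 bands
swir_bands = (2500 - 1000) / 5.1   # ~294 bands
print(round(vnir_bands), round(swir_bands))  # roughly 200 and 294
```

Several hundred contiguous narrow bands per pixel, versus a handful of broad ones, is what underpins the “fingerprint” capability described above.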

The Hyperspectral UAV also has potential as a ground-truthing technology for other technologies and systems.

For information on renting out our technical assets please contact