Gripper pear harvesting robot by Vision+Robotics WUR

Unique interplay between plant knowledge and robotics for innovations in precision fruit cultivation

13 March 2024

The Next Fruit 4.0 project involves around 35 partners all working to advance the development of technological solutions such as robotics for precision fruit cultivation. Key themes for the project include digitalisation, precision crop protection, labour optimisation, robotisation and, above all, profitability. Vision+Robotics researcher Jochen Hemming on the challenges of pruning, the search for a suitable camera, and the first successful robotic harvest of pears.

The Next Fruit 4.0 is a follow-up to PPP Precision Horticulture, a public-private partnership project also known as Fruit 4.0. The Fruit 4.0 project showed what could be achieved in the Dutch fruit cultivation sector through the use of new technology and data management. Wageningen University & Research (WUR) and the Delphy Improvement Centre are now implementing The Next Fruit 4.0 on behalf of stakeholders including the Dutch Fruit Growers Organisation (NFO) and FME (the Dutch employers’ organisation for the technology industry). Researchers from Vision + Robotics, powered by WUR, are making crucial contributions to the project through several work packages. Overall, the project is a broad partnership made up of representatives from the private sector and fruit growers, all working together to improve the sustainability of fruit cultivation and the supply chain, and to maximise yields and minimise costs. A notable feature of The Next Fruit 4.0 is a financial contribution made by the Washington Tree Fruit Research Commission, which is facilitating collaboration with US universities and private sector actors.

Six work packages

The project is divided into six work packages:
1. Sensing
2. Management information
3. Robotisation
4. Preconditions
5. Implementation, economic validation and innovation adoption
6. Innovation circle
In practice, this means detecting trees, branches, fruits and blossoms for the purpose of precision crop protection, for example. It also includes the use of sensor technology to detect stress, disease and pests, and to monitor crops and products (both pre-harvest and post-harvest). And it means using grippers for the roboticised pruning and harvesting of pears in particular, and the pruning of redcurrant bushes.

Grippers for robotic pruning and harvesting

Dr Jochen Hemming, senior research associate in computer vision & robotics at Vision + Robotics, is responsible for the robotisation work package and explains why pears and redcurrants were chosen specifically. “At the global level there’s a relatively strong focus, both scientifically and commercially, on robots for picking apples. But the harvest period in the Netherlands is just six to eight weeks, and in fact more pears than apples are now grown in Benelux. Also, the project is being funded by fruit growers who are members of the NFO, as well as the top sector Horticulture & Propagation Materials, and Dutch industry. So for all those reasons, pears were chosen. As part of the cost minimisation aspect of the project we’re looking at multifunctional applications of robots and grippers, so it makes sense to look at pruning as well as harvesting.

Multifunctional pruning and harvesting robot from The Next Fruit 4.0

Multifunctional robot prototype from The Next Fruit 4.0

“A shortage of qualified workers is making it increasingly difficult to perform both of those tasks. And because the participating fruit growers include redcurrant growers, the scope was broadened to the pruning of redcurrant bushes. The sensorics – meaning the combination of cameras, sensors and grippers – are very complex for both types of pruning, which is why the researchers at The Next Fruit 4.0 are collaborating with researchers at the Digital Orchard programme at OnePlanet.” Hemming is, incidentally, one of the few researchers within Vision + Robotics focusing specifically on robotic arms and grippers. Most of the other researchers focus mainly on machine vision, artificial intelligence (AI) and spectral image analysis.

The search for an affordable camera

As in the Digital Orchard programme, the problem for Hemming is that mature industrial technologies are not always entirely suitable and sufficiently robust for agricultural applications. This applies to robotic arms as well as sensors and computers. Moisture, dust, light (particularly direct sunlight) and temperatures below 5 degrees Celsius pose the biggest challenges. That is an issue because pruning is often done during colder periods with high humidity.

“Roboticising secateurs isn’t the problem. We managed that pretty quickly. But detecting which branches need to be pruned is proving much more challenging than we initially thought. The obvious choice of technology for this is a stereo camera, such as the familiar and affordable Intel RealSense or StereoLabs ZED cameras. These are appropriate in terms of their cost and computing power, but they struggle when it comes to recognising thin objects such as twigs and support wires, and dealing with direct sunlight and moving branches and fruits. Alternative cameras are generally too expensive in this context. Meanwhile, LiDAR sensors generate a lot of data that can’t easily be processed in real time. LiDAR data also doesn’t include information on colour, which is often essential for identifying ripe fruits or the type of wood to be pruned. We looked at and were offered quite a few cameras, such as the apiCAM from the start-up photonicSENS, or Photoneo’s high-resolution 3D camera. But the search for the most suitable camera is still ongoing. What we need are sensorics that can detect and distinguish both depth and colour, and can deal with the aforementioned practical challenges of orchards. Ideally, the camera should be priced at less than €1,000.”
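To illustrate the kind of combined colour-and-depth data such a stereo camera delivers, the minimal sketch below captures a pair of aligned colour and depth frames from an Intel RealSense device using the pyrealsense2 SDK. It is purely illustrative and not part of the project's software; the resolution, frame rate and device choice are assumptions.

```python
# Minimal sketch: capture one aligned colour + depth frame pair from an
# Intel RealSense stereo camera using the pyrealsense2 SDK.
# Resolution and frame rate are illustrative assumptions, not project settings.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
profile = pipeline.start(config)

# Align the depth image to the colour image so each pixel has both values.
align = rs.align(rs.stream.color)
try:
    frames = pipeline.wait_for_frames()
    aligned = align.process(frames)

    depth = np.asanyarray(aligned.get_depth_frame().get_data())  # uint16 raw depth
    color = np.asanyarray(aligned.get_color_frame().get_data())  # uint8 BGR image

    # Convert raw depth units to metres using the sensor's depth scale.
    depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()
    depth_m = depth.astype(np.float32) * depth_scale
    print(depth_m.shape, color.shape)
finally:
    pipeline.stop()
```

The limitations Hemming mentions show up exactly in data like this: thin twigs and support wires often leave holes or noise in the depth image, especially in direct sunlight.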

The extra challenge of pruning currant bushes

To a consumer, pruning a pear tree may seem similar to pruning a redcurrant bush. Industry insiders know that’s not the case, and Jochen Hemming now knows it too. “One of the big issues is that branches belonging to adjacent berry bushes will grow through each other. Identifying the specific structure of a bush is proving challenging in terms of vision and robotics, because the sensorics and underlying algorithms need to determine which branch belongs to which bush. If you can do that, pruning a berry bush is actually easier than pruning a pear tree. The technique used by fruit growers to prune berry bushes is less specific and targeted than the method used for pear trees. From a technological perspective, we’re ready to take on challenges that we wouldn’t have considered possible five to 10 years ago. This is partly because we now have greater computing power and, in particular, because we now have artificial intelligence and deep-learning algorithms.” Hemming is also looking at how the sensor set developed by OnePlanet for the Digital Orchard might help with the development of suitable sensorics.

Decoupling sense and act?

One possible way of decoupling data collection from actual pruning is to do the detection (sensing) in a first pass. The data can then be processed centrally so that you’re not constrained by computing power. Using algorithms and, if necessary, input from pruning experts, you can then generate a pruning prescription map for a second pass. “This way, you solve two challenges at once. You’re less constrained in terms of computing power and data processing, and you no longer have to decide within a split second what to prune and how to do it. The flip side, however, is that circumstances may have changed between the time of observation and the time of pruning. Not so much in terms of the trees or bushes themselves, but in terms of the weather and wind.”
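In its simplest form, such a pruning prescription map could be little more than a list of tree-referenced cut instructions produced in the sensing pass and executed in the pruning pass. The sketch below is a hypothetical illustration of that idea; all class and field names are assumptions, not the project's actual data format.

```python
# Hypothetical sketch of a pruning "prescription map": output of the sensing
# pass, consumed later by the pruning pass. Structure and names are invented
# for illustration only.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple
import json


@dataclass
class CutInstruction:
    branch_id: str                            # branch identified during sensing
    cut_point_m: Tuple[float, float, float]   # cut location in the tree/row frame (metres)
    cut_angle_deg: float                      # approach angle for the secateurs
    reason: str                               # e.g. "crossing branch", "old wood"


@dataclass
class TreePrescription:
    tree_id: str
    row: int
    position_in_row_m: float
    cuts: List[CutInstruction] = field(default_factory=list)


@dataclass
class PrescriptionMap:
    orchard_id: str
    sensed_at: datetime
    trees: List[TreePrescription] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialise for transfer from central processing to the robot."""
        return json.dumps(
            self,
            default=lambda o: o.isoformat() if isinstance(o, datetime) else o.__dict__,
            indent=2,
        )


# Example: one tree with a single cut, flagged by an algorithm or a pruning expert.
pmap = PrescriptionMap(
    orchard_id="example-block-3",            # hypothetical identifier
    sensed_at=datetime(2024, 1, 15, 10, 30),
    trees=[TreePrescription(
        tree_id="row12-tree034", row=12, position_in_row_m=17.0,
        cuts=[CutInstruction("b-07", (0.42, 1.65, 0.10), 35.0, "crossing branch")],
    )],
)
print(pmap.to_json())
```

Because the map is generated offline, the heavy processing and any expert review can happen centrally, and the robot in the second pass only has to locate the referenced branches and execute the cuts.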

Successful robotic harvest of pears

Turning from robotic pruning back to the robotic harvesting of pears: last autumn, researchers and private sector partners successfully harvested the first Conference pears using a robotic arm at Randwijk, WUR’s research site for fruit cultivation. Jochen Hemming spoke about this at the NFO Science Day.

Field trial of the pear harvesting robot prototype in the PPP The Next Fruit 4.0. Please note: this video was recorded as part of a research activity. A supervisor with an emergency stop was assigned and present at all times. Maximum robot speed was limited for safety reasons.

“To pick an apple, the robot arm performs a twisting and pulling motion using its gripper and suction nozzle. But to pick a pear, the robot arm has to perform an upward picking motion. We achieve this with a lifting mechanism integrated into the gripper. It’s a crucial innovation, because it means the robotic arm doesn’t have to perform that lifting movement itself.”

“Speaking of robotic arms, I’ve detected gradually growing interest in agriculture and horticulture among major manufacturers such as ABB Robotics and KUKA. They are open to agricultural applications but still find this a complex environment to navigate. The Japanese company Denso, for example, is enthusiastic and already active in agriculture and horticulture: it has a robot for harvesting tomatoes in greenhouses.”

Industry needs are shifting towards knowledge

Hemming and his team have noticed a gradual shift in what industry and machine manufacturers want, moving away from technology itself and more towards knowledge. “That includes agronomic knowledge such as an awareness of the best pruning strategy. But they also need to understand return on investment calculations with regard to automation and robotisation. And knowledge of crop growth models, digital twins and simulation. This might include a dynamic crop growth model in 3D, and predicting where an apple or pear will grow. I think we at Vision + Robotics, powered by WUR, are the ideal partner for private sector actors in that sense because we can make the connection between the plant, modelling and artificial intelligence, and robotics.”
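To make the return-on-investment point concrete, here is a toy payback-period calculation of the kind growers and manufacturers might run for a harvesting or pruning robot. Every figure is an invented placeholder; only the structure of the arithmetic is the point.

```python
# Toy payback-period calculation for an orchard robot.
# All figures are hypothetical placeholders, not project data.
def payback_years(investment: float,
                  annual_labour_savings: float,
                  annual_operating_cost: float) -> float:
    """Years until cumulative net savings cover the initial investment."""
    net_annual_benefit = annual_labour_savings - annual_operating_cost
    if net_annual_benefit <= 0:
        return float("inf")  # the robot never pays for itself
    return investment / net_annual_benefit


# Example with made-up numbers: a EUR 150,000 robot replacing EUR 40,000 of
# seasonal labour per year, at EUR 8,000 per year in maintenance and energy.
print(f"{payback_years(150_000, 40_000, 8_000):.1f} years")  # -> 4.7 years
```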


dr. J (Jochen) Hemming

Senior Researcher Computer Vision & Robotics in Horticulture
