Autonomy and AI: The next big thing in viticulture?

Fruit & Vine was invited to a vineyard in north Essex to hear from a collaboration working to bring robotics and artificial intelligence to vineyards. Neale Byart reports.

Automation and autonomy are slowly working their way into viticulture, and the next step could well be robotics powered by artificial intelligence, according to the combined bright minds of Extend Robotics and Queen Mary University of London.

These two establishments have teamed up with Saffron Grange Vineyard in north Essex and are working on a three-year plan to bring robots with AI to the vines of Saffron Walden and beyond. The project is funded by Defra’s Farming Futures Automation and Robotics Competition, and Innovate UK.

The vineyard

Saffron Grange Vineyard produces premium quality English sparkling wines in Saffron Walden, Essex. The team takes great pride in marrying traditional viticulture techniques with the very latest technological advances to boost growth and productivity.

Owner Paul Edwards began planting the vineyard in 2008, with the most recent vines going into the ground this summer and more planned for next year. Of the 16 hectares available, 9ha are currently under vine.

The chalk seam upon which Saffron Grange sits is the very same one that runs through the south of England and on to the renowned wine regions of northern France.

The team provided a demonstration of how an operator wearing off-the-shelf VR goggles could manipulate the robot using gestures

The logo for the vineyard features a woolly mammoth – a curious choice of mascot, you might think. In fact, the reason it appears on the company branding is that the remains of a woolly mammoth (tusk and teeth) have been found on land at the edge of the vineyard. “To us, the mammoth represents strength, opportunity, adaptability and durability. Their presence on our land for tens of thousands of years is a reminder of the will needed to survive during dramatic climate and geological change,” Paul told Fruit & Vine.

Explaining why he decided to get involved in this research project, Paul continued: “We expect the vines to last at least 25 years in the ground, giving us plenty of cause to invest in making it an efficient process.

“At Saffron Grange we can now sustain the growth year-on-year with around 30,000 bottles produced from the 2023 harvest and we have planted more vines to allow us to produce 50–60,000 bottles in the coming years.

“This growth is going to mean a significant increase in labour and the costs that are associated with that. We are on track to grow more vines, produce more wine and increase sales and we need to look at how we can drive the business forward, expand and take advantage of new technology to help that.

“As the number of bottles produced increases, the costs do too, which is why I am so happy to be a part of this project to bring robotics and AI into the industry. We are working hard to share every bit of intellectual property needed to make this successful, as well as to understand what changes we need to make in the vineyard to accommodate it. In that regard we have put half a field aside so that we can grow grapes in a slightly different way if needed. I’m really excited about the prospects of the next couple of years and working with the teams from Extend Robotics and Queen Mary University of London.”

Extend Robotics

Extend Robotics is a UK-based start-up specialising in virtual reality-based teleoperation systems for remote manipulation of robots.

The team’s expertise in precision manipulation and perception systems is essential to developing modular robotic hardware with cameras and robotic arms capable of human-equivalent manipulation. Azmat Hossain from Extend Robotics told Fruit & Vine: “We are a robotics company, but also a software developer. We believe that robots are not just hardware or machines; they need to have intelligence and be trained, and this is what we are working on. We are revolutionising the control of robotic systems by building a fully immersive, intuitive and accurate human/robot interface.

“The technology is affordable and easy to integrate and allows operators to intuitively operate robots using immersive gesture control in real time and from anywhere in the world.

“We work with many partners across different industries with an aim to have robots working side-by-side with humans. We are dreamers, innovators and a disruptive group looking at the challenge of building something quite extraordinary, something that can really provide value to the industry and our partners, not just for the UK, but for the world.”

Chang Liu, the CEO of Extend Robotics, added: “We are offering a virtual reality interface that could be used for delivering automation into future farming – in this case viticulture, which is one of the more difficult agriculture-related uses for this technology. The objective for this project is, firstly, to monitor vine growth and plant health to help improve the quality of the wine and, secondly, to develop the virtual reality interface to deliver precision harvesting and pruning and to promote automation in real-world farming.

“Of course, we must combine this with fit-for-purpose custom hardware that can make this all happen. Queen Mary University’s team is working on the monitoring and the hardware development while we are primarily focused on the software side. Delivering automation in the field is a difficult technical challenge with so many complexities and variabilities, but we are looking to develop a robot that almost anyone can use. It is very intuitive: wearing a VR headset, you can operate the robot remotely with gestures and see everything in 3D.

“As the robots are used, initially with a single robot per operator, data can be collected, and this data can then be used to ‘teach’ the robots and drive the automation forwards so that AI can start to automate things further. This will lead to ‘fleet’ operations whereby a single operator can oversee many robots, while keeping the human in the loop. As it is all cloud-based, you could have farmers on the other side of the world operating a fleet of robots on this side, meaning seasonal workers could work year-round on different continents from their home base, or picking could be carried out round the clock, eliminating local labour shortages.”
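
For readers curious what “collecting data to teach the robots” might look like in practice, here is a minimal, purely illustrative sketch – not Extend Robotics’ software – of logging every gesture from a teleoperation session so that operator demonstrations could later be used to train an autonomous picking policy. Every class, method and field name below is hypothetical and invented for the example.

# Purely illustrative sketch, not the project's actual software: logging each
# teleoperation step so that one operator's demonstrations could later be used
# to "teach" an autonomous picking policy. Every name here is hypothetical.
import json
import random
import time

class FakeHeadset:
    """Stands in for a VR headset streaming the operator's hand gestures."""
    def __init__(self, steps=5):
        self.steps = steps

    def session_active(self):
        return self.steps > 0

    def read_gesture(self):
        self.steps -= 1
        # A real system would stream tracked hand poses; random values stand in here.
        return {"x": random.random(), "y": random.random(), "grip": random.random()}

def record_demonstration(headset, log_path="demo_log.jsonl"):
    """One operator, one robot: save every gesture so it can inform automation later."""
    with open(log_path, "a") as log:
        while headset.session_active():
            gesture = headset.read_gesture()
            log.write(json.dumps({"time": time.time(), "gesture": gesture}) + "\n")

record_demonstration(FakeHeadset())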

Queen Mary University of London

The university is bringing its expertise in remote sensing and image spectral analysis to the project, which is critical to its success. Professor Lei Su explained more about its involvement.

“My team is working on instrumentation, particularly AI-integrated spectral analysis. If we placed a glass of red wine and a glass of orange juice in front of a person, they could easily tell the difference. This is our brain carrying out spectral analysis, with our eyes detecting the different colours in the visible light range.

The robot approached a vine, extended the arm toward a bunch of mock grapes, which it grabbed, cut and removed

“If we were to place a glass of water and a glass of ethanol in front of a human, they look identical as they are both clear liquids. What we are working on with AI spectral analysis is a system that can look further than the visible light range at both ends, into the infrared and ultraviolet ranges and beyond. Using AI spectral analysis, the two clear fluids can be clearly identified as either water or ethanol due to their different absorption wavelengths. This precise technology can be used on the grapes to help ensure that they are harvested at exactly the right time for the very best quality wine. It can also be used on the leaves of the vine to look for disease.”
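
To illustrate the principle, here is a minimal, hedged sketch of how such a classifier might work: two synthetic spectra are identical across the visible range and differ only by an infrared absorption dip, and a simple model learns to separate them. The wavelengths, peak positions and choice of numpy and scikit-learn are assumptions made up for the example, not details of the university’s system.

# Toy example: "water-like" and "ethanol-like" spectra look the same in the
# visible band but have different (hypothetical) infrared absorption dips,
# so a simple classifier can tell them apart.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wavelengths = np.linspace(300, 2500, 200)  # nm: ultraviolet through near-infrared

def spectra(ir_peak_nm, n_samples):
    """Flat response in the visible range plus an absorption dip at an infrared peak."""
    dip = 0.4 * np.exp(-((wavelengths - ir_peak_nm) / 60.0) ** 2)
    noise = rng.normal(0, 0.02, (n_samples, wavelengths.size))
    return 1.0 - dip + noise

# Hypothetical absorption peaks: label 0 = water-like, label 1 = ethanol-like
X = np.vstack([spectra(1450, 200), spectra(1700, 200)])
y = np.array([0] * 200 + [1] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))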

Dr Ketao Zhang, senior lecturer in robotics at the university, further explained that the aim of the project is to combine robotics and AI to find solutions to the challenges faced. “Mechanically we have the wheels to allow the robots to move around the farm and the robotic arms to pick and cut the grapes from the vine, but they lack the human touch needed to apply the right amount of pressure to pick them without damage. We have sensors, of course, but we are also focusing on an electronic skin that provides an artificial sense of touch, allowing the robot to ‘feel’ the pressure being exerted. What we have here today is a concept, and we hope to have the first commercial prototype ready in around two years.”

Manipulating the robot

The team provided a demonstration of how an operator wearing off-the-shelf VR goggles could manipulate the robot using gestures – the fruit of a year’s worth of development. The robot approached a vine and extended its arm towards a bunch of mock grapes, which the arm grabbed, cut and removed, all directed by the goggle-wearing operator. He may only have been a couple of metres away, but the point is that he could just as easily have been on the other side of the world.

Saffron Grange owner, Paul Edwards (seventh from left) is pictured with some of his vineyard team, members of the research group at Queen Mary University of London, and Extend Robotics representatives

The prototype obviously needs refinement and further development to realise its potential, but the skill and enthusiasm displayed by the teams from the university and the robotics company are not in doubt. If the progress made in the first 12 months is anything to go by, then by the end of the three-year plan there could be a robot in a field in Essex, operated by a picker anywhere in the world – well before the decade is out.
