Agricultura Scientia 21(2): 1-9 (2024) https://doi.org/10.18690/agricsci.21.2.1

An Autonomous Field Robot FarmBeast – The Field Robot Event 2023 Edition

Gregor POPIČ 1, Urban NAVERŠNIK 1, Jaša Jernej RAKUN KOKALJ 1, Erik RIHTER 2, Jurij RAKUN 2*

1 University of Maribor, Faculty of Electrical Engineering and Computer Science, Koroška cesta 46, 2000 Maribor, Slovenia
2 University of Maribor, Faculty of Agriculture and Life Sciences, Pivola 10, 2311 Hoče, Slovenia

*Correspondence to: E-mail: jurij.rakun@um.si

ABSTRACT

In contemporary agricultural automation, the demand for highly adaptive autonomous systems is rapidly increasing. Addressing this need, we introduce the latest iteration of FarmBeast, an advanced autonomous robot designed for precise navigation and operation within the complex terrain of cornfields. This paper details the technical specifications and functionalities of FarmBeast, developed by a Slovenian student team from the University of Maribor for the international Field Robot Event (FRE) 2023. The enhanced version features significant hardware and software upgrades, including a completely new robotic platform, a multichannel LiDAR system, an Xsens IMU, and advanced algorithms for efficient row navigation and weed removal. These integrated technologies aim to improve the efficiency and reliability of agricultural processes, reflecting the broader trend towards digitization and precision farming. Participation in international competitions like FRE provides a valuable platform for students to apply interdisciplinary knowledge, fostering the development of practical skills and an understanding of the interconnectedness of various scientific disciplines. As highlighted in the results section, FarmBeast performed notably against the 14 other robots, securing top-five finishes in the navigation, plant treatment, and obstacle detection tasks, demonstrating its capabilities in dynamic agricultural settings.

Keywords: precision agriculture, robotics, sensors, algorithms

INTRODUCTION

In the realm of contemporary agricultural automation, the demand for highly adaptive autonomous systems is rapidly increasing (Bogue, 2019). Addressing this need, we introduce the latest iteration of FarmBeast, an advanced autonomous robot engineered for precise navigation and operation within the complex terrain of cornfields. This paper unveils the technical details and functionalities of FarmBeast, an autonomous field robot developed by the Slovenian student team from the University of Maribor for the international Field Robot Event (FRE) 2023. The enhanced version features significant hardware and software upgrades, including a multichannel LiDAR system, an improved power distribution PCB, and advanced algorithms for efficient row navigation and weed removal. The integration of these technologies aims to improve the efficiency and reliability of agricultural processes, reflecting the broader trend towards digitization and precision farming.

The development of autonomous agricultural robots like FarmBeast is part of a broader shift towards precision agriculture, which leverages digital technologies to optimize farming practices (Bose, 2020). Precision agriculture involves the use of various sensing systems, mobile applications, the Internet of Things (IoT), and other technologies to enable selective and precise treatment of crops. This approach leads to significant savings in input materials, reduces environmental impact, and increases crop yields (Rakun, 2022).
FarmBeast is an ongoing student project that started in 2018, based on the initial Cornstar concept dating back to 2008 (Berk et al., 2017). The robot has undergone continuous improvements to enhance its speed, reliability, and robustness. In 2023, the robot base was completely rebuilt, providing a higher degree of stability and usefulness in the field. These enhancements are supported by sophisticated software algorithms that utilize the LiDAR data for navigation and obstacle avoidance (Bernad et al., 2019).

Participation in international competitions like the Field Robot Event (FRE) provides a valuable platform for students to apply interdisciplinary knowledge from fields such as computer science, electrical engineering, mechanical engineering, and agricultural sciences. The integration of robotics in agricultural STEM education not only equips students with practical skills but also fosters an understanding of the interconnectedness of various scientific disciplines (Bernad, Rihter, & Rakun, 2024).

The objectives set by FRE 2023 challenge the participating robots to demonstrate their capabilities in five main tasks, designed to test the robots' navigation, plant treatment, obstacle recognition, and overall performance in dynamic agricultural settings:
• Navigation: robots must autonomously navigate through a maize field, following a specified path while avoiding obstacles and maintaining accuracy and speed.
• Treating (spraying) the plants: robots must navigate and selectively treat plants with a spraying mechanism, demonstrating the ability to recognize gaps where plants are missing.
• Sensing and recognizing possible obstacles: robots are tested on their ability to recognize and classify obstacles such as humans and animals.
• Static and dynamic obstacles: robots must detect and appropriately respond to static and dynamic obstacles while navigating through the field.
• Freestyle: teams showcase their creativity and technical skills by presenting an innovative agricultural application of their robot, judged on originality, technical complexity, and performance (Field Robot Event, 2023).

Based on these objectives, the paper is structured as follows: section two describes the basic hardware and software of the FarmBeast autonomous robot, section three extends this by describing the fundamental algorithmic principles, section four discusses the results, and section five sums up the paper with selected conclusions and guidelines for future work.

MATERIALS AND METHODS

Robotic base

The FarmBeast robot, depicted in Fig. 1, is an advanced agricultural robot designed to automate the weeding and spraying process (Kajbič, 2023). The robot is equipped with a versatile attachment that allows for the targeted removal of weeds through three distinct methods: mechanical mulching, thermal elimination using lasers, and the application of various phytopharmaceutical preparations. Additionally, it includes a separate spraying device for in-row application, mounted on the back of the robot. The robot leverages machine vision and artificial intelligence to localize and identify weed species, optimizing the selection of the appropriate removal method. Although designed for use within a single inter-row space, FarmBeast belongs to a broader category of agricultural robots capable of performing similar tasks across multiple rows simultaneously.
In terms of specifications, the FarmBeast robot features a modular design that allows for easy adjustments and the addition of various components as needed. It is equipped with a wheeled drive system that uses four-wheel Ackermann steering, providing high manoeuvrability, including the ability to turn around its geometric center. The robot is powered by brushless DC motors, which provide sufficient power to climb slopes of up to a 50% incline. Its pneumatic suspension system can handle the robot's maximum load without exceeding 20% of the total suspension travel. The suspended part of the robot, together with the prescribed load capacity, totals 80 kg.

The design process for the FarmBeast robot considered several construction constraints. The robot must maintain a load capacity of 30 kg while ensuring that the suspended mass does not exceed 80 kg. The choice of a wheeled drive with Ackermann steering maximizes manoeuvrability, which is critical for operations in varied and often uneven agricultural terrain. The robot's modular construction allows for simple assembly and reconfiguration to adapt to different tasks, ensuring versatility in various agricultural applications. Additionally, the inclusion of an onboard compressed-air source to power pneumatic actuators is essential for the operation of various agricultural attachments.

Extensive testing of the FarmBeast robot validated its design and functionality. All vital components met the required specifications during real-world tests, confirming the robot's robustness and reliability. The Ackermann steering system provided excellent manoeuvrability, allowing the robot to perform precise operations even in confined spaces. The pneumatic suspension system effectively maintained stability and traction, which is crucial for operations on uneven ground. Furthermore, the modular design and attachment mechanism proved effective, allowing for quick and secure changes of various agricultural tools.

The FarmBeast robot exemplifies a significant advancement in agricultural automation, combining machine vision, AI, and modular design to offer a flexible, efficient, and environmentally friendly solution for weed control. Its innovative features and robust design make it a valuable tool in modern precision farming, enhancing productivity while reducing environmental impact.

Figure 1: FarmBeast, an autonomous field robot
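To make the steering concept concrete, the short sketch below computes per-wheel steering angles for a symmetric four-wheel Ackermann layout, in which the rear wheels mirror the front ones so that the instantaneous center of rotation lies level with the robot's geometric center. The wheelbase, track width, and function name are illustrative assumptions, not FarmBeast's actual dimensions or code.

```python
import math

def four_wheel_ackermann(radius_m, wheelbase_m=0.8, track_m=0.6):
    """Per-wheel steering angles (degrees) for a symmetric four-wheel
    Ackermann layout; all dimensions are illustrative placeholders."""
    half_base = wheelbase_m / 2.0
    r = abs(radius_m)  # turn radius measured to the robot's geometric center
    inner_r, outer_r = r - track_m / 2.0, r + track_m / 2.0
    sign = 1.0 if radius_m >= 0 else -1.0  # positive radius = left turn
    inner = sign * math.degrees(math.atan2(half_base, inner_r))
    outer = sign * math.degrees(math.atan2(half_base, outer_r))
    # For a left turn the left-hand wheels are the inner ones; the rear
    # wheels mirror the front so the turn center stays level with the
    # robot's center, enabling in-place rotation as the radius shrinks.
    left, right = (inner, outer) if radius_m >= 0 else (outer, inner)
    return {"front_left": left, "front_right": right,
            "rear_left": -left, "rear_right": -right}

print(four_wheel_ackermann(1.5))   # gentle left turn
print(four_wheel_ackermann(-3.0))  # wider right turn
```

As the commanded radius shrinks toward zero, the wheel angles approach the configuration needed to rotate in place, which is the behaviour described above.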
Sensors

The FarmBeast robot is a complex device equipped with multiple sensors, including encoders for wheel odometry, an Xsens MTi 610 IMU, a Velodyne VLP-16 multichannel LiDAR, and a RealSense 435if RGBD camera. The core of the robot's navigation system, however, relies on the synergy between two pivotal sensors: the Velodyne VLP-16 LiDAR and the Xsens MTi 610 IMU. These sensors form the backbone of the robot's sensing and navigational intelligence, enabling it to respond accurately to terrain challenges and the variable demands of precision agriculture competitions.

Velodyne VLP-16 LiDAR sensor

The FarmBeast relies on the Velodyne VLP-16 LiDAR sensor for its core spatial perception capabilities. Using laser ranging to create a semi-three-dimensional map of its surroundings, the sensor provides high-definition object and surface detection crucial for navigating through dense rows of corn. With the ability to capture 300,000 points per second, the VLP-16 is essential for dynamic agricultural applications where reliability and precision are key to success.

Figure 2: The front of the FarmBeast robot with a mounted Velodyne VLP-16 multichannel LiDAR

Xsens MTi 610 IMU

Complementing the LiDAR, the Xsens MTi 610 IMU delivers essential information about the FarmBeast's orientation and motion. Combining sensors for angular velocity, acceleration, and magnetic orientation, the IMU plays a crucial role in precise steering and stabilization during critical manoeuvres. The Xsens MTi 610 is fundamental for executing turns in the patterns determined by the competition organizers, ensuring that the robot maintains the correct orientation relative to the complex geometry of the fields.

Figure 3: The Xsens IMU unit

Together, these sensors allow the FarmBeast to navigate autonomously and adapt swiftly to the abrupt environmental changes characteristic of competitive scenarios and real-world agricultural applications. This lays the groundwork for an in-depth discussion of the innovations and technical solutions that FarmBeast brings to the field of agricultural robotics.

Field Robot Event 2023 - Tasks Overview and Scoring System

Figure 4: A map of the FRE field presenting the structure, plants and dimensions, with the red arrows indicating the selected driving pattern (source: FRE 2022 tasks description, www.fieldrobot.com)

This section briefly describes the focus of each task and presents the scoring system used for it. All 4 + 1 tasks had to be performed in the field, with specific goals set by each task. Fig. 4 presents basic information about the competition field, while Fig. 5 presents an image of the actual competition and training fields from FRE 2023.

Figure 5: An actual test field at FRE 2023, held at the Faculty of Agriculture and Life Sciences, University of Maribor, Slovenia

Task 1: Navigation
Objective - Robots navigate autonomously through a maize field, following adjacent rows and a specific turning pattern after track 5.
Scoring - Distance travelled along the given path; a bonus factor for reaching the field's end in less than 3 minutes; penalties for crop damage (2% of the total row length per damaged plant).

Task 2: Treating (spraying) the plants
Objective - Robots navigate through the maize field, spraying plants when detected and stopping in areas without plants.
Scoring - Points for detecting empty regions and for the total distance travelled; a bonus for actual spraying accuracy, evaluated using water-sensitive paper (WSP) with weights based on WSP dryness; penalties for crop damage and false positive detections (2% of the total row length per damaged plant).

Task 3: Sensing and recognizing possible obstacles
Objective - Robots detect and classify obstacles (deer, human, unknown) from images placed in front of them. These images were submitted by all the competing teams, with only one image per team then selected at random. This ensured fairness, as each robot encountered at most one "familiar" image, while the rest had to be classified by its AI.
Scoring - 5 points for a correct classification (true positive); -5 points for an incorrect classification (false positive).

Task 4: Static and dynamic obstacles
Objective - Robots navigate the field while detecting and responding to static (deer) and dynamic (human) obstacles.
Scoring - Points for the path travelled (0.5 points per unit distance); 10 points for each successful detection of an obstacle; -10 points for each unsuccessful detection.

Task 5: Freestyle
Objective - Teams showcase their robot's capabilities in a creative performance related to agricultural applications.
Scoring - Points awarded for the agronomic idea (originality), technical complexity, and robot performance (0-10 points each). The total points were calculated by:

$\text{Total points} = P_{1,\text{originality}} + P_{2,\text{technical complexity}} + P_{3,\text{performance}}$

Overall scoring

Points from each of the first four tasks are combined, with each task contributing up to 25% of the points for the overall assessment. Points for each task are calculated based on the ratio of the points won by the team to the points won by the winning team, adjusted to avoid negative scores.
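The rules summary above does not spell out how negative task scores are adjusted. The sketch below shows one plausible reading of the normalization, in which each of the first four tasks contributes up to 25 percentage points, scaled by the team-to-winner ratio; clamping negatives to zero is our assumption.

```python
def task_contribution(team_points, winner_points, task_weight=25.0):
    """One plausible reading of the overall scoring rule: each task adds
    up to `task_weight` percentage points, scaled by the team-to-winner
    ratio. Clamping negatives to zero is an assumption; the rules only
    state that negative scores are avoided, not how."""
    if winner_points <= 0:
        return 0.0
    return task_weight * max(team_points, 0.0) / winner_points

# Hypothetical example: 80 points in a task whose winner scored 100
# contributes 20 of a possible 25 percentage points overall.
print(task_contribution(80, 100))  # -> 20.0
```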
ALGORITHMS

To achieve significant results, the robot must first navigate within a semi-predetermined field of corn. For this purpose, two main programs were developed: one for navigation through the field and another for turning the robot in specific patterns determined by the competition judges a few minutes before the event.

The navigation program decides whether the robot's position needs correction by moving left or right within the maize rows. It begins by collecting data from the Velodyne VLP-16 LiDAR sensor. These data points are filtered before use in the main navigation program through multiple layers of filters based on the RANSAC work of Kuramin (2023). First, noise is removed using a voxel grid, which downsamples the original cloud by combining points within a specified volume. Next, surface normals are computed to perform ground-plane segmentation, and RANSAC plane detection is additionally used to enhance ground detection with minimal jitter.

The filtered point cloud must then be limited to the area the robot should attend to. For this purpose, two regions shaped as trapezoidal prisms were defined, fitting the dimensions of the competition rows. These regions are placed where the middle of a corn row is expected, and the remaining point cloud points within them are assumed to belong to the corn rows, allowing the robot's required trajectory to be computed. The whole approach is summarized by the flowchart in Fig. 6.

Figure 6: Flowchart of the navigational algorithm

First, the number of points remaining in each of the two regions is counted and compared to a minimum point threshold determined empirically. Based on these comparisons, different modes of operation are activated. If both regions contain points, the average x and y positions of the points in each region are calculated, providing two center points that determine the robot's direction. The path correction logic is simple: a corridor is defined within which the robot can move straight, determined by the space available in the row for lateral movement. Given the robot's large size, this corridor was made small, roughly 4 cm.

If only one trapezoidal prism region has enough points, it indicates either a gap in the corn on one side or that the robot is at the outer limit of the maize with corn on only one side. In this case, the center of the points in the populated region is mirrored to the other side to simulate both sides being full of points, allowing the robot to proceed accordingly.

If both regions lack points, this suggests the end of a row. However, the robot should not assume it is at the end of a row the moment both regions fall below the threshold, as this can also occur mid-row. To address this, a counter is implemented: if the robot detects no points in either region for 40 consecutive cycles, it is assumed to be at the row's end; if points are detected in either region during any cycle, the counter resets. Upon reaching the end of the row, the robot switches to the turning program.
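A condensed sketch of this pipeline is given below. It uses Open3D for the voxel-grid downsampling and RANSAC ground-plane removal, and plain NumPy for the two detection regions, which are simplified here to axis-aligned boxes instead of trapezoidal prisms; the normals-based segmentation step is omitted, and all thresholds, box dimensions, row spacing, and names are illustrative assumptions rather than the team's actual ROS implementation.

```python
import numpy as np
import open3d as o3d

VOXEL = 0.05            # m, voxel-grid leaf size (assumed)
MIN_POINTS = 30         # empirical per-region point threshold (assumed)
END_OF_ROW_CYCLES = 40  # consecutive empty cycles before declaring row end
DEADBAND = 0.02         # m, half of the ~4 cm straight-drive corridor
ROW_SPACING = 0.75      # m, nominal inter-row distance (assumed)

# Regions where the two crop rows are expected, in the sensor frame
# (x forward, y left), given as (min_xyz, max_xyz) corners.
LEFT_BOX = ((0.2, 0.15, -0.2), (1.2, 0.55, 0.8))
RIGHT_BOX = ((0.2, -0.55, -0.2), (1.2, -0.15, 0.8))

empty_cycles = 0  # counts consecutive cycles with both regions empty

def filter_cloud(pcd):
    """Downsample the cloud, then drop the dominant RANSAC plane (ground)."""
    pcd = pcd.voxel_down_sample(VOXEL)
    _, ground_idx = pcd.segment_plane(distance_threshold=0.05,
                                      ransac_n=3, num_iterations=200)
    return np.asarray(pcd.select_by_index(ground_idx, invert=True).points)

def points_in(points, box):
    lo, hi = np.asarray(box[0]), np.asarray(box[1])
    return points[np.all((points >= lo) & (points <= hi), axis=1)]

def navigate(pcd):
    """One control cycle: returns 'left', 'right', 'straight' or 'end_of_row'."""
    global empty_cycles
    pts = filter_cloud(pcd)
    left, right = points_in(pts, LEFT_BOX), points_in(pts, RIGHT_BOX)

    if len(left) < MIN_POINTS and len(right) < MIN_POINTS:
        empty_cycles += 1  # possibly just a gap, so wait before turning
        return "end_of_row" if empty_cycles >= END_OF_ROW_CYCLES else "straight"
    empty_cycles = 0

    if len(left) >= MIN_POINTS and len(right) >= MIN_POINTS:
        mid_y = (left[:, 1].mean() + right[:, 1].mean()) / 2.0
    elif len(left) >= MIN_POINTS:
        # One side only: synthesize the missing row one row-spacing away,
        # one plausible reading of "mirroring" the populated side.
        mid_y = left[:, 1].mean() - ROW_SPACING / 2.0
    else:
        mid_y = right[:, 1].mean() + ROW_SPACING / 2.0

    if abs(mid_y) <= DEADBAND:  # row center close enough to robot center
        return "straight"
    return "left" if mid_y > 0 else "right"  # steer toward the row center
```

On the robot this logic runs inside a ROS node on every LiDAR scan; here it is reduced to a single function for clarity.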
Turning of the robot

For turning, we use the Xsens IMU. The sensor provides orientation data as quaternions (Hamilton, 1844), but due to time constraints and for easier comprehension, we initially implemented the program using Euler angles (Euler, 1776). Moving forward, we plan to switch to quaternions to make the code more concise.

The competition requires the robot to turn in a specific pattern, such as alternating left and right turns, but this pattern can change after a few iterations; the specific pattern is provided by the event organizers shortly before the task begins. Our turning code is divided into two 90-degree segments so that the robot can skip rows if needed: an initial 90-degree turn, after which the robot can skip one or two rows before completing the second 90-degree turn.

When the robot receives a turn command, it first gathers data from the IMU, converts it to Euler angles, and stores this orientation in a variable. Based on this value, we calculate the target orientation for a 90-degree turn to the left or right. For instance, if the current orientation is 130 degrees, a left turn would target 40 degrees and a right turn would target 220 degrees. The robot checks the required direction and monitors for the corresponding orientation. To address occasional inaccuracies, we added a tolerance of ±5 degrees; when the robot is within this tolerance, the turn is considered complete. Any slight offset in orientation is then corrected by the navigation algorithm. The offset calculations were straightforward for the second and third quadrants but more complex for the first and fourth quadrants due to the 0-360 degree transition, necessitating a more careful algorithm.
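As an illustration of that transition problem, the sketch below implements the angle bookkeeping for one 90-degree segment using a normalized signed difference, which handles the 0/360-degree seam uniformly in all quadrants. The function names are ours; only the ±5-degree tolerance and the 130-degree example come from the text.

```python
TOLERANCE_DEG = 5.0  # from the text: within +/-5 degrees ends the turn

def normalize(angle_deg):
    """Map any angle to (-180, 180], removing the 0/360-degree seam that
    made the first- and fourth-quadrant cases awkward."""
    return (angle_deg + 180.0) % 360.0 - 180.0

def turn_target(current_deg, direction):
    """Target heading for one 90-degree segment, using the paper's
    convention (130 deg + left -> 40 deg, 130 deg + right -> 220 deg)."""
    delta = -90.0 if direction == "left" else 90.0
    return (current_deg + delta) % 360.0

def turn_done(current_deg, target_deg):
    """True once the heading is within tolerance of the target, measured
    on the shortest arc so wrap-around (e.g. 358 vs 2 deg) is harmless."""
    return abs(normalize(target_deg - current_deg)) <= TOLERANCE_DEG

# Worked example from the text: start at 130 degrees, turn left.
target = turn_target(130.0, "left")  # -> 40.0
print(turn_done(36.0, target))       # True: within +/-5 degrees
print(turn_done(50.0, target))       # False: keep turning
```

Two such segments, with optional row skipping between them, compose a full headland turn as described above.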
State machine

An autonomous robot like FarmBeast runs multiple algorithms simultaneously, making it challenging to track which algorithms are active at any given time and potentially causing interference between programs. The solution to this issue is a state machine, which monitors state transitions and identifies the active programs. This approach also allows for a predefined sequence of algorithms, preventing unintended events and ensuring seamless coordination among the various functions.

We used a ROS-based state machine called SMACH (ROS documentation, 2023), which offers graphical monitoring of the active state, simplifying the tracking of active states and programs. The state machine updates ROS parameters, enabling interaction between the different programs. At the end of each state, the current state variable is set to false, deactivating all algorithms used in that state, while the next state variable is set to true, activating the subsequent algorithms. This setup significantly reduces the risk of the robot switching to an unwanted algorithm, such as initiating a turn in the middle of a row or activating the YOLO algorithm during a turning manoeuvre.

Object detection and safety

Object detection is performed using the YOLO (You Only Look Once) algorithm (Redmon et al., 2016), a pioneering real-time object detection system highly regarded in computer vision. YOLO executes the entire object detection process in a single pass through the neural network, providing fast and accurate detection of objects in images or videos. By dividing the input image into a grid and predicting bounding boxes and class probabilities for each grid cell, YOLO minimizes computational redundancy, achieving remarkable efficiency without compromising accuracy. The basic principles of YOLO are illustrated in Fig. 7.

Figure 7: The basic principles of the YOLO algorithm (Analytics Vidhya, 2021)

YOLO's real-time performance has made it a cornerstone technology in applications such as surveillance, autonomous vehicles, and augmented reality. Each version of YOLO has introduced enhancements in speed, accuracy, and versatility, solidifying its status as a preferred solution for object detection tasks.

Effectively deploying the YOLO algorithm requires training the neural network on a comprehensive database of images to derive accurate weights. This involves a substantial dataset of images containing the target objects, meticulously annotated to delineate the objects precisely, along with a smaller set of test images to evaluate the accuracy of the trained weights. Although time-intensive, this preparatory phase is crucial for YOLO's efficacy. Once the weights are obtained through training, they can be integrated with the YOLO algorithm, enabling swift and accurate object detection in both images and video streams.

Initially, we searched online for available image databases to save time. We aimed to differentiate between people and deer, so we gathered a set of images for each and annotated the objects. We used approximately 800 images per category across three detection categories: people, deer, and other; images without deer or people were labelled with the third option.

Our detection pipeline runs alongside the driving algorithm together with wall detection, as the images to be classified were positioned on a flat surface. We used the LiDAR sensor to detect this surface, pausing the driving algorithm momentarily for the YOLO analysis. To ensure a correct classification, we repeated the YOLO process five times. After classification, a speaker on the robot audibly announces the identified class. Depending on the class, the robot either continues driving once the object is removed or starts driving backward.
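The paper does not state which YOLO implementation or version was used; purely as an illustration of the repeat-and-vote step, the sketch below uses the Ultralytics Python API as a stand-in detector, with placeholder weight and image paths. On the robot, each of the five runs would see a fresh camera frame, so the votes can genuinely differ.

```python
from collections import Counter
from ultralytics import YOLO  # stand-in detector; actual tooling not stated

def classify_with_votes(model, image, n_runs=5):
    """Repeat detection and majority-vote the label, mirroring the
    five-repetition scheme described above. Class names come from the
    model; runs with no detection fall back to the 'other' category."""
    votes = []
    for _ in range(n_runs):
        result = model(image, verbose=False)[0]
        if len(result.boxes) == 0:
            votes.append("other")
            continue
        # Keep only the highest-confidence detection from this run.
        cls, _ = max(zip(result.boxes.cls.tolist(),
                         result.boxes.conf.tolist()), key=lambda t: t[1])
        votes.append(result.names[int(cls)])
    label, count = Counter(votes).most_common(1)[0]
    return label, count / n_runs

# Hypothetical usage; the weight file and image path are placeholders.
model = YOLO("farmbeast_obstacles.pt")
label, agreement = classify_with_votes(model, "obstacle_photo.jpg")
print(f"Detected '{label}' ({agreement:.0%} of runs agree)")
```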
RESULTS

The FarmBeast robot participated in the Field Robot Event 2023, competing in five distinct tasks against 14 other international teams and their robots. Each task was designed to test specific capabilities of autonomous agricultural robots, including navigation, plant treatment, obstacle recognition, and freestyle performance. The following sections detail FarmBeast's performance in each task in real-world operation, compared to the other robots, with points awarded for each placement. Videos of the competing robots from days one, two, and three of FRE 2023 are listed in the reference section and present the competition in greater detail.

Task 1: Navigation

In Task 1, the robots were required to navigate autonomously through a maize field, following a predetermined pattern. The objective was to cover as much distance as possible within three minutes, with the scoring system penalizing any damage to crops. FarmBeast achieved 4th place with a total of 80 points. The results of the top five teams for Task 1 were:
• Carbonite Schülerforschungszentrum Überlingen: 1st place (100 points)
• FREDT Technische Universität Braunschweig: 2nd place (90 points)
• Wageningen Robatic Bulls Eye: 3rd place (85 points)
• FarmBeast FKBV: 4th place (80 points)
• CERES Team and Team FloriBot Hochschule Heilbronn: 5th place (75 points)

Task 2: Treating (spraying) the plants

Task 2 required robots to autonomously navigate the maize field, spraying plants when detected and stopping the treatment in areas without plants. Points were awarded for accurate detection and treatment, with penalties for any missed detections or crop damage. FarmBeast secured 5th place with a total of 75 points. The top five results were:
• Carbonite Schülerforschungszentrum Überlingen: 1st place (100 points)
• Wageningen Robatic Bulls Eye: 2nd place (90 points)
• FREDT Technische Universität Braunschweig: 3rd place (85 points)
• Karlsruhe KAMARO Betelgeuse: 4th place (80 points)
• FarmBeast FKBV: 5th place (75 points)

Task 3: Sensing and recognizing possible obstacles

In Task 3, robots had to detect and classify obstacles (a deer, a human, and other objects) from a set of images placed in front of them. Points were awarded for correct classifications, with penalties for misclassifications. FarmBeast shared 4th place with Carbonite Schülerforschungszentrum Überlingen, both scoring 80 points. The competition was particularly tight in this task, with multiple teams performing well:
• Karlsruhe KAMARO Betelgeuse: 1st place (100 points)
• TH OWL: 2nd place (90 points)
• Milano Grasslammer, Osnabrück Team Acorn Acorn, TU Denmark DTU Maizerunners Thomas, and Wageningen Robatic Bulls Eye: 3rd place (85 points)
• Carbonite Schülerforschungszentrum Überlingen and FarmBeast FKBV: 4th place (80 points)

Task 4: Static and dynamic obstacles

Task 4 focused on safety, requiring robots to navigate the field while detecting and responding to static and dynamic obstacles. The difference between the two was that dynamic obstacles were removed after detection, allowing the robot to continue, while static obstacles remained in place, requiring the robot to reverse and continue in the next row. Points were awarded for successful obstacle detection and avoidance, with penalties for failures. FarmBeast achieved 5th place, scoring 75 points. The top five teams in this task were:
• TU Denmark DTU Maizerunners Thomas: 1st place (100 points)
• FREDT Technische Universität Braunschweig: 2nd place (90 points)
• Team FloriBot Hochschule Heilbronn: 3rd place (85 points)
• Wageningen Robatic Bulls Eye: 4th place (80 points)
• FarmBeast FKBV: 5th place (75 points)

Task 5: Freestyle

In the freestyle task, teams were invited to showcase their robot's capabilities in a creative and application-oriented performance. Points were awarded based on the agronomic idea, technical complexity, and robot performance. FarmBeast did not participate in the freestyle task due to technical issues. The top performers were:
• Wageningen Robatic Bulls Eye: 1st place (100 points)
• Karlsruhe KAMARO Betelgeuse: 2nd place (90 points)
• Osnabrück Team Acorn Acorn: 3rd place (85 points)
• FREDT Technische Universität Braunschweig: 4th place (80 points)
• CERES Team: 5th place (75 points)

Overall results

Combining the scores from all tasks, FarmBeast secured 5th place overall in the competition with a total of 310 points.
The overall rankings are as follows:
• Carbonite Schülerforschungszentrum Überlingen: 1st place (450 points)
• Wageningen Robatic Bulls Eye: 2nd place (440 points)
• FREDT Technische Universität Braunschweig: 3rd place (435 points)
• TU Denmark DTU Maizerunners Thomas: 4th place (435 points)
• FarmBeast FKBV: 5th place (310 points)

CONCLUSION

FarmBeast's performance across the various tasks highlighted its strengths in navigation, plant treatment, and obstacle detection. However, there is room for improvement in future competitions, particularly in the freestyle category.

The rapid advancements in agricultural robotics, as demonstrated by FarmBeast, underscore the pressing need for continued innovation in precision farming technologies. These technologies not only enhance productivity and efficiency but also reduce the environmental impact of agricultural practices. As the field evolves, it is crucial to develop new study programs and job profiles that cater to the unique demands of precision agriculture. Moreover, an educational system that integrates robotics, computer science, engineering, and agricultural sciences is essential to equip the next generation of professionals with the skills needed to drive this transformation. Interdisciplinary programs and hands-on training through participation in competitions like the Field Robot Event can provide students with invaluable experience, preparing them for future roles in this dynamic sector.

By fostering collaboration between academia, industry, and agricultural practitioners, we can ensure that the development of precision farming technologies continues to meet the challenges of modern agriculture. This holistic approach will not only advance the field of agricultural robotics but also contribute to sustainable farming practices that can meet the global food demands of the future.

REFERENCES

1. Analytics Vidhya. (2021). Implementation of YOLOv3 Simplified. Retrieved from https://www.analyticsvidhya.com/blog/2021/06/implementation-of-yolov3-simplified/
2. Berk, P., Bernad, P., Brinšek, Ž., Cebe, M., Cimerman, J., Rakun, J., & Vajngerl, Ž. (2017). Cornstar. In H. Floto & H. W. Griepentrog (Eds.), Proceedings of the 14th Field Robot Event 2016, Gut Mariaburghausen, Haßfurt, Germany, June 14th-17th, 2016: Conducted in conjunction with the DLG-Feldtage/DLG Field Days (pp. 72-78). University of Hohenheim, Technology in Crop Production.
3. Bernad, P., Zajc, A., Friš, R., Mlinarič, J., & Rakun, J. (2019). Farmbeast. In H. Floto & H. W. Griepentrog (Eds.), Proceedings of the 16th Field Robot Event 2018, Bernburg-Strenzfeld, Germany, June 12th-14th, 2018: Conducted in conjunction with the DLG-Feldtage/DLG Field Days (pp. 113-124). University of Hohenheim, Technology in Crop Production.
4. Bernad, P., Rihter, E., & Rakun, J. (2024). Cultivating interdisciplinary futures: Integrating robotics in agricultural STEM education. In K. Skala (Ed.), MIPRO 2024: 47th ICT and Electronics Convention: May 20-24, 2024, Opatija, Croatia: Mipro proceedings (pp. 1451-1455). Croatian Society for Information, Communication and Electronic Technology - MIPRO.
5. Bogue, R. (2019). Robots poised to transform the agricultural sector. Industrial Robot: An International Journal, 46(4), 519-526. https://doi.org/10.1108/IR-03-2019-0057
6. Bose, P. (2020, February 28). Precision Agricultural Robotics. AZoRobotics. https://www.azorobotics.com/Article.aspx?ArticleID=113
7. Euler, L. (1776). Nova methodus motum corporum rigidorum determinandi. Novi Commentarii academiae scientiarum Petropolitanae, 20, 208-238.
8. Field Robot Event 2023. (2023, March 10). Tasks for Field Robot Event 2023. https://www.fieldrobot.com
9. Hamilton, W. R. (1844). On quaternions; or on a new system of imaginaries in algebra. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 25(163), 10-13.
10. Kajbič, M. (2023). Snovanje modularnega avtonomnega kmetijskega robota [Master's thesis, University of Maribor]. Digital Library of the University of Maribor – DKUM.
11. Kuramin, D. (2023, March 15). Application of Random Sample Consensus for detection of planes in a point cloud. https://github.com/kuramin/Ransac_Plane_Detection
12. Rakun, J. (2022). Priložnosti in prednosti digitalno podprtega kmetijstva. In B. Potočnik (Ed.), ROSUS 2022: Računalniška obdelava slik in njena uporaba v Sloveniji 2022: Zbornik 16. strokovne konference (pp. 7-18). University of Maribor Press.
13. Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You Only Look Once: Unified, Real-Time Object Detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 779-788. https://doi.org/10.1109/CVPR.2016.91
14. ROS Documentation. (2023, March 27). smach/state_machine. http://docs.ros.org/en/jade/api/smach/html/python/smach/state_machine-pysrc.html
15. FRE2023 – first day. (2023, June 20). [Video]. YouTube. https://www.youtube.com/live/WWrXZsNrRiU?si=BzUk6EikBgyiCPtm
16. FRE2023 – second day. (2023, June 20). [Video]. YouTube. https://www.youtube.com/live/4y5sENg_AcE?si=Vhe6jBNBD24Y62DJ
17. FRE2023 – third day. (2023, June 20). [Video]. YouTube. https://www.youtube.com/live/UkWhcrm_iEg?si=Y1GdpU6o05yIY5SY

Avtonomni kmetijski robot FarmBeast: različica FRE2023

IZVLEČEK

Zahteve sodobnega kmetijstva narekujejo potrebo po visoko prilagodljivih avtonomnih robotskih sistemih. Kot enega izmed možnih odgovorov predstavljamo najnovejšo različico naprednega avtonomnega robota FarmBeast, ki omogoča avtonomno navigacijo in natančno delovanje v zahtevnih naravnih okoljih, kot so na primer koruzna polja. Članek opisuje tehnične specifikacije in funkcionalnosti robota FarmBeast, ki ga je razvila ekipa slovenskih študentov z Univerze v Mariboru z namenom sodelovanja na mednarodnem dogodku Field Robot Event (FRE) 2023. Izboljšana različica robota vključuje pomembne nadgradnje strojne in programske opreme, kot so povsem nova robotska platforma, uporaba večkanalnega LiDAR sistema, Xsens notranje merilne enote (IMU) in napredni algoritmi, ki omogočajo učinkovito navigacijo med vrstami ter funkcionalnosti za odstranjevanje plevela. Te integrirane tehnologije izboljšujejo učinkovitost in zanesljivost kmetijskih procesov, kar odraža širši trend digitalizacije in vse večjo uporabo tehnologij preciznega kmetijstva. Sodelovanje na mednarodnih tekmovanjih, kot je FRE, nudi pomembno platformo za študente, ki ob uporabi interdisciplinarnih znanj razvijajo praktične veščine ter razumejo medsebojno povezanost različnih znanstvenih disciplin. V članku so predstavljeni tudi rezultati evalvacije razvitih rešitev, kjer se je FarmBeast med 14 različnimi robotskimi sistemi odlično izkazal in se uvrstil med prvih pet ekip v disciplinah navigacije, obdelave rastlin in zaznavanja ovir, kar potrjuje njegove sposobnosti za uporabo v dinamičnih kmetijskih okoljih.

Ključne besede: precizno kmetijstvo, robotika, senzorji, algoritmi