Robotic System: Revolutionizing Oyster Sorting

Introduction

The seafood industry has historically relied on manual methods for tasks such as sorting and packaging, particularly when handling oysters. While this approach provides certain benefits, it is also riddled with inefficiencies, contamination risks, and labor-intensive processes. As the global demand for seafood continues to rise, the need for automation becomes paramount. Enter the “Robotic System,” a state-of-the-art innovation designed to address these challenges. By integrating computer vision and advanced machine learning algorithms, this system automates complex tasks with precision, reliability, and speed.

This blog explores the significance of the robotic system in revolutionizing oyster sorting and delves into its technological framework, key components, dynamic features, and the transformative impact it promises for the seafood processing industry. By the end, you’ll understand why the future of food processing lies in automation.

Vision Module: The “Eye” of the Robotic System

Robotic System – High-Resolution Image Acquisition

At the core of the robotic system lies its vision module, which acts as the “eye” of the system. This module is equipped with a high-resolution RGB camera designed specifically for capturing dynamic images of oysters on a conveyor belt. The camera’s eye-to-hand configuration enhances precision, ensuring reliable object detection under varying conditions such as motion blur, lighting fluctuations, and occlusions.

With its lens parallel to the conveyor belt, the RGB camera maintains a consistent field of view, enabling accurate detection of oysters’ position and angle during movement. This setup ensures seamless integration between the vision system and robotic hardware, allowing the system to adapt swiftly in real-time operations.
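Because the lens sits parallel to the belt, a detected pixel can be mapped to planar robot-base coordinates with a simple scale-and-offset calibration. The sketch below illustrates that mapping; the scale factor and offset values are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical hand-eye calibration for a camera mounted parallel to the belt:
# a uniform scale (mm per pixel) plus a fixed offset from the image origin
# to the robot base frame. Both values are illustrative.
MM_PER_PX = 0.42                         # assumed scale factor
CAM_TO_BASE = np.array([120.0, -35.0])   # assumed offset (mm)

def pixel_to_base(u: float, v: float) -> np.ndarray:
    """Map an image pixel (u, v) to planar robot-base coordinates (x, y) in mm."""
    return np.array([u, v]) * MM_PER_PX + CAM_TO_BASE

p = pixel_to_base(640, 360)
```

In practice the transform would come from a full hand-eye calibration, but for a camera parallel to the belt it reduces to this planar form.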

YOLOv8-OBB and MobileNetV4 Integration

The YOLOv8-OBB algorithm adds a new dimension to object detection by incorporating oriented bounding boxes (OBB), which predict not only the position but also the rotational angle of oysters. This innovation addresses the limitations posed by traditional horizontal bounding boxes, ensuring grasping accuracy regardless of the oyster’s orientation.

Meanwhile, MobileNetV4 plays a crucial role in optimizing computational efficiency. By reducing the number of model parameters and minimizing computational load, MobileNetV4 ensures that the robotic system operates smoothly even in resource-constrained environments. Together, these technologies make the vision module the backbone of the robotic system, supporting accurate and efficient oyster sorting.
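An oriented bounding box is usually parameterized as a centre, side lengths, and a rotation angle. The small helper below, written for illustration rather than taken from any detector's API, expands that parameterization into the four corner points a gripper-alignment step would consume.

```python
import math

def obb_corners(cx, cy, w, h, theta):
    """Return the four corners of an oriented bounding box.

    (cx, cy) is the centre, (w, h) the side lengths, and theta the
    counter-clockwise rotation angle in radians.
    """
    c, s = math.cos(theta), math.sin(theta)
    corners = []
    for dx, dy in [(-w/2, -h/2), (w/2, -h/2), (w/2, h/2), (-w/2, h/2)]:
        # Rotate each half-extent offset, then translate by the centre.
        corners.append((cx + dx * c - dy * s, cy + dx * s + dy * c))
    return corners

# With theta = 0 the box degenerates to an ordinary axis-aligned rectangle.
box = obb_corners(100, 50, 40, 20, 0.0)
```

The extra angle parameter is exactly what horizontal bounding boxes lack, and why OBB detection suits elongated, arbitrarily rotated objects like oysters.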

Robotic System – Engineering the Robot Body

Robotic Arm and Gripper Design

The robotic system boasts two primary components: the AUBO-i3 robotic arm and a flexible pneumatic gripper. With six degrees of freedom, the AUBO-i3 arm facilitates intricate multi-directional movements across three-dimensional space. This flexibility allows the arm to reach oysters positioned at challenging angles and locations, ensuring precise and efficient grasping.

Attached to the arm is a specialized pneumatic gripper designed to handle oysters delicately yet securely. Its four-fingered design provides optimal grip, reducing the risk of slippage or damage. By incorporating a robust open-and-close mechanism controlled through pneumatic actuation, the gripper offers rapid responsiveness, further enhancing sorting efficiency.
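A pneumatic open-and-close gripper has a deliberately simple control surface: pressurize to close, vent to open. The sketch below models that interface; the valve-control callback is a stand-in for the real actuation hardware, which the source does not specify.

```python
class PneumaticGripper:
    """Minimal sketch of an open/close pneumatic gripper interface.

    `set_valve` is a hypothetical callback standing in for the real
    pneumatic valve driver: True pressurises (close), False vents (open).
    """
    def __init__(self, set_valve):
        self._set_valve = set_valve
        self.closed = False

    def close(self):
        self._set_valve(True)
        self.closed = True

    def open(self):
        self._set_valve(False)
        self.closed = False

# Record valve commands instead of driving real hardware.
states = []
g = PneumaticGripper(states.append)
g.close()
g.open()
```

Keeping the interface binary is what makes pneumatic actuation so responsive: there is no servo loop to settle, only a valve to toggle.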

Robotic System – Motion Planning with ROS Framework

Schematic diagram of the relationships between various parts of the robotic system.

The Robot Operating System (ROS) serves as the system’s nerve center, coordinating motion planning, hardware control, and communication between components. The ROS framework employs real-time data exchange to facilitate seamless trajectory planning, grasp point determination, and obstacle avoidance. Its distributed architecture ensures adaptability and stability, making the robotic system suitable for dynamic environments.
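ROS achieves this decoupling by letting nodes publish and subscribe to named topics rather than call each other directly. The toy in-process bus below only mirrors that pattern (real ROS nodes use rospy/rclpy publishers and subscribers over the network); the topic name and message fields are illustrative.

```python
from collections import defaultdict

class TopicBus:
    """Toy publish/subscribe bus illustrating the ROS communication pattern.

    This is an in-process stand-in, not the actual ROS API: nodes register
    callbacks on named topics and never reference each other directly.
    """
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

bus = TopicBus()
targets = []
# The "vision node" publishes grasp points; the "motion node" consumes them.
bus.subscribe("/oyster/grasp_point", targets.append)
bus.publish("/oyster/grasp_point", {"x": 388.8, "y": 116.2, "angle": 0.35})
```

Because the vision and motion components only share a topic name, either side can be swapped out or restarted without touching the other, which is the adaptability the distributed architecture buys.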

Robotic System – Object Detection and Grasping Accuracy

Dataset Creation and Augmentation

For reliable detection performance, a comprehensive oyster dataset was curated and augmented using advanced techniques such as geometric transformations, brightness adjustments, and noise addition. These measures expanded the dataset to include a wide range of orientations and conditions, enhancing the model’s adaptability to real-world scenarios.
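The augmentation types named above can be sketched in a few lines of NumPy. The specific parameter values (brightness gain, noise level) are assumptions for illustration, not the settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img):
    """Produce augmented variants of one grayscale image array:
    geometric transforms, a brightness adjustment, and additive
    Gaussian noise. Parameter values are illustrative."""
    return [
        np.rot90(img),                                         # geometric: rotation
        np.fliplr(img),                                        # geometric: flip
        np.clip(img * 1.2, 0, 255),                            # brightness adjustment
        np.clip(img + rng.normal(0, 10, img.shape), 0, 255),   # noise addition
    ]

variants = augment(np.full((8, 8), 100.0))
```

Applied across the raw dataset, transforms like these multiply the range of orientations and lighting conditions the detector sees during training.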

The YOLOv8-OBB algorithm was trained on this augmented dataset, achieving remarkable detection accuracy across various environments. By incorporating the oyster’s rotational angle into the detection process, the algorithm addresses the unique challenges posed by irregularly shaped aquatic products.

Static Grasping Success Rates

Static grasping tests validated the system’s efficiency, achieving a success rate of 95.54%. These tests involved stationary conveyor belts with varying oyster densities. Despite challenges posed by densely packed oysters, the robotic system consistently performed accurate detection and grasping, underscoring its reliability in static scenarios.

Dynamic Grasping Optimization

Trajectory Prediction with Kalman Filters

Dynamic oyster sorting introduces additional complexities, as the conveyor belt carries oysters at varying speeds. To tackle these challenges, the robotic system employs Kalman Filters (KF) combined with Low-Pass filters for trajectory prediction. These algorithms refine detection results by accounting for motion dynamics and suppressing high-frequency noise, ensuring stability and precision.

The KF predicts the oyster’s position by integrating historical data with real-time observations, while the Low-Pass filter smooths those predictions to reduce noise-induced errors. Together, they enhance the system’s ability to adapt to dynamic environments, making it an invaluable asset for automated seafood processing.
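The combination can be sketched as a one-dimensional constant-velocity Kalman filter whose position estimate is then passed through an exponential low-pass filter. The noise covariances and smoothing factor below are assumed values for illustration, not those tuned in the paper.

```python
import numpy as np

class BeltTracker:
    """1-D constant-velocity Kalman filter with an exponential low-pass
    on its output. Noise levels (q, r) and the smoothing factor alpha
    are illustrative assumptions."""
    def __init__(self, dt=0.1, q=1e-3, r=0.5, alpha=0.6):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition
        self.H = np.array([[1.0, 0.0]])             # observe position only
        self.Q = q * np.eye(2)                      # process noise
        self.R = np.array([[r]])                    # measurement noise
        self.x = np.zeros(2)                        # [position, velocity]
        self.P = np.eye(2)
        self.alpha = alpha
        self.smoothed = 0.0

    def step(self, z):
        # Predict from the motion model (historical data).
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the new detection z (real-time observation).
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ (np.array([z]) - self.H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        # Low-pass the position estimate to suppress high-frequency jitter.
        self.smoothed = self.alpha * self.smoothed + (1 - self.alpha) * self.x[0]
        return self.smoothed

tracker = BeltTracker()
# Noisy detections of an oyster advancing roughly one unit per frame.
estimates = [tracker.step(z) for z in [0.1, 1.05, 2.0, 2.9, 4.1]]
```

The filter's velocity state is what lets the arm aim ahead of the oyster's current detection rather than chase it.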

Results of Dynamic Grasping

Experimental tests showed promising results across multiple conveyor belt speeds. Success rates peaked at 84% at lower speeds and declined as belt speed increased, reflecting the added complexity of dynamic operations. These findings highlight areas for future optimization, particularly in motion-prediction algorithms and hardware responsiveness.

Implications for the Seafood Industry

The integration of robotic systems into seafood processing offers transformative benefits. By automating labor-intensive tasks, these systems reduce contamination risks, enhance operational productivity, and improve food safety standards. The economic advantages are equally significant, with reduced labor costs and increased throughput promising a high return on investment.

Beyond oysters, this technology has broader applications across various food industry sectors, including fish filleting, meat packaging, and vegetable sorting. Its scalability and adaptability make it a valuable tool for industries seeking to modernize and streamline their operations.

Challenges and Areas for Improvement

Despite its impressive capabilities, the robotic system faces certain limitations. Multi-object detection instability and dynamic grasping accuracy are among the key challenges. Addressing these issues requires enhancements in training datasets, hardware configurations, and prediction algorithms.

Conclusion

The introduction of computer vision-based robotic systems marks a turning point in seafood processing. By automating tasks that were traditionally labor-intensive, these systems redefine operational efficiency and food safety standards. Through innovations like YOLOv8-OBB and MobileNetV4, the robotic system addresses longstanding challenges in oyster sorting and packaging, setting new benchmarks for productivity.

With ongoing advancements in machine learning and robotics, the future of food processing lies in fully automated solutions capable of adapting to dynamic environments. By investing in such technologies, industries can ensure better-quality products, greater efficiency, and improved safety for consumers worldwide.

References

  • Hao-Ran Qu, Jue Wang, Lang-Rui Lei, Wen-Hao Su. “Computer Vision-Based Robotic System Framework for the Real-Time Identification and Grasping of Oysters.” Applied Sciences 2025, 15, 3971. DOI: 10.3390/app15073971

License

This blog is licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0). You are free to share and adapt this content, provided proper attribution is given. Learn more at https://creativecommons.org/licenses/by/4.0/.