Sensing and Decision Making


Team members: Prof. Lyudmila Mihaylova, Dr Peng Wang, Shenglin Wang, The University of Sheffield

The emergence of ‘collaborative robots’ promises to transform the manufacturing sector, enabling humans and robots to work together in shared spaces and physically interact to maximise the benefits of both manual and robotic processes. At present, operators often interact with robots across the boundary of a protective cage. CSI:Cobot contributes towards safe and reliable workspaces with secure human-robot interaction, and sensors and intelligent sensing play a key part in this process.

The safe sensing and decision making work package aims to identify the behaviours of the robot and the operator. The main focus is on developing a reliable visual monitoring system, composed of a camera system for perception and approaches for detection, classification and tracking of the robot's actions. The system is built on machine learning algorithms: a Mask Region-based Convolutional Neural Network (Mask R-CNN) module for object detection and decision making has been integrated with the sensor data. The structure of the Mask R-CNN module is shown in Fig. 1. The system monitors a collaborative workspace, using images collected from the camera to identify and track humans and robots in the operating space. The bounding boxes used to track entities identify potential collisions when objects overlap or come into close proximity in the image space. The bounding boxes are also transformed into the physical world for other potential applications. A separate ruleset for decision making determines how proximity information and bounding-box overlaps are interpreted.
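As an illustration of the detection step, the following minimal Python sketch obtains bounding boxes from a Mask R-CNN detector. It uses torchvision's off-the-shelf model pre-trained on COCO as a stand-in; the project's own module, trained to recognise the robot as well as the operator, would take its place, and the file name and score threshold below are illustrative assumptions.

# Minimal sketch: obtaining bounding boxes from a Mask R-CNN detector.
# The off-the-shelf COCO model is a stand-in for the project's own module.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(image_path, score_threshold=0.7):
    """Return bounding boxes [x1, y1, x2, y2], labels and scores for one frame."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]
    keep = output["scores"] > score_threshold
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]

# Example: boxes, labels, scores = detect("frame_0001.jpg")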

Figure 1. The Mask R-CNN framework for object detection and classification

Mask R-CNN is able to detect objects of interest (the operator and the robot), classify actions and indicate the areas where the objects are located via bounding boxes. 

Mounting the camera on the roof of the cell enables the operator and the robot to be detected in a horizontal two-dimensional space, which substantially simplifies the formulation of the criteria for safe collaborative operation.
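The image-to-world transformation can then be treated as a plane-to-plane mapping. The sketch below assumes the overhead camera views the cell floor as a plane, so a single homography suffices; the four reference points and their coordinates are placeholders rather than measurements from the actual cell, and OpenCV is used for the projective mapping.

# Minimal sketch: mapping image-space bounding boxes to floor coordinates.
# Reference points are placeholders, not measurements from the actual cell.
import numpy as np
import cv2

image_pts = np.float32([[102, 88], [1180, 95], [1175, 690], [110, 700]])   # pixels
world_pts = np.float32([[0.0, 0.0], [3.0, 0.0], [3.0, 2.0], [0.0, 2.0]])   # metres

H = cv2.getPerspectiveTransform(image_pts, world_pts)

def box_to_world(box):
    """Project the corners of an image-space box [x1, y1, x2, y2] onto the floor plane."""
    x1, y1, x2, y2 = box
    corners = np.float32([[[x1, y1]], [[x2, y1]], [[x2, y2]], [[x1, y2]]])
    return cv2.perspectiveTransform(corners, H).reshape(-1, 2)  # 4 corners as (X, Y) in metres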

The system detects the bounding boxes of the operator and the robot, and any overlap between the two (indicated by the red rectangles in Fig. 2), which is denoted OVERLAP. The coordinates of these bounding boxes are transformed from the image space to the physical world, so the areas of the bounding boxes and of the OVERLAP can be calculated. The safety criteria, sketched in code after Fig. 2, are then defined as:

Safe: If the area of OVERLAP is below the safe threshold S, as shown in Fig. 2(a);

Potential: If the area of OVERLAP is between the safe threshold S and the dangerous threshold D, as shown in Fig. 2(b);

Dangerous: If the area of OVERLAP is over the dangerous threshold D, as shown in Fig. 2(c).


Figure 2.  Safety criteria: (a) Safe; (b) Potential; (c) Dangerous. The red rectangles indicate the overlapping areas.
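The rule above can be expressed compactly as follows. This is a minimal sketch assuming axis-aligned bounding boxes already expressed in world coordinates (square metres); the threshold values S and D are illustrative, not the ones used in the cell.

# Minimal sketch of the Safe / Potential / Dangerous rule for two world-space boxes.
def overlap_area(box_a, box_b):
    """Area of the intersection of two boxes given as (x1, y1, x2, y2)."""
    width = min(box_a[2], box_b[2]) - max(box_a[0], box_b[0])
    height = min(box_a[3], box_b[3]) - max(box_a[1], box_b[1])
    return max(0.0, width) * max(0.0, height)

def safety_state(operator_box, robot_box, S=0.05, D=0.25):
    """Classify the interaction according to the OVERLAP area (square metres)."""
    overlap = overlap_area(operator_box, robot_box)
    if overlap < S:
        return "Safe"
    if overlap < D:
        return "Potential"
    return "Dangerous"

# Example: safety_state((0.2, 0.1, 1.0, 0.9), (0.8, 0.5, 1.8, 1.5)) returns "Potential"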

The developed system provides a “safety envelope” that aims to separate the robot and the operator and to avoid potential risks. Identifying hazardous behaviours, and making decisions upon them to either slow down the robot or warn the operator of potential risks, holds the promise of 1) ensuring safe collaborative operation and 2) increasing flexibility in the collaborative mode, for example through responsive collaboration. It is also a prerequisite for enabling robots to move and work safely alongside humans in open spaces, meeting the modern demands of mass customisation, higher product variability and quality expectations, and faster product cycles.

The intelligent sensing, detection and monitoring module has been integrated into the Digital Twin framework demonstrator work package and is closely linked with the other work packages of the project.

