High-Level Robot Programming Based On CAD: Dealing With Unpredictable Environments
Abstract:
Findings – It was found that it is possible to generate a robot program from a common
CAD drawing and run it without any major concerns about calibration or CAD model
accuracy.
Practical implications – Since today most of the manufacturing companies have CAD
packages in their facilities, CAD-based robot programming may be a good option to
program robots without the need for skilled robot programmers.
1. Introduction
1.1. Motivation
Increasingly, companies are changing and reinventing their production systems.
Traditional manufacturing systems (often based on fixed automation and manual work)
are being replaced by flexible and intelligent manufacturing systems, enabling
companies to continue to be competitive in the global market (Kopacek, 1999). This
competitiveness is reflected in the companies’ capacity to respond/react quickly to
market demands, producing more and better quality products at competitive prices.
Owing to their flexibility, programmability and efficiency, industrial robots are seen as
a fundamental element of modern flexible manufacturing systems. Nevertheless, there
are still some problems that hinder the utilization of robots in industry, especially in
small and medium-sized enterprises (SMEs). SMEs have difficulty finding skilled
workers capable of operating robots. Therefore, new and more intuitive ways for
people to interact with robots are required to make robot programming more accessible,
easier and faster. The goal is that the instructor can teach a robot in a manner similar to
that used by humans to teach each other, for example using CAD drawings, gestures or
through verbal explanation (Neto et al., 2010a).
1.2. Objectives
Robot programming through the typical teaching method (using the teach pendant) is a
tedious and time-consuming task that requires technical expertise. The goal is to
develop methodologies that help users to program a robot in an intuitive way, quickly,
with a high level of abstraction from the robot-specific language and, if possible,
without spending too much money.
In this paper, a CAD-based system to program a robot from a 3D CAD drawing,
allowing users with basic skills in CAD and robot programming to generate robot
programs off-line, is presented. In addition, the 3D CAD package that interfaces with
the user, Autodesk Inventor, is a well-known generic CAD package, widespread on the
market at a relatively low cost. Starting from the CAD model of the robotic cell under
study, generating a robot program is as simple as "drawing" the desired robot paths in
the CAD environment. Later, the information needed is automatically extracted from
the CAD environment, analysed and converted into robot programs. Note that the robot
programs are extracted neither from computer-aided manufacturing (CAM) software
nor from computer-aided robotics (CAR) software. This means that we are proposing a
simple CAD-integrated solution for the robotics field.
Commercial CAR packages are powerful tools, which enable modelling, simulation
and robot programming. Nevertheless, they have some disadvantages that hinder their
use in companies, especially in SMEs. By comparing commercial CAR packages with a
CAD-based robot programming system similar to that presented in this paper (Neto et
al., 2010b), it was found that the CAD-based system has some relative advantages:
• Low-cost. Since the construction of CAD models and the robot programming task
are performed in the same environment/platform (Autodesk Inventor) the programming
task becomes easier and cheaper;
• Short learning curve;
• Simplicity of use. The most time-consuming task, the construction of the CAD
model, is present in both systems.
However, several factors can compromise the execution of off-line generated robot
programs:
• The CAD models do not correctly reproduce the geometry of the real scenario;
• Inaccuracies created in the robot calibration process;
• Inefficient fixtures that do not ensure the static character of the workpieces;
• A "foreign" object is introduced into the real environment.
In these cases, we can say that we are in the presence of a dynamic and unpredictable
environment.
To perform successful manipulation, robots depend on precise information about the
objects in their surroundings. In an unpredictable environment, such information cannot
be given to the robot a priori; robots have to autonomously and continuously acquire
information about their surrounding environment to support their decision making and
react to unanticipated events. Sensory feedback allows a robot to recognize its work
environment for itself, for example producing corrections (on-line) in pre-programmed
robot paths (Figure 1). In fact, the integration of sensors into robotic platforms reduces
the setup time, the need for accurate robot trajectory programming and promotes
flexibility and the autonomous behaviour of robotic systems (Bolmsjö and Olsson,
2005; Johansson et al., 2004).
Figure 1 (a) – planned path for a specific environment; (b) – a "foreign" object is introduced into the
environment and a collision occurs; (c) – sensory feedback is introduced, helping the robot to deal with
the unpredictable environment (the robot path is adjusted)
We validate our methods with real-world experiments on two different tasks: seam
tracking, and an application that requires the robot to follow a geometric profile while
maintaining a contact force.
2. Related Work
In recent years, CAD technology has become economically attractive and easy to work
with so that today millions of SMEs worldwide are using it to design and model their
products. Already in the 80’s, CAD was seen as a technology that could help in the
development of robotics (Bhanu, 1987). Since then, a variety of research has been
conducted in the field of CAD-based robot planning and programming.
Pires et al. (2004) propose extracting robot motion information from a CAD DXF
file and converting it into robot commands for welding purposes. A review of CAD-
based robot path planning for spray painting is presented by Chen et al. (2009). Another
study presents a method to generate 3D robot working paths for a robotic adhesive
spray system for shoe outsoles and uppers (Kim, 2004). Nagata et al. (2007) propose a
robotic sanding platform where the robot paths are generated by CAD/CAM software.
An example of a novel process that benefits from the versatility of robots and CAD is
the so-called incremental forming of metal sheets (Schaefer and Schraft, 2005).
Feng-yun and Tian-sheng (2005) present a robot path generator for the polishing
process, where the cutter location data are generated from the post-processor of a CAD
system. However, none of the studies reported so far provides a "global" solution to
this problem.
Unpredictable environments pose a significant challenge because of their complexity
and inherent uncertainty. Over the last few years, important studies have been carried
out to deal with uncertainty in the robotics field: using models of “ideal” environments,
sensory feedback, and implementing reasoning methods into robotic platforms
(Bruyninckx et al., 1991; Nayak and Ray, 1990). These concepts have evolved and
recently, researchers have been successful in developing skills that can handle the
complexity of dynamic and unpredictable environments (Kenney et al., 2009; Mendes et
al., 2010). A number of authors have devoted attention to sensor simulation, trying to
mimic as closely as possible the behaviour of a real sensor, and thus integrating it (the
virtual sensor) within a CAR platform (Cederberg et al., 2002; Brink et al., 1997;
Bolmsjö and Olsson, 2005). Moreover, sensor information has been used to update
robotic cell models in real time, making it possible to avoid problems such as collisions,
kinematic singularities and the exceeding of joint limits (Brink et al., 1997; Johansson et
al., 2004).
The concept of seam tracking applied to robotic welding has been studied over the
last two decades (Nayak and Ray, 1990). Recently, important work has been carried out
in the integration of sensors to assist the robotic arc welding process (Fridenfalk and
Bolmsjö, 2002; Bolmsjö and Olsson, 2005).
3. CAD-Based Robot Programming
Starting from a 3D CAD model of the robotic cell under study, the way the user generates a
robot program can be as simple as “drawing” the desired robot paths in the CAD
environment. Furthermore, to define the robot end-effector pose (position and
orientation), it is necessary to know not only the robot path positions but also the end-
effector orientations in space. Therefore, after drawing the robot paths, simplified tool
models should be placed along the paths. These models will define the orientation of the
robot end-effector in each segment of the path (Figure 2).
The information needed to program the robot will be extracted from the CAD
environment by using an application programming interface (API) provided by
Autodesk. This API allows the extraction of the points that characterize each of the
different lines used to define a robot path: straight lines, splines and arcs. Moreover, the
API also gives information about the transformation matrix of each part model
represented in the CAD environment. The transformation matrix contains the rotation
matrix and the position of the origin of the part model to which it refers, both in relation
to the origin of the CAD assembly model. Later, the information extracted from the
CAD is converted into robot programs (Video 1, 2010). A diagram with the procedure
to extract 3D data from CAD and their conversion into a robot program is presented in
Figure 3.
Figure 3 Procedure for extracting 3D data from CAD and converting it into a robot program (flowchart
involving the Inventor application, an Add-In (DLL/EXE), VBA, the Apprentice Server and the Inventor
data; e.g. a Line3D is defined as a SketchLine3D)
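To make the data flow concrete, the sketch below shows one possible way of handling the extracted information: the points that characterize a drawn path, defined inside a part model, are expressed in the CAD assembly frame {U} using the part's transformation matrix. The function name and the numerical values are hypothetical stand-ins; the actual extraction is performed through the Autodesk API as described above.

```python
# Minimal sketch (assumed data structures): organizing information extracted
# from the CAD environment before converting it into robot targets.
import numpy as np

def path_points_in_assembly(points_in_part, part_transform):
    """Express path points drawn inside a part model in the CAD assembly
    frame {U}, using the part's 4x4 transformation matrix (rotation matrix
    plus origin position) reported by the CAD API."""
    targets = []
    for p in points_in_part:
        p_h = np.append(np.asarray(p, dtype=float), 1.0)  # homogeneous point
        targets.append((part_transform @ p_h)[:3])         # point in {U}
    return targets

# Hypothetical values standing in for API output (not real extracted data):
part_transform = np.array([[1.0, 0.0, 0.0, 250.0],
                           [0.0, 1.0, 0.0,   0.0],
                           [0.0, 0.0, 1.0,  80.0],
                           [0.0, 0.0, 0.0,   1.0]])
sketch_points = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (100.0, 50.0, 0.0)]
print(path_points_in_assembly(sketch_points, part_transform))
```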
The transformation ${}^{B}_{C}T$, which locates the path frame {C} relative to the robot base frame {B}, is obtained by composing known transformations:

$$ {}^{B}_{C}T = {}^{B}_{U}T \; {}^{U}_{C}T \quad (1) $$

To find ${}^{B}_{U}T$, we must compute the rotation matrix that defines frame {U} relative to {B}, ${}^{B}_{U}R$, and the vector that locates the origin of frame {U} relative to {B}, ${}^{B}P_{Uorg}$. So, we know that:

$$ {}^{B}_{U}T = \begin{bmatrix} {}^{B}_{U}R & {}^{B}P_{Uorg} \\ 0 \;\; 0 \;\; 0 & 1 \end{bmatrix} \quad (2) $$

Given the characteristics of a rotation matrix, ${}^{B}_{U}R = {}^{U}_{B}R^{T}$, and as we know ${}^{U}_{B}T$, the next step is to calculate ${}^{B}P_{Uorg}$. Considering a generic vector/point defined in {U}, ${}^{U}P$, if we wish to express this point in terms of frame {B} we must compute:

$$ {}^{B}P = {}^{B}_{U}R \, {}^{U}P + {}^{B}P_{Uorg} \quad (3) $$

For the specific case of the initial path point in Figure 6, $P_{ini}$, since the API gives ${}^{U}P_{ini}$, from (3) we can write $P_{ini}$ relative to {B}:

$$ {}^{B}P_{ini} = {}^{B}_{U}R \, {}^{U}P_{ini} + {}^{B}P_{Uorg} \quad (4) $$

Rewriting (3) for the origin of frame {B}, whose position in {U}, ${}^{U}P_{Borg}$, is given by ${}^{U}_{B}T$:

$$ {}^{B}P_{Borg} = {}^{B}_{U}R \, {}^{U}P_{Borg} + {}^{B}P_{Uorg} \quad (5) $$

The left side of (5) must be zero, so, from (5) we have:

$$ {}^{B}P_{Uorg} = -\,{}^{B}_{U}R \, {}^{U}P_{Borg} = -\,{}^{U}_{B}R^{T} \, {}^{U}P_{Borg} \quad (6) $$

Now, we can rewrite (1) and obtain ${}^{B}_{C}T$. The same methodology can be used to obtain ${}^{B}_{D}T$ and any other transformation.
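As a check on the algebra above, the following sketch computes ${}^{B}_{U}T$ from a part transformation ${}^{U}_{B}T$ reported by the CAD API and then composes it with ${}^{U}_{C}T$ to obtain ${}^{B}_{C}T$, in the spirit of (1)-(6). The numerical values are hypothetical placeholders, not data from the paper.

```python
# Sketch of equations (1)-(6): inverting the API-provided transform U_B_T
# and composing transforms to express the path frame {C} in the base {B}.
import numpy as np

def invert_transform(T):
    """Invert a homogeneous transform using R_inv = R^T and
    p_inv = -R^T p, as in equations (2) and (6)."""
    R = T[:3, :3]
    p = T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ p
    return T_inv

# Hypothetical transforms extracted from the CAD assembly:
U_B_T = np.array([[0.0, -1.0, 0.0, 500.0],   # robot base {B} in {U}
                  [1.0,  0.0, 0.0, 200.0],
                  [0.0,  0.0, 1.0,   0.0],
                  [0.0,  0.0, 0.0,   1.0]])
U_C_T = np.array([[1.0,  0.0, 0.0, 650.0],   # path frame {C} in {U}
                  [0.0,  1.0, 0.0, 300.0],
                  [0.0,  0.0, 1.0,  40.0],
                  [0.0,  0.0, 0.0,   1.0]])

B_U_T = invert_transform(U_B_T)   # equations (2) and (6)
B_C_T = B_U_T @ U_C_T             # equation (1)
print(B_C_T)
```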
Figure 6 CAD assembly model showing the reference frames {U}, {B}, {C}, {D} and {E}, the tool model,
the robot path, the initial path point P_ini and the origin of {B} expressed in {U} (P_Borg)
The interpolation procedure for a risk path comprises the following steps:
• Identification of risk areas (paths). This is done by analyzing the CAD model and
manually defining those areas in the drawing.
• Discretization of the risk path in equally spaced intervals.
• Calculation of end-effector orientations for each interpolated path point. The new
path is smoother than the initial one (Figure 7).
Figure 7 (a) – end-effector pose before interpolation; (b) – end-effector pose after interpolation (path
points P_j, P_j+1 and P_j+2)
Consider $r(k) = \left[\, r_x(k) \;\; r_y(k) \;\; r_z(k) \,\right]^{T}$ a generic end-effector position generated at the discrete time $k$ and defined in $\overline{P_j P_{j+2}}$ (Figure 7). $P_j$, $P_{j+1}$ and $P_{j+2}$ are known end-effector poses, extracted from the CAD drawing (see section 4.1.2). For the profile in Figure 7 (possible area of risk) we will separate the interpolation in two sections, $S_1$ and $S_2$; $S_1 \equiv \overline{P_j P_{j+1}}$ and $S_2 \equiv \overline{P_{j+1} P_{j+2}}$. The calculations are presented for section $S_1$ but for other sections the procedure is the same. So, $r(k)$ is calculated using both the known data points from CAD ($P_j$, $P_{j+1}$) and the profiling velocity $v(k)$:

$$ v(k) = \left[\, v_x(k) \;\; v_y(k) \;\; v_z(k) \,\right]^{T} \quad (8) $$

$$ W = P_{j+1} - P_j \quad (9) $$

From (8) and (9), each directional velocity profile is obtained by:

$$ v_i(k) = v(k)\, \frac{W_i}{\left\| W \right\|}, \quad (i = x, y, z) \quad (10) $$

From (10), using a sampling width $\Delta t$, the interpolated position $r(k)$ is given by:

$$ r_i(k) = r_i(0) + v_i(k)\, k\, \Delta t, \quad (i = x, y, z), \; (k = 1, \dots, n-1) \quad (13) $$
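A minimal sketch of the position interpolation in (9), (10) and (13) follows. The waypoints, the (constant) profiling velocity and the sampling width are hypothetical values chosen only for illustration.

```python
# Sketch of equations (9), (10) and (13): discretizing the segment P_j -> P_j+1
# into equally spaced interpolated positions r(k).
import numpy as np

def interpolate_positions(p_j, p_j1, v, dt):
    """Return interpolated positions r(k) along P_j -> P_j+1, spaced v*dt apart."""
    W = p_j1 - p_j                          # equation (9)
    length = np.linalg.norm(W)
    direction = W / length                  # W_i / ||W||, equation (10)
    n = int(length / (v * dt)) + 1          # number of equally spaced samples
    return np.array([p_j + v * direction * k * dt for k in range(n)])  # (13)

# Hypothetical example: 10 mm/s profiling velocity, 0.1 s sampling width.
p_j = np.array([0.0, 0.0, 0.0])
p_j1 = np.array([10.0, 0.0, 2.0])
print(interpolate_positions(p_j, p_j1, v=10.0, dt=0.1))
```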
The interpolated end-effector orientations are obtained by spherical linear interpolation between the known end orientations, expressed as quaternions $Q_0$ and $Q_n$:

$$ Q_k = \frac{\sin\!\left(\left(1-\frac{k-1}{n-1}\right)\theta\right)}{\sin\theta}\, Q_0 + \frac{\sin\!\left(\frac{k-1}{n-1}\,\theta\right)}{\sin\theta}\, Q_n, \quad k = 1, \dots, n-1 \quad (14) $$

Where:

$$ \theta = \cos^{-1}\!\left( Q_0 \cdot Q_n \right) \quad (15) $$

In summary, from the CAD data we calculate the end-effector orientations in the form of quaternions or Euler angles; from (13) the interpolated positions $r(k)$; and finally from (14) the interpolated orientations (quaternions) $Q_k$. Each robot target point is then defined by a position and an orientation quaternion:

$$ P = \left( x, y, z, q_1, q_2, q_3, q_4 \right) \quad (16) $$

where the position components come from ${}^{B}P$ and $r_i(k)$, and the quaternion components from ${}^{B}_{C}T$ and $Q_k$.
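For illustration, here is a minimal sketch of the orientation interpolation in (14) and (15), assembling each target in the position-plus-quaternion form of (16). The quaternions and indices are hypothetical; the sketch also guards against the degenerate case sin θ ≈ 0, which the equations above leave implicit.

```python
# Sketch of equations (14)-(16): spherical linear interpolation (slerp)
# between the end orientations Q0 and Qn, plus assembly of a robot target.
import numpy as np

def slerp(q0, qn, k, n):
    """Interpolated quaternion Q_k for k = 1..n-1, following (14)-(15)."""
    q0, qn = np.asarray(q0, float), np.asarray(qn, float)
    theta = np.arccos(np.clip(np.dot(q0, qn), -1.0, 1.0))   # equation (15)
    if np.isclose(np.sin(theta), 0.0):                      # nearly identical orientations
        return q0
    t = (k - 1) / (n - 1)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * qn) / np.sin(theta)

def robot_target(position, quaternion):
    """Target point in the (x, y, z, q1, q2, q3, q4) form of (16)."""
    return tuple(position) + tuple(quaternion)

# Hypothetical end orientations (unit quaternions) and a sample target:
Q0 = np.array([1.0, 0.0, 0.0, 0.0])
Qn = np.array([0.7071, 0.0, 0.7071, 0.0])
print(robot_target([100.0, 0.0, 50.0], slerp(Q0, Qn, k=3, n=10)))
```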
4. Experiments
Two different experiments are discussed, and in both cases, robot programs are
generated off-line from a CAD drawing. In the first experiment, seam tracking, robot
paths are adjusted with the information received from a laser camera attached to the
robot. In the second experiment, where the robot follows a geometric profile while
maintaining a contact force, the robot path is adjusted with the information received
from a force/torque (F/T) sensor attached to the robot wrist.
To better visualize the robot path adjustments provided by sensory feedback, the
robotic workspace was deliberately turned into a more "viewable" unpredictable
environment by performing only a rough calibration. In practice, calibration errors often
arise from the little time and attention devoted to the robot calibration process.
The computer is running a CAD package (Autodesk Inventor) and the developed
software interface, which receives data from CAD, interprets the data received and
generates robot programs. The robot can be remotely controlled and managed by the
software interface, which uses an ActiveX component named PcRob for this purpose.
The laser camera is connected to the robot controller via a serial port.
Figure – experimental setup for the seam-tracking experiment (3D CAD package, software
interface/PcRob, robot controller, welding machine and laser camera)
Figure 9 CAD assembly model of the workpieces to be welded (butt joint), with the risk path marked.
Note: a robot program will be generated from this model
1 Definition/calibration of the robot tool to match the robot reference frame.
2 The laser camera is configured with information about the welding joint and the
desired vertical and/or horizontal distances (tool standoff) that the torch must
maintain to the welding joint.
3 Features from the workpiece profile are extracted and matched against the predefined
joint templates and tolerances.
4 The automatic end-effector adjustment is achieved by a closed-loop position control
that compensates for the errors in the Y and Z directions. Correction data are
acquired at a sample rate of 5 Hz (a minimal sketch of this loop follows the list).
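The sketch below illustrates the kind of closed-loop correction described in step 4. It is only an assumed outline: `read_camera_offsets` and `jog_robot` are hypothetical stand-ins for the actual laser-camera and PcRob calls, whose real interfaces are not described here.

```python
# Assumed sketch of the 5 Hz closed-loop path correction in the Y and Z
# directions driven by laser-camera measurements (step 4 above).
import time

def seam_tracking_loop(read_camera_offsets, jog_robot, weld_active,
                       gain=0.5, period=0.2):
    """Periodically read the tracking error and command a proportional
    correction of the pre-programmed path (period = 0.2 s -> 5 Hz)."""
    while weld_active():
        error_y, error_z = read_camera_offsets()         # deviation from the joint
        jog_robot(dy=gain * error_y, dz=gain * error_z)  # on-line path adjustment
        time.sleep(period)
```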
Figure – corrected path (mm) versus distance along the seam (mm)
The computer is running Autodesk Inventor and the developed software interface. This
interface generates robot programs from CAD and manages the force control system,
acquiring data from the F/T sensor and sending motion commands (adjustments) to the
robot. The software interface communicates with the robot using a software component
named MotomanLib. The ActiveX component JR3PCI is used to acquire force and
torque data from the F/T sensor. The robot pose is adjusted at a sample rate of 20 Hz
(a sketch of this control loop is given below).
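As an illustration only, the following sketch shows a force-control loop of the kind described above, running at 20 Hz. The helpers `read_force` and `send_pose_adjustment`, the PI gains and the target force are hypothetical; they do not reflect the actual MotomanLib or JR3PCI interfaces.

```python
# Assumed sketch of a 20 Hz contact-force control loop: a PI controller
# adjusts the robot pose along the tool axis to keep the desired force.
import time

def force_control_loop(read_force, send_pose_adjustment, running,
                       target_force=-20.0, kp=0.02, ki=0.005, period=0.05):
    """Keep the measured axial force near target_force (period = 0.05 s -> 20 Hz)."""
    integral = 0.0
    while running():
        error = target_force - read_force()   # axial force error (N)
        integral += error * period
        dz = kp * error + ki * integral       # PI correction (mm) along the tool axis
        send_pose_adjustment(dz)              # on-line adjustment of the robot pose
        time.sleep(period)
```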
As in the previous experiment, the robot program is generated from a CAD drawing
(Figure 13). The real work environment is unpredictable owing to the "uncertainty"
introduced by an inaccurate calibration process and by the surface roughness of the
workpiece. The robot tool should follow a geometric profile while
maintaining a contact force. In order to facilitate the analysis of experimental results, a
ball-shaped tool was mounted on the robot’s end-effector.
Figure – experimental setup for the force-control experiment (3D CAD package, software interface,
MotomanLib and JR3PCI components)
Figure 14 Experimental results using a Fuzzy-PI controller (left) and a PI controller (right): axial force
(N) and position (mm) versus time (sec)
In order to avoid the above-mentioned problems, the operator should ensure that the
workpieces are inside the working area of the robot, that no collisions occur and that
kinematic singularities are identified.
During effective robot operation, if a failure or malfunctioning is detected, two
different situations can be considered: task abortion or activation of a reactive task.
After aborting the process, the restarting of the system can be a complicated issue,
depending on the type of robotic task. For example, for an arc welding application,
restarting the system requires at least placing the torch at the point where the robot
stopped.
A CAD-based robot programming system with the capacity to deal with dynamic and
unpredictable environments was presented. Results showed that the proposed platform
opens new possibilities for intuitive robot programming. This means that an untrained
operator can generate a robot program for a specific task within minutes. Moreover,
since the construction of the CAD models and the robot programming task are performed
in the same platform, the entire robot programming process becomes easier and cheaper.
This is very important for SMEs that produce small batches of products and need to
constantly reprogram the robotic cells. In addition, sensory feedback enables the robot
to be more flexible when confronted with product changeover. By adding sensory
feedback to the robotic platform, we enable the robot to manoeuvre in an unpredictable
environment, damping possible impacts and increasing the tolerance to positioning
errors arising from the calibration process or from the construction of the CAD
models.
Future work will proceed with the development of methodologies that facilitate
sensor integration in robotic platforms, especially when robots are programmed
off-line.
References
Bhanu, B. (1987), “CAD-based robot vision”, IEEE Computer, Vol. 20 No. 8, pp. 12-
16.
Bolmsjö, G. and Olsson, M. (2005), "Sensors in robotic arc welding to support small
series production", Industrial Robot, Vol. 32 No. 4, pp. 341-5.
Brink, K., Olsson, M. and Bolmsjö, G. (1997), "Increased autonomy in industrial
robotic systems: a framework", Journal of Intelligent and Robotic Systems, Vol. 19,
pp. 357-73.
Bruyninckx, H., De Schutter, J. and Allotta, B. (1991), "Model-based constrained
motion: a. modelling, specification and control", IEEE 5th International Conference
on Advanced Robotics, Pisa, pp. 976-81.
Cederberg, P., Olsson, M. and Bolmsjö, G. (2002), "Virtual triangulation sensor
development, behavior simulation and CAR integration applied to robotic arc-
welding", Journal of Intelligent and Robotic Systems, Vol. 35, pp. 365-79.
Chen, H., Fuhlbrigge, T. and Li, X. (2009), "A review of CAD-based robot path
planning for spray painting", Industrial Robot, Vol. 36 No. 1, pp. 45-50.
Feng-yun, L. and Tian-sheng, L. (2005), "Development of a robot system for complex
surfaces polishing based on CL data", The International Journal of Advanced
Manufacturing Technology, Vol. 26, pp. 1132-7.
Fridenfalk, M. and Bolmsjö, G. (2002), "Design and validation of a sensor guided robot
control system for welding in shipbuilding", International Journal for the Joining of
Materials, Vol. 14 No. 3/4, pp. 44-55.
Johansson, R., Robertsson, A., Nilsson, K., Brogardh, T., Cederberg, P., Olsson, M.,
Olsson, T. and Bolmsjö, G. (2004), "Sensor integration in task-level programming
and industrial robotic task execution control", Industrial Robot, Vol. 31 No. 3,
pp. 284-96.
Kenney, J., Buckley, T. and Brock, O. (2009), "Interactive segmentation for
manipulation in unstructured environments", IEEE International Conference on
Robotics and Automation, Kobe, Japan, pp. 1337-82.
Kim, J.Y. (2004), "CAD-based automated robot programming in adhesive spray
systems for shoe outsoles and uppers", Journal of Robotic Systems, Vol. 21 No. 11,
pp. 625-34.
Kopacek, P. (1999), "Intelligent manufacturing: present state and future trends",
Journal of Intelligent and Robotic Systems, Vol. 26, pp. 217-29.
Mendes, N., Neto, P., Pires, J.N. and Moreira, A.P. (2010), "Fuzzy-PI force control for
industrial robotics", in Vadakkepat et al. (Ed.), Trends in Intelligent Robotics,
Springer-Verlag, Berlin Heidelberg, pp. 322-9.
Nagata, F., Kusumoto, Y., Fujimoto, Y. and Watanabe, K. (2007), "Robotic sanding
system for new designed furniture with free-formed surface", Robotics and
Computer-Integrated Manufacturing, Vol. 23 No. 4, pp. 371-9.
Nayak, N. and Ray, A. (1990), "An integrated system for intelligent seam tracking in
robotic welding: part I – conceptual and analytical development", IEEE International
Conference on Robotics and Automation, pp. 1892-7.
Neto, P., Pires, J.N. and Moreira, A.P. (2010a), "High-level programming and control
for industrial robotics: using a hand-held accelerometer-based input device for
gesture and posture recognition", Industrial Robot, Vol. 37 No. 2, pp. 137-47.
Neto, P., Pires, J.N. and Moreira, A.P. (2010b), "CAD-based off-line robot
programming", IEEE International Conference on Robotics, Automation and
Mechatronics, Singapore, pp. 516-21.
Pires, J.N., Godinho, T. and Ferreira, P. (2004), "CAD interface for automatic robot
welding programming", Industrial Robot, Vol. 31 No. 1, pp. 71-6.
Schaefer, T. and Schraft, D. (2005), "Incremental sheet metal forming by industrial
robot", Rapid Prototyping Journal, Vol. 11 No. 5, pp. 278-86.
Video 1 (2010), “Robot program generation from CAD virtual paths”, available at:
http://robotics.dem.uc.pt/pedro.neto/GS3.html (accessed 15 December 2010)
Video 2 (2010), “Robot path adjustment – laser camera”, available at:
http://robotics.dem.uc.pt/pedro.neto/GS4.html (accessed 15 December 2010)
Video 3 (2010), “Robot path adjustment – force sensor”, available at:
http://robotics.dem.uc.pt/pedro.neto/GS5.html (accessed 15 December 2010)