
High-level robot programming based on CAD: dealing with unpredictable environments

1Pedro Neto, 1Nuno Mendes, 1Ricardo Araújo, 1J. Norberto Pires, 2A. Paulo Moreira

1 Department of Mechanical Engineering (CEMUC), University of Coimbra, Coimbra, Portugal
2 Institute for Systems and Computer Engineering of Porto (INESC-Porto), Porto, Portugal

Abstract:

Purpose – The purpose of this paper is to present a CAD-based human-robot interface


that allows non-expert users to teach a robot in a manner similar to that used by humans
to teach each other. Another important issue addressed here has to do with how robots
deal with uncertainty.

Design/methodology/approach – Intuitive robot programming is achieved by using


CAD drawings to generate robot programs off-line. Sensory feedback allows
minimization of the effects of uncertainty, providing information to adjust the robot
paths during robot operation.

Findings – It was found that it is possible to generate a robot program from a common
CAD drawing and run it without any major concerns about calibration or CAD model
accuracy.

Research limitations/implications – A limitation of the proposed system has to do


with the fact that it was designed to be used for particular technological applications.

Practical implications – Since most manufacturing companies today have CAD
packages in their facilities, CAD-based robot programming may be a good option to
program robots without the need for skilled robot programmers.

Originality/value – A new CAD-based robot programming system is proposed. Robot
programs are directly generated from a CAD drawing “running” on a commonly
available 3D CAD package, and not from commercial computer aided robotics
software, making it a simple CAD-integrated solution. This is a low-cost and low-setup-time
system where no advanced robot programming skills are required to operate it.

Keywords: CAD, Industrial Robotics, High-Level Programming, Sensory Feedback,


Unpredictable Environments.
1. Introduction

1.1. Motivation
Increasingly, companies are changing and reinventing their production systems.
Traditional manufacturing systems (often based on fixed automation and manual work)
are being replaced by flexible and intelligent manufacturing systems, enabling
companies to continue to be competitive in the global market (Kopacek, 1999). This
competitiveness is reflected in the companies’ capacity to respond/react quickly to
market demands, producing more and better quality products at competitive prices.
Owing to their flexibility, programmability and efficiency, industrial robots are seen as
a fundamental element of modern flexible manufacturing systems. Nevertheless, there
are still some problems that hinder the utilization of robots in industry, especially in
small and medium-sized enterprises (SMEs). SMEs have difficulty finding skilled
workers capable of operating with robots. Therefore, new and more intuitive ways for
people to interact with robots are required to make robot programming more accessible,
easier and faster. The goal is that the instructor can teach a robot in a manner similar to
that used by humans to teach each other, for example using CAD drawings, gestures or
through verbal explanation (Neto et al., 2010a).

1.2. Objectives
Robot programming through the typical teaching method (using the teach pendant) is a
tedious and time-consuming task that requires technical expertise. The goal is to
develop methodologies that help users to program a robot in an intuitive way, quickly,
with a high-level of abstraction from the robot specific language, and, if possible,
without spending too much money.
In this paper, a CAD-based system to program a robot from a 3D CAD drawing,
allowing users with basic skills in CAD and robot programming to generate robot
programs off-line, is presented. In addition, the 3D CAD package that interfaces with
the user, Autodesk Inventor, is a well-known generic CAD package, widespread on the
market at a relatively low cost. Starting from the CAD model of the robotic cell under study,
the way the user generates a robot program is as simple as “drawing” the desired robot
paths in the CAD environment. Later, the information needed is automatically extracted
from the CAD environment, analysed and converted into robot programs. Note that the
robot programs are extracted neither from computer aided manufacturing (CAM)
software nor from computer aided robotics (CAR) software. This means that we are
proposing a simple CAD-integrated solution for the robotics field.
Commercial CAR packages are powerful tools, which enable modelling, simulation
and robot programming. Nevertheless, they have some disadvantages that hinder their
use in companies, especially in SMEs. By comparing commercial CAR packages with a
CAD-based robot programming system similar to that presented in this paper (Neto et
al., 2010b), it was found that the CAD-based system has some relative advantages:

• Low-cost. Since the construction of CAD models and the robot programming task
are performed in the same environment/platform (Autodesk Inventor) the programming
task becomes easier and cheaper;
• Short learning curve;
• Simplicity of use. The most time consuming task, the construction of the CAD
model, is present in both systems.

CAD-based robot programming approaches work well if the environment of the
robot tasks is well defined. On the other hand, there are situations which are likely to
create errors or impede the normal operation of the robot:

• The CAD models do not correctly reproduce the geometry of the real scenario;
• Inaccuracies created in the robot calibration process;
• Inefficient fixtures that do not ensure the static character of the workpieces;
• A “foreign” object is introduced into the real environment.

In these cases, we can say that we are in the presence of a dynamic and unpredictable
environment.
To perform successful manipulation, robots depend on precise information about
objects in their surroundings. In an unpredictable environment, such information cannot
be given to the robot a priori; robots have to autonomously and continuously acquire
information about their surrounding environment to support their decision making and
react to unanticipated events. Sensory feedback allows a robot to recognize its work
environment for itself, for example producing corrections (on-line) to pre-programmed
robot paths (Figure 1). In fact, the integration of sensors into robotic platforms reduces
the setup time and the need for accurate robot trajectory programming, and promotes
flexibility and the autonomous behaviour of robotic systems (Bolmsjö and Olsson,
2005; Johansson et al., 2004).
Figure 1 (a) – planned path for a specific environment; (b) – a “foreign” object is introduced into the
environment and collision occurs; (c) – sensory feedback is introduced, helping the robot to deal with the
unpredictable environment (robot path is adjusted)

We validate our methods with two real-world experiments involving two different
tasks: seam tracking, and an application that requires the robot to follow a geometric
profile while maintaining a contact force.

2. Related Work

In recent years, CAD technology has become economically attractive and easy to work
with so that today millions of SMEs worldwide are using it to design and model their
products. As early as the 1980s, CAD was seen as a technology that could help in the
development of robotics (Bhanu, 1987). Since then, a variety of research has been
conducted in the field of CAD-based robot planning and programming.
Pires et al. (2004) propose extracting robot motion information from a CAD DXF
file and converting it into robot commands for welding purposes. A review of CAD-
based robot path planning for spray painting is presented by Chen et al. (2009). Another
study presents a method to generate 3D robot working paths for a robotic adhesive
spray system for shoe outsoles and uppers (Kim, 2004). Nagata et al. (2007) propose a
robotic sanding platform where the robot paths are generated by CAD/CAM software.
An example of a novel process that benefits from the robots and CAD versatility is the
so-called incremental forming process of metal sheets (Schaefer and Schraft, 2005).
Feng-yun and Tian-sheng (2005) present a robot path generator for the polishing
process, where the cutter location data is generated from the postprocessor of a CAD
system. As we have seen above, a variety of research has been done in the area of CAD-
based robot planning and programming. However, none of the studies so far provides
a “global” solution to this problem.
Unpredictable environments pose a significant challenge because of their complexity
and inherent uncertainty. Over the last few years, important studies have been carried
out to deal with uncertainty in the robotics field: using models of “ideal” environments,
sensory feedback, and implementing reasoning methods into robotic platforms
(Bruyninckx et al., 1991; Nayak and Ray, 1990). These concepts have evolved and,
recently, researchers have been successful in developing skills that can handle the
complexity of dynamic and unpredictable environments (Kenney et al., 2009; Mendes et
al., 2010). A number of authors have devoted attention to sensor simulation, trying to
mimic as closely as possible the behaviour of a real sensor, and thus integrating it (the
virtual sensor) within a CAR platform (Cederberg et al., 2002; Brink et al., 1997;
Bolmsjö and Olsson, 2005). Moreover, sensor information has been used to update
robotic cell models in real-time, making it possible to avoid problems such as collisions,
kinematic singularities and exceeded joint limits (Brink et al., 1997; Johansson et
al., 2004).
The concept of seam tracking applied to robotic welding has been studied over the
last two decades (Nayak and Ray, 1990). Recently, important work has been carried out
in the integration of sensors to assist the robotic arc welding process (Fridenfalk and
Bolmsjö, 2002; Bolmsjö and Olsson, 2005).

3. Robot Programming from CAD

Starting from a 3D CAD model of the robotic cell under study, the way the user generates a
robot program can be as simple as “drawing” the desired robot paths in the CAD
environment. Furthermore, to define the robot end-effector pose (position and
orientation), it is necessary to know not only the robot path positions but also the end-
effector orientations in space. Therefore, after drawing the robot paths, simplified tool
models should be placed along the paths. These models will define the orientation of the
robot end-effector in each segment of the path (Figure 2).

Figure 2 simplified tool models defining the end-effector orientation (frames n-1, n and n+1 along the
robot path)

The information needed to program the robot will be extracted from the CAD
environment by using an application programming interface (API) provided by
Autodesk. This API allows the extraction of the points that characterize each of the
different lines used to define a robot path: straight lines, splines and arcs. Moreover, the
API also gives information about the transformation matrix of each part model
represented in the CAD environment. The transformation matrix contains the rotation
matrix and the position of the origin of the part model to which it refers, both in relation
to the origin of the CAD assembly model. Later, the information extracted from the
CAD is converted into robot programs (Video 1, 2010). A diagram with the procedure
to extract 3D data from CAD and their conversion into a robot program is presented in
Figure 3.
Figure 3 extracting 3D data from CAD: open a 3D CAD drawing, draw the robot paths, define the robot
tool orientation, define the robot parameters, execute the interface software, select the robot paths,
extract the data (position and orientation) from CAD and generate the robot program

3.1. Application programming interface


The Autodesk Inventor API exposes Inventor’s functionalities in an object-oriented
manner, allowing developers to interact with Autodesk Inventor using current
programming languages: Visual Basic, Visual C#, Visual C++. In our proposed system,
a standalone application was used to extract information from the CAD, and the
Autodesk Apprentice Server was used to display the CAD models on the screen
(Figure 4). A flow chart, containing the method to automatically extract information
about a straight line drawn in CAD, is shown in Figure 5.

Figure 4 accessing the Autodesk Inventor API: a standalone application (EXE), add-ins (EXE/DLL) or
VBA interact with the Inventor application, and the Apprentice Server gives access to Inventor data


Figure 5 extracting data from CAD (straight line): select item(s); if the selected item is a SketchLine3D,
define a Line3D as the SketchLine3D and get the starting point and the endpoint of the Line3D
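
To make the flow of Figure 5 concrete, the fragment below sketches how a standalone application might read the endpoints of a selected 3D sketch line through COM automation from Python. This is an illustrative sketch only, not the authors’ interface software: it assumes the pywin32 package, and the StartPoint/EndPoint property names and the SelectSet traversal are assumptions that may differ from the actual Autodesk Inventor object model.

    import win32com.client  # pywin32, used here to drive Autodesk Inventor over COM

    def extract_line_endpoints(selected_item):
        """Mirror of the Figure 5 logic: if the selection behaves like a 3D sketch
        line, return its start and end points as (x, y, z) tuples, otherwise None.
        StartPoint/EndPoint with X, Y, Z members are assumed property names."""
        try:
            start = selected_item.StartPoint   # assumed: point object with X, Y, Z
            end = selected_item.EndPoint       # assumed: point object with X, Y, Z
            return (start.X, start.Y, start.Z), (end.X, end.Y, end.Z)
        except AttributeError:
            return None                        # selected item is not a SketchLine3D

    if __name__ == "__main__":
        # Hypothetical usage: attach to an Inventor session and walk the current
        # selection set, printing the endpoints of every selected straight line.
        app = win32com.client.Dispatch("Inventor.Application")
        for item in app.ActiveDocument.SelectSet:
            line = extract_line_endpoints(item)
            if line is not None:
                print("start:", line[0], "end:", line[1])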

3.2. Position and orientation in space


In order to generate a robot program off-line from a CAD environment and run it
in a real environment, the CAD cell should match the real one. In other
words, it is necessary to have all robot end-effector positions and orientations with
respect to one or more reference frames known a priori by the robot. These frames are
made known to the robot through a calibration process. Generally, this is a simple and
non-time consuming process where the user needs to define the frame(s) within the
CAD environment and then to teach the real robot about that frame(s)’ pose in the real
scenario (off-line to on-line mapping). When there are a significant number of frames to
define, the calibration process can be lengthy and prone to error.
The API gives all the information (transformation matrices and path lines data) with
respect to the origin of the CAD assembly model, the universe coordinate system {U}.
Considering that a frame {B} is defined relative to {U} during the calibration process,
from the API we have the transformation matrix of {B} relative to {U}, ${}^{U}_{B}T$. This
means that frame {B} “makes the link” between the virtual and real world. Note that, as
mentioned above, it is possible to define more than one frame if necessary, as the
process is similar.
Since Autodesk Inventor considers the robot path lines drawn as a constituent of a
single CAD part model (.ipt file) contained in the CAD assembly model (.iam file), the
transformation matrix (relative to {U}) of that single part model defines the pose of the
path lines. For the general case presented in Figure 6, the path line is part of the table
top model, whose origin and orientation are defined by frame {E}. However, it is not
necessary to know the orientation of the path lines, as the API gives all the necessary
points to define the path lines relative to {U}, for example the initial path point ${}^{U}P_{ini}$
(Figure 6). So it is necessary to obtain the path line points relative to frame {B}. In
terms of establishing the robot end-effector orientation, frames {C} and {D} help to
define the origin and orientation of the simplified tool models in Figure 6. As mentioned,
the API gives the transformation matrices of these models relative to {U}, ${}^{U}_{C}T$ and ${}^{U}_{D}T$.
However, for robot programming purposes we wish to express frames {C} and {D} in
terms of frame {B}, ${}^{B}_{C}T$ and ${}^{B}_{D}T$. For the case of ${}^{B}_{C}T$ we have:

$$ {}^{B}_{C}T = {}^{B}_{U}T \, {}^{U}_{C}T \qquad (1) $$

To find ${}^{B}_{U}T$, we must compute the rotation matrix that defines frame {U} relative to
{B}, ${}^{B}_{U}R$, and the vector that locates the origin of frame {U} relative to {B}, ${}^{B}P_{Uorg}$.
So, we know that:

$$ {}^{B}_{U}T = \begin{bmatrix} {}^{B}_{U}R & {}^{B}P_{Uorg} \\ 0 \;\; 0 \;\; 0 & 1 \end{bmatrix} \qquad (2) $$

Given the characteristics of a rotation matrix, ${}^{B}_{U}R = {}^{U}_{B}R^{T}$, and as we know ${}^{U}_{B}T$, the next
step is to calculate ${}^{B}P_{Uorg}$. Considering a generic vector/point defined in {U}, ${}^{U}P$, if
we wish to express this point in space in terms of frame {B} we must compute:

$$ {}^{B}P = {}^{B}_{U}R \, {}^{U}P + {}^{B}P_{Uorg} \qquad (3) $$

For the specific case of the initial path point in Figure 6, $P_{ini}$, since the API gives ${}^{U}P_{ini}$,
from (3) we can write $P_{ini}$ relative to {B}:

$$ {}^{B}P_{ini} = {}^{B}_{U}R \, {}^{U}P_{ini} + {}^{B}P_{Uorg} \qquad (4) $$

Rewriting (3) for the origin of frame {B}:

$$ {}^{B}P_{Borg} = {}^{B}_{U}R \, {}^{U}P_{Borg} + {}^{B}P_{Uorg} \qquad (5) $$

The left side of (5) must be zero, so, from (5) we have:

$$ {}^{B}P_{Uorg} = -\,{}^{B}_{U}R \, {}^{U}P_{Borg} = -\,{}^{U}_{B}R^{T} \, {}^{U}P_{Borg} \qquad (6) $$

From (2) and (6) we can write:

$$ {}^{B}_{U}T = \begin{bmatrix} {}^{U}_{B}R^{T} & -\,{}^{U}_{B}R^{T} \, {}^{U}P_{Borg} \\ 0 \;\; 0 \;\; 0 & 1 \end{bmatrix} \qquad (7) $$

Now, we can rewrite (1) and obtain ${}^{B}_{C}T$. The same methodology can be used to
obtain ${}^{B}_{D}T$ and any other transformation.

Figure 6 system frames {U}, {B}, {C}, {D} and {E}, the simplified tool model, the robot path, the initial
path point $P_{ini}$ and the vector ${}^{U}P_{Borg}$
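
The frame algebra in (1)-(7) reduces to inverting one homogeneous transformation and composing it with others. The following is a minimal, generic numpy sketch of these operations (not the authors’ implementation); the numeric values in the example are arbitrary.

    import numpy as np

    def invert_transform(T_U_B):
        """Given the 4x4 homogeneous transform of {B} relative to {U}, return the
        transform of {U} relative to {B}, as in equation (7)."""
        R = T_U_B[:3, :3]             # rotation of {B} relative to {U}
        p = T_U_B[:3, 3]              # origin of {B} expressed in {U}
        T_B_U = np.eye(4)
        T_B_U[:3, :3] = R.T           # rotation inverse = transpose
        T_B_U[:3, 3] = -R.T @ p       # equation (6)
        return T_B_U

    # Arbitrary example: {B} offset in {U}, a tool frame {C} in {U}, a path point in {U}
    T_U_B = np.eye(4); T_U_B[:3, 3] = [0.5, 0.2, 0.0]
    T_U_C = np.eye(4); T_U_C[:3, 3] = [0.9, 0.2, 0.3]
    T_B_U = invert_transform(T_U_B)
    T_B_C = T_B_U @ T_U_C                       # equation (1)
    P_ini_U = np.array([0.8, 0.2, 0.0, 1.0])    # initial path point in {U}, homogeneous
    P_ini_B = T_B_U @ P_ini_U                   # equations (3)-(4)
    print(T_B_C[:3, 3], P_ini_B[:3])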

3.3. Position and orientation interpolation


When an industrial robot is performing a pre-programmed movement that requires
abrupt end-effector orientation changes, special care must be taken, because the
end-effector orientation may no longer be fully under control. This is particularly true
when robot programs are generated off-line. The proposed solution to circumvent this
problem is based on the implementation of linear smooth interpolation of end-effector
positions and orientations (Feng-yun and Tian-sheng, 2005). The process involves the
following steps:

 Identification of risk areas (paths). This is done by analyzing the CAD model and
manually defining those areas in the drawing.
 Discretization of the risk path in equally spaced intervals.
 Calculation of end-effector orientations for each interpolated path point. The new
path is smoother than the initial (Figure 7).
Figure 7 (a) – end-effector pose before interpolation; (b) – end-effector pose after interpolation

Consider $r(k) = \left[ r_x(k) \; r_y(k) \; r_z(k) \right]^{T}$ a generic end-effector position generated at the
discrete time $k$ and defined in $\left[ P_j, P_{j+2} \right]$ (Figure 7). $P_j$, $P_{j+1}$ and $P_{j+2}$ are known end-
effector poses, extracted from the CAD drawing (see section 4.1.2). For the profile in
Figure 7 (possible area of risk) we will separate the interpolation into two sections, $S_1$ and
$S_2$; $S_1 \equiv \left[ P_j, P_{j+1} \right]$ and $S_2 \equiv \left[ P_{j+1}, P_{j+2} \right]$. The calculations are presented for section $S_1$,
but for the other sections the procedure is the same. So, $r(k)$ is calculated using both the
known data points from CAD ($P_j$, $P_{j+1}$) and the profiling velocity $v(k)$:

$$ v(k) = \left[ v_x(k) \; v_y(k) \; v_z(k) \right]^{T} \qquad (8) $$

It is assumed that the magnitude of $v(k)$, $\left\| v(k) \right\|$, is a constant. Considering
$r(k) \in \left[ P_j, P_{j+1} \right]$, a direction vector $W$ can be defined as:

$$ W = P_{j+1} - P_j \qquad (9) $$

From (8) and (9), each directional velocity profile is obtained by:

$$ v_i(k) = \left\| v(k) \right\| \cdot \frac{W_i}{\left\| W \right\|}, \quad (i = x, y, z) \qquad (10) $$

From (10), using a sampling width $\Delta t$, the interpolated position $r(k)$ is given by:

$$ r(0) = P_j^{T} = \left[ P_{j,x} \; P_{j,y} \; P_{j,z} \right] \qquad (11) $$

$$ r(n) = P_{j+1}^{T} = \left[ P_{j+1,x} \; P_{j+1,y} \; P_{j+1,z} \right] \qquad (12) $$

$$ r_i(k) = r_i(0) + v_i(k) \cdot k \cdot \Delta t, \quad (i = x, y, z), \; (k = 1, \ldots, n-1) \qquad (13) $$

Note that n represents the number of interpolated points.
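
Equations (9)-(13) simply sample the straight segment at constant speed. A minimal numpy sketch of this position interpolation is given below; the segment, speed and sampling width in the example are arbitrary assumptions.

    import numpy as np

    def interpolate_positions(P_j, P_j1, speed, dt):
        """Sample the segment [P_j, P_j1] at constant speed with sampling width dt,
        following equations (9)-(13)."""
        P_j, P_j1 = np.asarray(P_j, float), np.asarray(P_j1, float)
        W = P_j1 - P_j                                # direction vector, equation (9)
        v = speed * W / np.linalg.norm(W)             # directional velocities, equation (10)
        n = int(np.linalg.norm(W) / (speed * dt))     # number of interpolated points
        k = np.arange(0, n + 1).reshape(-1, 1)
        r = P_j + v * k * dt                          # equation (13), with r(0) = P_j
        r[-1] = P_j1                                  # pin the endpoint, equation (12)
        return r

    # Example: a 100 mm segment sampled at 10 mm/s every 0.2 s (assumed values)
    points = interpolate_positions([0, 0, 0], [100, 0, 0], speed=10.0, dt=0.2)
    print(len(points), points[0], points[-1])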


A quaternion interpolation algorithm (spherical linear interpolation – Slerp) was used to
smoothly interpolate a sequence of end-effector orientations. For the profile in
Figure 7 we will interpolate end-effector orientations between $P_j$ and $P_{j+2}$. Given two
known unit quaternions, $Q_0$ (from $P_j$) and $Q_n$ (from $P_{j+2}$), with parameter $k$ moving
from $1$ to $n-1$, the interpolated end-effector orientation $Q_k$ can be obtained as follows:

$$ Q_k = \frac{\sin\!\left(\left(1-\tfrac{k-1}{n-1}\right)\theta\right)}{\sin\theta}\, Q_0 + \frac{\sin\!\left(\tfrac{k-1}{n-1}\,\theta\right)}{\sin\theta}\, Q_n, \quad k = 1, \ldots, n-1 \qquad (14) $$

Where:

$$ \theta = \cos^{-1}\!\left(Q_0 \cdot Q_n\right) \qquad (15) $$

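For reference, (14)-(15) translate directly into code. The sketch below is a generic Slerp implementation in Python/numpy, not the authors’ implementation; quaternions are stored as plain 4-vectors, and the shorter-arc sign flip is an added safeguard that is not stated in the text.

    import numpy as np

    def slerp(Q0, Qn, n):
        """Return the interpolated orientations Q_k for k = 1..n-1, equation (14)."""
        Q0, Qn = np.asarray(Q0, float), np.asarray(Qn, float)
        dot = np.clip(np.dot(Q0, Qn), -1.0, 1.0)
        if dot < 0.0:                       # take the shorter arc (added safeguard)
            Qn, dot = -Qn, -dot
        theta = np.arccos(dot)              # equation (15)
        if theta < 1e-8:                    # nearly identical orientations
            return [Q0.copy() for _ in range(1, n)]
        out = []
        for k in range(1, n):
            t = (k - 1) / (n - 1)           # the (k-1)/(n-1) parameter of (14)
            Qk = (np.sin((1 - t) * theta) * Q0 + np.sin(t * theta) * Qn) / np.sin(theta)
            out.append(Qk / np.linalg.norm(Qk))
        return out

    # Example: interpolate towards a 90-degree rotation about Z (quaternions as w, x, y, z)
    for Q in slerp([1, 0, 0, 0], [np.cos(np.pi / 4), 0, 0, np.sin(np.pi / 4)], n=6):
        print(np.round(Q, 3))
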
3.4. Robot program generation


Using the information extracted from the CAD environment, the system presented here
is able to generate robot programs for specific robotic applications. The code generation
process is divided into two distinct phases:

 Definition and parameterization of robot positions/orientations, reference frames,


tools, etc. The end-effector positions and orientations extracted from CAD are used
to define the robot path target poses (16). When confronted with risk areas, the
interpolation algorithms automatically generate the appropriate end-effector poses
for these areas. From (3) we have the end-effector positions ${}^{B}P$; from (1) the
transformation matrix ${}^{B}_{C}T$, containing the rotation matrix, which in turn is used to
calculate the end-effector orientation in the form of quaternions or Euler angles; from
(13) the interpolated positions $r(k)$; and finally from (14) the interpolated
orientations (quaternions) $Q_k$.

$$ {}^{B}P = \{\underbrace{x, y, z}_{{}^{B}P \text{ and } r_i(k)},\ \underbrace{q1, q2, q3, q4}_{{}^{B}_{C}T \text{ and } Q_k}\} \qquad (16) $$

 Body of the program. A robot program contains predominantly robot motion


instructions (linear, joint, circular or spline robot movement). These movement
instructions are selected according to the type of lines used in the CAD drawing to
define the robot paths; an illustrative sketch of this step follows below.
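
As an illustration of this second phase, the sketch below emits ABB RAPID-style targets and linear motion instructions from a list of poses expressed relative to frame {B}. It is a generic template under assumed conventions, not the code produced by the actual interface; the module name, speed data (v100), zone data (z10) and tool name (tool0) are arbitrary placeholders.

    def emit_rapid_module(poses, module_name="CADPath"):
        """Build a minimal RAPID-style module: one robtarget per pose (x, y, z in mm
        plus a unit quaternion q1..q4, relative to the work object frame {B}),
        followed by linear MoveL instructions. Illustrative template only."""
        lines = [f"MODULE {module_name}"]
        for i, (x, y, z, q1, q2, q3, q4) in enumerate(poses, start=1):
            lines.append(
                f"  CONST robtarget p{i} := [[{x:.2f},{y:.2f},{z:.2f}],"
                f"[{q1:.5f},{q2:.5f},{q3:.5f},{q4:.5f}],"
                f"[0,0,0,0],[9E9,9E9,9E9,9E9,9E9,9E9]];"
            )
        lines.append("  PROC main()")
        for i in range(1, len(poses) + 1):
            lines.append(f"    MoveL p{i}, v100, z10, tool0;")   # linear movement
        lines.append("  ENDPROC")
        lines.append("ENDMODULE")
        return "\n".join(lines)

    # Example with two arbitrary target poses (x, y, z, q1, q2, q3, q4):
    print(emit_rapid_module([(500, 0, 300, 1, 0, 0, 0), (500, 100, 300, 1, 0, 0, 0)]))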

4. Experiments
Two different experiments are discussed, and in both cases, robot programs are
generated off-line from a CAD drawing. In the first experiment, seam tracking, robot
paths are adjusted with the information received from a laser camera attached to the
robot. In the second experiment, in which a robot follows a geometric profile while
maintaining a contact force, robot paths are adjusted with the information received from
a force/torque (F/T) sensor attached to the robot wrist.
To better visualize the robot path adjustments provided by sensory feedback, the
robotic space was forced to become a more “viewable” unpredictable environment by
purposely performing a rough calibration. Often, calibration errors arise from the
little time and attention devoted to the robot calibration process.

4.1. Seam tracking


4.1.1. Experimental setup
The experimental setup of the robotic platform (Figure 8) is the following:

 An industrial robot ABB IRB 2400 equipped with a S4C+/M2000 controller.


 A computer running Microsoft Windows XP.
 A laser camera DIGI-I/S from Servo Robot.

The computer is running a CAD package (Autodesk Inventor) and the developed
software interface, which receives data from CAD, interprets the data received and
generates robot programs. The robot can be remotely controlled and managed by the
software interface, which uses an ActiveX component named PcRob for this purpose. The laser
camera is connected to the robot controller via a serial port.

Figure 8 system architecture: welding machine, camera controller, and a computer running the 3D CAD
package and the software interface, which communicates with the robot through PcRob

4.1.2. CAD model


The CAD assembly model from which a robot program will be generated does not need
to accurately represent the real cell in all its aspects (Figure 9). On the contrary, it can
be a simplified model containing the “important” information. As an example, the robot
tool length, robot paths and relative positioning of CAD models should represent the
real scenario; however, the models’ appearance does not need to exactly match the
real objects.
For this particular experiment, the CAD assembly model should contain the
workpieces to be welded, the robot paths and the robot tools with the desired torch
orientation for each path segment. In terms of risk areas, there is only one abrupt tool
orientation change (Figure 9).

Figure 9 CAD assembly model of the workpieces to be welded (butt joint), with the risk path indicated.
Note: a robot program will be generated from this model

4.1.3. Path adjustment


Analyzing the incoming data from the laser camera, the implemented control system
decides which end-effector adjustments should be applied to the main paths extracted
from CAD. The system modus operandi is relatively simple:

1 Definition/calibration of the robot tool to match the robot reference frame.
2 The laser camera is configured with information about the welding joint and the
desired vertical and/or horizontal distances (tool standoff) that the torch must
maintain to the welding joint.
3 Features from the workpiece profile are extracted and matched against the predefined
joint templates and tolerances.
4 The automatic end-effector adjustment is achieved by a closed-loop position control
that compensates for the errors in the Y and Z directions (see the sketch below).
Correction data are acquired with a sample rate of 5 Hz.
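
The correction of step 4 can be pictured as a small control loop running at the 5 Hz correction rate: read the lateral (Y) and vertical (Z) deviation of the joint reported by the laser camera and command a bounded, quantized path offset. The sketch below is purely schematic, not the control code used in the experiments; the callbacks, proportional gain and step limit are hypothetical assumptions.

    import time

    def seam_tracking_loop(read_joint_error, send_path_offset, gain=0.5,
                           max_step_mm=1.0, rate_hz=5.0, resolution_mm=0.01):
        """Schematic closed-loop position correction in Y and Z.
        read_joint_error() -> (ey_mm, ez_mm): deviation measured by the laser camera.
        send_path_offset(dy_mm, dz_mm): apply an incremental path correction."""
        period = 1.0 / rate_hz
        while True:
            ey, ez = read_joint_error()
            # Proportional correction, clipped to a safe step and quantized to the
            # robot's path-adjustment resolution (0.01 mm, see section 4.1.4).
            dy = max(-max_step_mm, min(max_step_mm, gain * ey))
            dz = max(-max_step_mm, min(max_step_mm, gain * ez))
            send_path_offset(round(dy / resolution_mm) * resolution_mm,
                             round(dz / resolution_mm) * resolution_mm)
            time.sleep(period)               # 5 Hz correction rate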

4.1.4. Results and discussion


Results showed that the CAD-based robot programming system is easy to use and
within minutes an untrained user can generate a robot program for welding purposes.
However, in the real scenario (Figure 10) we have a dynamic environment where robot
path adjustments are required. Figure 11 shows the robot path adjustments/corrections
(in the Y direction) made by the robot during the seam tracking process (Video 2,
2010).
As the robot only allows path adjustments at a frequency of 5 Hz, for higher welding
speeds the path correction does not appear so smooth. Another limitation is the low
robot resolution (0.01 mm), making the path adjustment process more abrupt.

Figure 10 robotic cell

Figure 11 path adjustments in the Y direction (robot velocity 10 mm/s): corrected path (mm) versus
distance (mm), comparing the “ideal” situation (grey) with the out-of-line path (dashed)

4.2. Profile following


4.2.1. Experimental setup and features
The experimental setup of the robotic platform (Figure 12) is the following:

 An industrial robot Motoman HP6 equipped with the NX100 controller.


 A computer running Microsoft Windows XP.
 A six degrees of freedom (DOF) F/T sensor from JR3.
 A local area network (LAN), Ethernet and TCP/IP based, used for robot-computer
communication (100 Mbps).

The computer is running Autodesk Inventor and the developed software interface. This
interface generates robot programs from CAD and manages the force control system,
acquiring data from the F/T sensor and sending motion commands (adjustments) to the
robot. The software interface communicates with the robot using a software component
named MotomanLib. The ActiveX component JR3PCI is used to acquire force and
torque data from the F/T sensor. The robot pose is adjusted with a sample rate of 20 Hz.
As in the previous experiment, the robot program is generated from a CAD drawing
(Figure 13). The real work environment is an unpredictable environment due to the
“uncertainty” that comes from an inaccurate calibration process and due to the surface
roughness of the workpiece. The robot tool should follow a geometric profile while
maintaining a contact force. In order to facilitate the analysis of experimental results, a
ball-shaped tool was mounted on the robot’s end-effector.

Figure 12 system architecture: a computer running the 3D CAD package and the software interface,
which communicates with the robot through MotomanLib and with the F/T sensor through JR3PCI

Figure 13 CAD assembly model of the working profile.

4.2.2. Results and discussion


Regarding the generation of the robot program from a CAD drawing, this experiment
showed similar results to those of section 4.1.4. From the incoming data from the F/T
sensor, the implemented force and robot displacement control system (Fuzzy-PI and PI
reasoning) decides which displacements should be applied to the robot end-effector to
achieve satisfactory performance (Mendes et al., 2010; Video 3, 2010). The force
control system ensures that the contact forces are maintained at a constant value,
adjusting the pre-programmed robot paths extracted from CAD (Figure 14 and Figure
15). The graphs of Figure 14 show some force fluctuation due to the roughness of the
surface and the noise of F/T data.
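
To give an idea of how such a force loop operates, the sketch below shows a plain PI law that converts the axial force error into a small end-effector displacement at the 20 Hz adjustment rate. It is a generic PI sketch with assumed gains and limits, not the Fuzzy-PI controller of Mendes et al. (2010).

    class PIForceController:
        """Generic PI force-to-displacement controller (illustrative only): converts
        the axial force error (N) into an incremental displacement (mm)."""
        def __init__(self, kp=0.02, ki=0.005, dt=1.0 / 20.0, max_step_mm=0.5):
            self.kp, self.ki, self.dt, self.max_step = kp, ki, dt, max_step_mm
            self.integral = 0.0

        def update(self, force_setpoint, force_measured):
            error = force_setpoint - force_measured      # e.g. a -15 N set point
            self.integral += error * self.dt
            step = self.kp * error + self.ki * self.integral
            return max(-self.max_step, min(self.max_step, step))

    # Example: one 50 ms control step with a -15 N set point and -10 N measured
    controller = PIForceController()
    print(controller.update(force_setpoint=-15.0, force_measured=-10.0))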
Figure 14 experimental results by using a Fuzzy-PI controller (at left) and a PI controller (at right): axial
force (N, set point -15 N) and position adjustment (mm) over time (sec)

Figure 15 robot tool in contact with the real workpiece (axial force direction indicated)

4.3. Overall results


Some problems can occur when external sensors are used to adjust robot motion
on-line:

 Collisions between the external sensor and the surrounding workspace;


 Situations in which the robot arm is sent to a location outside of the robot working
area;
 Kinematic singularities;
 Poor choice of process parameters;
 The communications delay between the external sensor and the robot controller can
produce a negative effect on the proper definition of the robotic task.

In order to avoid the above mentioned problems the operator should ensure that the
workpieces are inside the working area of the robot, no collisions occur and kinematic
singularities are identified.
During effective robot operation, if a failure or malfunctioning is detected, two
different situations can be considered: task abortion or activation of a reactive task.
After aborting the process, the restarting of the system can be a complicated issue,
depending on the type of robotic task. For example, for an arc welding application,
restarting the system requires at least placing the torch at the point where the robot
stopped.

5. Conclusion and future work

A CAD-based robot programming system with capacity to deal with dynamic and
unpredictable environments was presented. Results showed that the proposed platform
opens new possibilities for intuitive robot programming. This means that an untrained
operator can generate a robot program for a specific task within minutes. Moreover,
since the construction of the CAD models and robot programming task are performed in
the same platform the entire robot programming process becomes easier and cheaper.
This is very important for SMEs that produce small batches of products and need to
constantly reprogram the robotic cells. In addition, sensory feedback enables the robot
to be more flexible when confronted with product changeover. By adding sensory
feedback to the robotic platforms we ensure that the robot can manoeuvre in an
unpredictable environment, damping possible impacts and increasing the tolerance to
positioning errors from the calibration process or from the construction of the CAD
models.
Future work will focus on the development of methodologies that facilitate sensor
integration in robotic platforms, especially when robots are programmed off-line.

References

Bhanu, B. (1987), “CAD-based robot vision”, IEEE Computer, Vol. 20 No. 8, pp. 12-
16.
Bolmsjö, G. and Olsson, M. (2005), “Sensors in robotic arc welding to support small
series production” Industrial Robot, Vol. 32 No. 4, pp. 341-5.
Brink, K., Olsson, M., and Bolmsjö, G. (1997), “Increased autonomy in industrial
robotic systems: a framework” Journal of Intelligent and Robotic Systems, Vol. 19,
pp. 357-73.
Bruyninckx, H., De Schutter, J. and Allotta, B. (1991), “Model-based constrained
motion: a. modelling, specification and control” IEEE 5th International Conference
on Advanced Robotics, Pisa, pp. 976-81.
Cederberg, P., Olsson, M. and Bolmsjö, G. (2002), “Virtual triangulation sensor
development, behavior simulation and CAR integration applied to robotic arc-
welding” Journal of Intelligent and Robotic Systems, Vol. 35, pp. 365-79.
Chen, H., Fuhlbrigge, T. and Li, X. (2009), “A review of CAD-based robot path
planning for spray painting”, Industrial Robot, Vol. 36 No. 1, pp. 45-50.
Feng-yun, L. and Tian-sheng, L. (2005), “Development of a robot system for complex
surfaces polishing based on CL data”, The International Journal of Advanced
Manufacturing Technology, Vol. 26, pp. 1132-7.
Fridenfalk, M., and Bolmsjö, G. (2002), “Design and validation of a sensor guided robot
control system for welding in shipbuilding” International Journal for the Joining of
Materials, Vol. 14 No. 3/4, pp. 44-55.
Johansson, R., Robertsson, A., Nilsson, K., Brogardh, T., Cederberg, P., Olsson, M.,
Olsson, T. and Bolmsjö, G. (2004), “Sensor integration in task-level programming
and industrial robotic task execution control” Industrial Robot, Vol. 31 No. 3, pp.
284-96.
Kenney, J., Buckley, T. and Brock, O. (2009), “Interactive Segmentation for
Manipulation in Unstructured Environments” IEEE International Conference on
Robotics and Automation, Kobe, Japan, pp. 1337-82.
Kim, J.Y. (2004), “CAD-based automated robot programming in adhesive spray
systems for shoe outsoles and uppers” Journal of Robotic Systems, Vol. 21 No. 11,
pp. 625-34.
Kopacek, P. (1999), “Intelligent manufacturing: present state and future trends” Journal
of Intelligent and Robotic Systems, Vol. 26, pp. 217-29.
Mendes, N., Neto, P., Pires, J.N. and Moreira, A.P. (2010), “Fuzzy-PI force control for
industrial robotics”, in Vadakkepat et al. (Ed.), Trends in Intelligent Robotics,
Springer-Verlag, Berlin Heidelberg, pp. 322-9.
Nagata, F., Kusumoto, Y., Fujimoto, Y. and Watanabe, K. (2007), “Robotic sanding
system for new designed furniture with free-formed surface”, Robotics and
Computer-Integrated Manufacturing, Vol. 23 No. 4, pp. 371-9.
Nayak, N. and Ray, A. (1990), “An integrated system for intelligent seam tracking in
robotic welding: part I – conceptual and analytical development” IEEE International
Conference on Robotics and Automation, pp. 1892-7.
Neto, P., Pires, J.N. and Moreira, A.P. (2010a), “High-level programming and control
for industrial robotics: using a hand-held accelerometer-based input device for
gesture and posture recognition” Industrial Robot, Vol. 37 No. 2, pp. 137-47.
Neto, P., Pires, J.N. and Moreira, A.P. (2010b), “CAD-based off-line robot
programming” IEEE International Conference on Robotics, Automation and
Mechatronics, Singapore, pp. 516-21.
Pires, J.N., Godinho, T. and Ferreira, P. (2004), “CAD interface for automatic robot
welding programming” Industrial Robot, Vol. 31 No. 1, pp. 71-6.
Schaefer, T. and Schraft, D. (2005), “Incremental sheet metal forming by industrial
robot” Rapid Prototyping Journal, Vol. 11 No. 5, pp. 278-86.
Video 1 (2010), “Robot program generation from CAD virtual paths”, available at:
http://robotics.dem.uc.pt/pedro.neto/GS3.html (accessed 15 December 2010)
Video 2 (2010), “Robot path adjustment – laser camera”, available at:
http://robotics.dem.uc.pt/pedro.neto/GS4.html (accessed 15 December 2010)
Video 3 (2010), “Robot path adjustment – force sensor”, available at:
http://robotics.dem.uc.pt/pedro.neto/GS5.html (accessed 15 December 2010)
