
ASTM INTERNATIONAL

Selected Technical Papers

Autonomous
Industrial Vehicles:
From the Laboratory to the
Factory Floor
STP1594
Editors:
Roger Bostelman
Elena Messina

www.astm.org

Selected Technical Papers
STP1594

Editors: Roger Bostelman, Elena Messina

Autonomous Industrial
Vehicles: From the
Laboratory to the
Factory Floor
ASTM Stock #1594
DOI: 10.1520/STP1594-EB

ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959.
Printed in the U.S.A.

Library of Congress Cataloging-in-Publication Data

Names: Bostelman, Roger, editor. | Messina, E. R. (Elena R.), editor. | ASTM International.
Title: Autonomous industrial vehicles : from the laboratory to the factory floor / editors, Roger Bostelman, Elena Messina.
Description: West Conshohocken, PA : ASTM International, [2016] | Series: Selected technical papers ; STP1594 | Papers presented at a workshop held May 26-30, 2015, in Seattle, Washington, USA. | “ASTM Stock #STP1594.” | “DOI: 10.1520/STP1594-EB.” | Includes bibliographical references.
Identifiers: LCCN 2015050464 (print) | LCCN 2015051463 (ebook) | ISBN 9780803176331 (pbk.) | ISBN 9780803176348 ()
Subjects: LCSH: Autonomous vehicles—Congresses. | Motor vehicles—Automatic control—Congresses. | Intelligent control systems—Congresses.
Classification: LCC TL152.8 .A87 2016 (print) | LCC TL152.8 (ebook) | DDC 629.04/6--dc23
LC record available at http://lccn.loc.gov/2015050464

Copyright © 2016 ASTM INTERNATIONAL, West Conshohocken, PA. All rights reserved. This material
may not be reproduced or copied, in whole or in part, in any printed, mechanical, electronic, film, or other
distribution and storage media, without the written consent of the publisher.

Photocopy Rights
Authorization to photocopy items for internal, personal, or educational classroom use, or the internal,
personal, or educational classroom use of specific clients, is granted by ASTM International provided that
the appropriate fee is paid to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923,
Tel: (978) 646-2600; http://www.copyright.com/
The Society is not responsible, as a body, for the statements and opinions expressed in this publication.
ASTM International does not endorse any products represented in this publication.

Peer Review Policy


Each paper published in this volume was evaluated by two peer reviewers and at least one editor. The
authors addressed all of the reviewers’ comments to the satisfaction of both the technical editor(s) and
the ASTM International Committee on Publications.
The quality of the papers in this publication reflects not only the obvious efforts of the authors and the
technical editor(s), but also the work of the peer reviewers. In keeping with long-standing publication
practices, ASTM International maintains the anonymity of the peer reviewers. The ASTM International
Committee on Publications acknowledges with appreciation their dedication and contribution of time
and effort on behalf of ASTM International.

Citation of Papers
When citing papers from this publication, the appropriate citation includes the paper authors, “paper title,”
STP title, STP number, book editor(s), ASTM International, West Conshohocken, PA, year, page range,
paper DOI listed in the footnote of the paper. A citation is provided on page one of each paper.

Printed in Bay Shore, NY


April, 2016

Foreword
THIS COMPILATION OF Selected Technical Papers, STP1594, Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, contains peer-reviewed papers that were presented at a workshop held May 26–30, 2015, in Seattle, Washington, USA. The workshop was sponsored by ASTM International Committee F45 on Driverless Automatic Guided Industrial Vehicles.

Workshop Chairpersons and STP Editors:

Roger Bostelman
Elena Messina
National Institute of Standards and Technology
Gaithersburg, MD, USA

Contents

Overview vii

Acknowledgments xi

Towards Development of an Automated Guided Vehicle Intelligence Level Performance Standard 1
Roger Bostelman and Elena Messina

Preliminary Development of a Test Method for Obstacle Detection and Avoidance in Industrial Environments 23
Adam Norton and Holly Yanco

3D Sensors on Driverless Trucks for Detection of Overhanging Objects in the Pathway 41
Klas Hedenberg and Björn Åstrand

Multi-AGV Systems in Shared Industrial Environments: Advanced Sensing and Control Techniques for Enhanced Safety and Improved Efficiency 57
Lorenzo Sabattini, Elena Cardarelli, Valerio Digani, Cristian Secchi, and Cesare Fantuzzi

The Safety-to-Autonomy Curve: An Incremental Approach to Introducing Automation to the Workforce 82
Daniel Theobald and Frederik Heger

Dynamic Metrology Performance Measurement of a Six Degrees-of-Freedom Tracking System Used in Smart Manufacturing 91
Roger Bostelman, Joseph Falco, Mili Shah, and Tsai Hong Hong

Harmonization of Research and Development Activities Toward Standardization in the Automated Warehousing Systems 106
Zdenko Kovačić, Michael Butler, Paolo Lista, Goran Vasiljević, Ivica Draganjac, Damjan Miklić, Tamara Petrović, and Frano Petric

Recommendations for Autonomous Industrial Vehicle Performance Standards 129
Roger Bostelman

Overview

Automatic guided vehicles (AGVs) were one of the earliest applications for mobile robots. The first AGVs were deployed in the 1950s to transport materials in large facilities and warehouses. Mobile robot capabilities have advanced significantly in the past decades. This progress is due in large part to researchers at technical universities who have made tremendous strides in applying computer control and sensors to mobile platforms for uses in applications such as manufacturing, health care, military, and emergency response. As industrial vehicles gained more capabilities, the “A” in AGV began to transition from “automatic” to “automated” in informal usage. This mirrors the progress in guided vehicles in areas such as safety sensing and reacting. Further advancements in mobile robotics, such as in more general-purpose sensing, planning, communications, and control, are paving the way for an era where the “A” stands for “autonomous.” This evolution in onboard intelligence has greatly expanded the potential scope of applications for AGVs and thus raised the need for standard means of measuring performance.
A new committee was formed under ASTM International to develop these missing standards for measuring, describing, and characterizing performance for this new breed of AGVs. ASTM’s Committee F45 on “Driverless Automatic Guided Industrial Vehicles” (http://www.astm.org/COMMITTEE/F45.htm) is scoped to include standardized nomenclature and definitions of terms, recommended practices, guides, test methods, specifications, and performance standards for AGVs. These new performance standards will complement the ongoing work in AGV safety standards by the Industrial Truck Standards Development Foundation [1], the British Standards Institution [2], and others. F45 is addressing areas that are important for potential AGV users to understand when making purchase and task application decisions. Therefore, the committee is divided into five technical subcommittees that focus on the key areas of interest for the community:
F45.01 Environmental Effects
F45.02 Docking and Navigation
F45.03 Object Detection and Protection
F45.04 Communication and Integration
F45.91 Terminology
The first event organized by the ASTM F45 Committee was a workshop intended to foster communication between researchers and practitioners and was held at the

2015 Institute of Electrical and Electronic Engineers International Conference on
Robotics and Automation (ICRA). Organized by Roger Bostelman of the National
Institute of Standards and Technology and Pat Picariello from ASTM International,
the workshop “Autonomous Industrial Vehicles: From the Laboratory to the Factory
Floor” solicited researcher input for the development of consensus standards and
sought to educate researchers about a standards-based mechanism for rapid technol-
ogy transfer from the laboratory to industry.
This book comprises expanded versions of selected papers presented at the ICRA workshop. The workshop and this book feature perspectives from related standards efforts, industry needs, and cutting-edge research. The first chapter, “Towards Development of an Automated Guided Vehicle Intelligence Level Performance Standard” by Bostelman and Messina, sets the stage by reviewing standards development for other mobile robot application domains, such as emergency response, and suggests approaches for tackling performance measurement for intelligent AGVs. The authors discuss examples of performance standards that could be used for vehicle navigation performance and for perception systems (which would be key components of intelligent vehicles).

Norton and Yanco’s chapter, entitled “Preliminary Development of a Test Method for Obstacle Detection and Avoidance in Industrial Environments,” builds on the first chapter by documenting the process for developing a test method. Their process starts with building an understanding of the deployment environment through the development of a taxonomy of relevant obstacles. The key characteristics are abstracted to create reconfigurable artifacts for conducting tests that are representative of robot tasks. Statistical significance of performance data and other key aspects necessary for successful test methods are also considered.
One of the challenges of deploying AGVs in unstructured facilities is the possibility of obstacles appearing not just on the ground but also above the floor. To broaden obstacle detection capabilities, Hedenberg and Åstrand implemented time-of-flight and structured light sensors on an unmanned vehicle and conducted several experiments to characterize the performance of this sensing combination in the laboratory and in an industrial setting. The results of their experiments are presented in “3D Sensors on Driverless Trucks for Detection of Overhanging Objects in the Pathway,” which discusses the implications of using these sensors in real-world settings.
In the chapter “Multi-AGV Systems in Shared Industrial Environments: Advanced Sensing and Control Techniques for Enhanced Safety and Improved Efficiency,” Sabattini et al. tackle the complexities of multiple AGVs operating in unstructured environments. They do so through fusion of sensor data from the different vehicles. The fused data produces a global environment representation that is updated in real time and is used for assigning missions to AGVs and supporting path planning and obstacle avoidance.

Theobald and Heger’s chapter considers the transition from research capabilities to implementations in industry from an incremental perspective. Their chapter, entitled “The Safety-to-Autonomy Curve: An Incremental Approach to Introducing Automation to the Workforce,” proposes gradual implementation of automation for robotic systems. Starting with the deployment of the necessary safety systems, which include sensing and supporting algorithms, the authors advocate leveraging the sensor data from the safety systems to accumulate information and knowledge about the environment and humans. Thus, the robots learn how to navigate and behave on an ongoing basis, building confidence in the industry to allow incremental adoption.
The criticality of robust sensing to enable advanced performance and safety for AGVs heightens the importance of measuring how well a sensor system performs. Performance test methods must have a basis for comparison to a reference—or ground truth—system that is typically ten times better than the system under test. The chapter by Bostelman et al., “Dynamic Metrology Performance Measurement of a Six Degrees-of-Freedom Tracking System Used in Smart Manufacturing,” describes a method for evaluating the accuracy of a potential ground truth system.
The chapter “Harmonization of Research and Development Activities Toward Standardization in the Automated Warehousing Systems” by Kovačić et al. highlights the role of standards in bridging research and commercialization. Their work describes a European Commission project in advancing automated warehousing through a set of freely navigating AGVs in large-scale facilities. The authors discuss performance standards and benchmarks that can enable technology transfer from the laboratory to industry.
The book’s final chapter, “Recommendations for Autonomous Industrial Vehicle Performance Standards,” by Bostelman, summarizes and synthesizes a discussion session that was held at the ICRA workshop. The findings from the workshop presented in this chapter are meant to inform the standardization efforts under ASTM Committee F45 and accelerate the infusion of intelligence so as to enable autonomous guided vehicles.

References
[1] ANSI/ITSDF B56.5:2012, Safety Standard for Driverless, Automatic Guided
Industrial Vehicles and Automated Functions of Manned Industrial Vehicles,
November 2012, http://www.itsdf.org
[2] British Standard Safety of Industrial Trucks—Driverless Trucks and Their Systems, Technical Report BS EN 1525, 1998.

Acknowledgments
The ICRA workshop was supported by the IEEE Technical Committee on Performance Evaluation and Benchmarking of Robotic and Automation Systems. The Editors are grateful to the reviewers of this book whose comments greatly improved its quality.


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150054

Roger Bostelman 1 and Elena Messina 1

Towards Development of an
Automated Guided Vehicle
Intelligence Level Performance
Standard
Citation
Bostelman, R. and Messina, E., “Towards Development of an Automated Guided Vehicle
Intelligence Level Performance Standard,” Autonomous Industrial Vehicles: From the
Laboratory to the Factory Floor, ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM
International, West Conshohocken, PA, 2016, pp. 1–22, doi:10.1520/STP159420150054

ABSTRACT
Automated guided vehicles (AGVs) typically have been used for industrial
material handling since the 1950s. In the years following, U.S. and European
safety standards have been evolving to protect nearby workers. However, no
performance standards have been developed for AGV systems. In our view,
lessons can be learned for developing such standards from the research
and standards associated with mobile robots applied to search and rescue and
military applications. Research challenge events, tests and evaluations, and
intelligence-level efforts have also occurred that can support industrial AGV
developments into higher-level intelligent systems and provide useful standards
development criteria for AGV performance test methods. This chapter provides
background information referenced from all of these areas to support the need
for an AGV performance standard.

Keywords
standards, performance, mobile robot, automated guided vehicle (AGV)

Manuscript received June 16, 2015; accepted for publication August 11, 2015.
1 National Institute of Standards and Technology, 100 Bureau Dr., Gaithersburg, MD 20899-8230
2 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor on May 26–30, 2015 in Seattle, Washington.

Copyright © 2016 by ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959.


Introduction
Automated guided vehicles (AGVs) have been used since 1953. In the years
following, “AGVs have evolved into complex material handling transport vehicles
ranging from mail handling AGVs to highly automatic trailer loading AGVs using
laser and natural target navigation technologies” [1]. Potential users of AGV
technology know that AGVs are safe when AGV manufacturers conform to the
American National Standards Institute/Industrial Truck Standards Development
Foundation (ANSI/ITSDF) B56.5:2012, Safety Standard for Driverless, Automatic
Guided Industrial Vehicles and Automated Functions of Manned Industrial
Vehicles [2]. However, there are no current standards to directly compare AGV
intelligent performance such that users can fully appreciate their potential AGV
investment without independent tests and evaluations.
Nonindustrial vehicle applications (e.g., driverless cars, search and rescue
robots, military unmanned vehicles) are rapidly improving their capabilities and
intelligence, thus providing a clear sense of the advanced features that could be
installed in industrial AGVs. The benefits to AGV users would be enormous if
AGVs gained onboard intelligence that would allow the vehicles to adapt to their
manufacturing facilities instead of vice versa. An intelligence-level performance
standard would benchmark current capability levels and standardize test methods
to do the benchmarking. Benchmarks provide an incentive for AGV developers to
achieve higher performance, which enables them to expand their markets to include
broader applications, such as those within unstructured environments with workers
present.
This chapter proposes methods for measuring AGV performance that can
provide the foundation for a new, voluntary AGV intelligence-level performance
standard. The standard would cover a broad range of AGV classes and include
performance-measurement test methods for estimating vehicle capabilities associ-
ated with the particular vehicle classes. We provide (1) background information
about current standards, including those under development for vehicles in emer-
gency response applications; (2) examples of vehicle challenge events and programs
that have improved autonomous vehicle intelligence; and (3) a list of capabilities to
be considered in a new AGV performance standard.
A performance standard would provide AGV manufacturers with test meth-
ods to estimate performance that could be referenced as part of their product
marketing. The results of the performance test methods, which would measure
capabilities along a spectrum, would also provide insight for manufacturers and
developers who could use the results to guide investments in research and devel-
opment or to target particular market niches. Because there is no current per-
formance standard, users must rely on specifications provided by manufacturers
with, perhaps, no traceable and reproducible basis. This lack of standards-based
performance characterization discourages many potential AGV users from
attempting to automate processes or leads them to purchase AGVs that may not be

appropriate for their particular environments and tasks. Mismatches between expect-
ations and results may require additional capital investment to upgrade the equip-
ment or environment.
As used in this document, “At a minimum, intelligence requires the ability to
sense the environment, to make decisions, and to control actions. Higher levels of
intelligence may include the ability to recognize objects and events, to represent
knowledge in a world model, and to reason about and plan for the future” [3].
Some intelligence measurements are included in the ANSI/ITSDF B56.5 safety
standard. For example, vehicle speed must be reduced when navigating through
confined areas, and control system performance tests must prove that the vehicle
stops prior to contact with human-representative obstacles when they are sensed
using noncontact safety sensors. Specific tests for the latter example in the ANSI/
ITSDF B56.5 standard measure performance of the navigation and safety sensors
when integrated into the vehicle controller so that safe performance is ensured
prior to transferring the vehicle to the user. However, most vehicle capabilities,
which may or may not include safety functions, lack standard means of conducting
performance measurements for reporting to potential users to enable them to make
informed decisions that reduce the risk of adopting AGVs.

Background
Efforts to develop methods for testing and evaluation of the safety performance
of automated and semiautomated vehicles (SSVs) have been ongoing at the
National Institute of Standards and Technology (NIST) and other organizations for
many years. The NIST has supported the ANSI/ITSDF B56.5, B56.11.6 (powered
industrial vehicle operator visibility), and B56.1 (fork trucks) standards [4]. This
project showed that it is possible to make static and dynamic measurements of both
vehicle safety and capability needs. Measurement results have provided a basis to
suggest improvements to the ANSI/ITSDF B56.5 standard.
Similarly, other vehicle programs and challenge events have improved the
intelligence and capabilities of autonomous and SSVs. Additionally, standards for
emergency response robot vehicles have been and continue to be developed.
Examples of relevant efforts are shown in this section, gathered from Internet
searches and the individuals and organizations shown in the acknowledgments
section.

Example Related Standards


ASTM EMERGENCY RESPONSE ROBOT PERFORMANCE STANDARDS
Several ASTM International standards have been developed for emergency response
robots, and several others are in the pipeline. Response robots are used in urban
search and rescue, bomb disposal, military, law enforcement, and other time-critical
and hazardous applications. Standards that measure various performance capabilities

of these robots are being developed under ASTM’s Committee for Homeland Security
Applications, Operational Equipment, Robots (E54.08.01). This section provides a list
of those ASTM standards that are most relevant to AGVs [5]. In the descriptions,
approved standards are shown with their standard number preceded by ASTM; work
items are prefixed with WK; and status (as of this writing) of preliminary develop-
ment is designated by V = validating (i.e., checking or proving repeatability) or
P = prototyping (i.e., experimenting with artifacts and procedures to best measure the
particular system capability concerned). The majority of commercially available
robots for response applications have limited onboard intelligence; hence, they are
remotely operated by a human (typically a responder) using displays transmitted back
from the cameras or other sensors onboard. There are some emerging assistive
autonomy capabilities (for example, in stair climbing), but generally speaking, the
response robots require much greater human interaction than AGVs do. The test
methods described in the following sections are designed to be agnostic as to whether
the robot is programmed to run through the tests independently or if it has to be con-
trolled by an operator during the entire test, meaning that a fully autonomous robot
should run through the same tests as one that has to be “driven” by an operator.
In general, the test methods developed under ASTM E54.08.01 consist of the
following elements:
• Apparatus (or prop): A repeatable, reproducible, and inexpensive representa-
tion of tasks that the robot is expected to perform. It should challenge the
robot with increasing difficulty or complexity and be easy to fabricate interna-
tionally to ensure all robots are measured similarly.
• Procedure: A script for the test administrator and the robot operator to follow.
These tests are not intended to surprise anybody. They should be practiced to
improve technique.
• Metric: A quantitative way to measure the capability. For example, completeness of ten continuous repetitions of a task, or terrain figure eights, resulting in a cumulative distance traversed. Together with the elapsed time, a resulting rate in tasks/time or distance/time can be calculated (a minimal rate calculation is sketched after this list).
• Fault conditions: A failure of the robotic system preventing completion of
ten or more continuous repetitions. This could include an inverted robot, a
stuck robot, or failure requiring field maintenance.
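As a concrete illustration of the metric element, the rate calculation reduces to simple arithmetic. The following minimal Python sketch is our illustration, not part of any ASTM test method; the function name and the ten-repetition threshold simply mirror the metric and fault-condition descriptions above:

```python
def task_rate(completed_repetitions: int, elapsed_s: float,
              required_repetitions: int = 10) -> tuple[float, bool]:
    """Return (rate in tasks per hour, fault flag) for a repetition-based metric.

    A fault condition is declared when the system fails to complete the
    required number of continuous repetitions (e.g., an inverted or stuck
    robot, or one requiring field maintenance)."""
    fault = completed_repetitions < required_repetitions
    rate_per_hour = completed_repetitions / elapsed_s * 3600.0
    return rate_per_hour, fault

# Example: 10 terrain figure eights completed in 12 minutes -> 50 tasks/hour.
print(task_rate(10, 12 * 60))
```

An analogous distance/time rate follows by substituting the cumulative distance traversed for the repetition count.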
Starting the analysis off with terminology, many terms defined in the standard
terminology for urban search and rescue robots could have applicability to AGVs.
For example, ramps, towing, confined area/space, maneuvering, obstacles, and peak
power are just a few potentially relevant and overlapping terms across the two
industries.

Standard Terminology for Urban Search and Rescue Robotic Operations (ASTM E2521-07a)

Mobility standards listed here may prove relevant to industrial AGVs and mobile
robots by describing test methods and definitions for areas such as environmental

effects, obstacle detection and avoidance, and terminology. Ramps, speed, terrain
types, towing, and so on are all associated with industrial vehicle intelligent safety
and performance capabilities (similar to response robots). The environmental
conditions typically are not expected to be as harsh in the industrial settings that
AGVs encounter, but the concepts for testing mobility are still transferrable.
Standard Test Method for Evaluating Emergency Response Robot Capabilities,
Mobility:
• Confined Area Obstacles: Gaps (ASTM E2801)
• Confined Area Obstacles: Hurdles (ASTM E2802)
• Confined Area Obstacles: Inclined Planes (ASTM E2803)
• Confined Area Obstacles: Stair/Landings (ASTM E2804)
• Confined Area Terrains: Gravel (V) (WK35213)
• Confined Area Terrains: Sand (V) (WK35214)
• Confined Area Terrains: Mud (P)
• Confined Area Terrains: Continuous Pitch/Roll Ramps (ASTM E2826)
• Confined Area Terrains: Crossing Pitch/Roll Ramps (ASTM E2827)
• Confined Area Terrains: Symmetric Stepfields (ASTM E2828)
• Confined Space Terrains: Vertical Insertion/Retrieval Stack with Drops (P)
• Maneuvering Tasks: Sustained Speed (ASTM E2829)
• Maneuvering Tasks: Towing: Grasped/Hitched Sleds (ASTM E2830)
• Maneuvering Tasks: Towing Hitched Trailers (P)
Energy and power standards listed here are relevant to industrial AGVs
and mobile robots. The energy and power measurements are conducted under
somewhat arduous conditions in order to expedite the test process (draining of
the battery) and to represent some typical energy usage profiles in response appli-
cations. Test methods inspired by the ones in ASTM E54 but adapted to AGVs
would ensure all vehicle manufacturers and users conform to the same energy
and power measurement techniques. Although not as potentially catastrophic as
losing a robot that has penetrated a hazardous area due to a dead battery, users of
AGVs need to have predictable and known battery performance to ensure
efficient operations.
Standard Test Method for Evaluating Emergency Response Robot Capabilities,
Energy/Power:
• Endurance Tasks: Confined Area Terrains: Continuous Pitch/Roll Ramps (V)
(WK34433)
• Peak Power Tasks: Confined Area Obstacles: Stairs/Landings (P)
Vehicle communication with the master or warehouse management systems
as used in AGV applications relates to rescue robot communication capabilities.
The following standards can provide the beginning of communication and inter-
ference test methods for industrial AGVs and mobile robots. In particular, being
able to characterize the wireless communications between the vehicle and either
the operator control station or the central factory or warehouse controller is
essential. In typically teleoperated response robot applications, there is need for

constant video streaming from the robot’s onboard cameras to the operator
control station and of motion and other commands from the operator to the
robot. When AGVs have onboard path replanning capabilities (for example, to go
around an obstacle), they may need to provide updates to the centralized control-
ler if they deviate from a programmed path. The electromagnetic environment in
factories and warehouses may be challenging to wireless communications, height-
ening the priority of having a means of characterizing the AGV’s communication
system.
Standard Test Method for Evaluating Emergency Response Robot Capabilities,
Radio Communication:
• Control and Inspection Tasks: Line-of-Sight Environment (ASTM E2854)
• Control and Inspection Tasks: Non-Line-of-Sight Environment (ASTM
E2855)
• Control and Perception Tasks: Structure Penetration Environment (P)
• Control and Perception Tasks: Urban Canyon Environment (P)
• Control and Perception Tasks: Interference Signal Environment (P)
Human-system interaction performance standards have perhaps minimal relevance to industrial vehicle test methods because AGVs require less and different types of human interaction. Nevertheless, there may be some human-robot
interactions, potentially with factory or warehouse workers who need to intervene
with the AGV. Some concepts may be transferrable from the ASTM E54 human-
system interaction test methods to those for AGVs.
Standard Test Method for Evaluating Emergency Response Robot Capabilities,
Human-System Interaction (HSI):
• Search Tasks: Random Mazes with Complex Terrain (ASTM E2853)
• Navigation Tasks: Random Mazes with Complex Terrain (V) (WK33260)
• Search Tasks: Confined Space Voids with Complex Terrain (V) (WK34434)
Sensors are commonly used on AGVs and mobile robots. Response robot
standards, listed here, provide performance test methods for how capable sensors
are when integrated with the vehicle. Test methods also evaluate how well the
control algorithms place sensor data in maps to localize the vehicle and for use in
obstacle detection and avoidance.
Standard Test Method for Evaluating Emergency Response Robot Capabilities,
Sensors:
• Ranging: Spatial Resolution (P)
• Localization and Mapping: Hallway Labyrinths with Complex Terrain (P)
• Localization and Mapping: Wall Mazes with Complex Terrain (P)
• Localization and Mapping: Sparse Feature Environments (P)
The aforementioned standards can measure performance of current teleoper-
ated robots as well as emerging robots with autonomous capabilities. The standards
are already being used to compare capabilities of different rescue, military recon-
naissance, and bomb disposal robots. Based on such comparisons, users can choose

the best system for their needs. Similar standards-based means of comparing
performance of different candidate AGVs are needed.

INSTITUTE OF ELECTRICAL AND ELECTRONIC ENGINEERS ROBOTICS AND AUTOMATION SYSTEMS
The Institute of Electrical and Electronic Engineers Robotics and Automation Sys-
tems Society (IEEE-RAS) has launched two standards efforts aimed at advancing
the capabilities of robots. Ontologies (i.e., formal descriptions of the concepts and
relationships that can exist) for robots are important to the AGV industry as vehicle
intelligence increases. Standard knowledge transfer among vehicles, management
systems, facility equipment, and other associated systems will allow direct imple-
mentation of mobile robot systems into facilities. Similarly, standard representa-
tions of vehicle control maps will allow faster and simpler vehicle integration into
facilities.
• Ontologies for Robotics and Automation Working Group
The IEEE-RAS Ontologies for Robotics and Automation Working Group is
developing a standard ontology and associated methodology for knowledge
representation and reasoning in robotics and automation, together with the
representation of concepts in an initial set of application domains. The first
of these standards is IEEE P1872, Core Ontologies for Robotics and Auto-
mation, which was approved by the IEEE Standards Association Standards
Board in 2015. Such standards will provide a unified way of representing
knowledge and will provide a common set of terms and definitions, allowing
for unambiguous knowledge transfer among any group of humans, robots,
and other artificial systems. A study group has begun work to formalize
an industrial ontology. This promises to have great relevance to AGV
applications.
• IEEE-RAS Robot Map Data Representation for Navigation Working Group
The IEEE-RAS Robot Map Data Representation for Navigation Working
Group aims to standardize a common representation and encoding for two-
dimensional map data used for navigation by mobile robots. The encoding
will be used when exchanging map data with other systems or subsystems.
The standard focuses on the interchange of map data among systems, particu-
larly those that may be supplied by different vendors. In addition to the encod-
ing, the standard aims to specify suitable application programmer interfaces
and protocols for the interchange process so that navigation-related compo-
nents from multiple vendors may interoperate.
SOCIETY OF AUTOMOTIVE ENGINEERS AS-4—AUTONOMY LEVELS FOR
UNMANNED SYSTEMS
The Society of Automotive Engineers (SAE) AS-4 Unmanned Systems Steering
Committees address all facets of unmanned systems—design, maintenance, and
in-service experience [6]. The primary goal of AS-4 is to publish open systems
standards and architectures that enable interoperability of unmanned systems for
military, civil, and commercial applications. Version 2.0 of the Autonomy Levels

for Unmanned Systems (ALFUS) framework, which includes terminology, is evolving as unmanned systems technology evolves. Version 2.0 essentially covers the
results of the ALFUS effort up to the end of 2007. The ALFUS model and how it
applies to AGVs and mobile robots is discussed in greater detail as follows.
• International Organization for Standardization Performance Standards
for Service Robots
The International Organization for Standardization (ISO) [7] is planning to
establish performance standards for service robots [8]. Service robots are
those used in applications excluding industrial automation; they can be
either personal (used by a layperson for noncommercial tasks) or professio-
nal (used by a trained operator for commercial tasks). However, the standard
and the development process may also be applicable to industrial AGVs
and mobile robots. The International Organization of Standards/Technical
Committee (ISO/TC) 184/Standards Committee (SC2)/Working Group
(WG) 8 has started the development of a standard for measuring perform-
ance of this class of robots. The current early draft (ISO/Draft International
Standard [DIS] 18646-1) for “Performance criteria and related test methods
for service robot—Part 1: Locomotion for wheeled robots” includes instruc-
tions for measuring speed and braking time on different surfaces to prevent
falling down, to increase mobility on a slope and over a sill/curb, to detect
and avoid obstacles, and to measure the relative distance/speed between human and robot (see the stopping-distance sketch after this list).
• International Electrotechnical Commission Standards for Service Robots
The International Electrotechnical Commission (IEC) is developing safety and
performance standards for systems and components used for robotic cleaning
applications [9]. IEC SC 59F (floor treatment appliances) WG 5 (methods for
measuring the performance of household cleaning robots) is considering
performance such as mobility (coverage rate), navigation, dust collection in a
standardized area, noise, battery power, and so on. IEC 60312 standards define
test methods for measuring performance of household cleaning appliances. As
with the draft ISO/DIS 18646-1, IEC 60312 may also prove useful in develop-
ing industrial AGV and mobile robot performance standard test methods.
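To make such locomotion measurements concrete, stopping performance can be summarized by a braking distance derived from approach speed, system latency, and deceleration. The following sketch is a generic kinematic calculation; the parameter values are illustrative assumptions, not figures from ISO/DIS 18646-1 or IEC 60312:

```python
def stopping_distance(speed_mps: float, latency_s: float,
                      decel_mps2: float) -> float:
    """Distance traveled from brake command to standstill:
    travel during latency (v * t) plus braking travel (v^2 / (2 * a))."""
    return speed_mps * latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

# Example: 1.5 m/s approach, 0.2 s detection/actuation latency,
# 1.0 m/s^2 deceleration -> roughly 1.4 m to stop.
print(f"{stopping_distance(1.5, 0.2, 1.0):.2f} m")
```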

EXAMPLE PERFORMANCE CHALLENGES AND PROGRAMS


Several challenge events, involving teleoperated and fully autonomous unmanned
vehicles, have occurred in the last several years. These events and the challenges
they pose are described in the following sections, including contacts and brief
explanations of the event. We discuss these events because we believe that (1) they
have led to improvements in the intelligent performance of nonindustrial vehicles,
and (2) many of these improvements can be harnessed by the AGV manufacturers
for their vehicles.
• RoboCup [10]
* RoboCup is an international robotics competition founded in 1997 to promote robotics and artificial intelligence research. The competition involves publicly appealing but formidable challenges for robots. Initially focused

on Robotic Soccer, it has expanded to other applications, such as RoboCup Rescue, RoboCup@Work [11], and RoboCup@Home [12]. These newer
applications focus on service and assistive robot technologies by bench-
marking performance in several areas, including human-robot interaction,
navigation and mapping in dynamic environments, object recognition,
object manipulation, and adaptive behaviors. All of these performance
areas are also critical to intelligent industrial automated vehicles, and the competition is moving the technologies to meet industrial challenges. For
example, RoboCup@Work, which was initiated in 2012, aims to foster
advancements in mobile robots equipped with manipulators for industrial
applications. The competition includes complex tasks ranging from manu-
facturing, automation, and parts handling to general logistics [10].
• IEEE Solutions in Perception Challenge [13]
* This competition was held in 2011 as part of the International Conference on Robotics and Automation to determine the performance capabilities of algorithms that detect, recognize, and locate an arbitrary collection of artifacts in space. These are perceptual capabilities that intelligent AGVs will require.
• Association for Unmanned Vehicle Systems International [14]
* The Association for Unmanned Vehicle Systems International (AUVSI) holds annual intelligent ground vehicle competitions (IGVCs), one of four unmanned systems student competitions that they sponsor. The
IGVC is a multidisciplinary exercise in product realization that chal-
lenges college engineering student teams to integrate advanced control
theory, machine vision, vehicular electronics, and mobile platform funda-
mentals to design and build an unmanned system. Integration of the
same components is also necessary for industrial AGVs and mobile
robots. Teams from around the world focus on developing a suite of tech-
nologies to equip ground vehicles of the future with intelligent driving
capabilities.
It is worth noting that the RoboCup Rescue and the Solutions in Perception
Challenge have close ties to performance standards. RoboCup Rescue features test
arenas that comprise test methods based on approved and draft standards [15].
These include the ASTM E54 test methods listed earlier. The diversity of robots and
number of competitors provides support for the development of draft test methods
and allows for their streamlining and improvement. Furthermore, exposing the par-
ticipating researchers to test methods that capture application challenges provides
them with tangible goals toward which to engineer more advanced robotics
capabilities. The Perception Challenge proved out initial concepts for measuring
the performance of systems that determine the pose (position and orientation) of
objects. These approaches formed the foundations for ASTM E2919–14, Standard
Test Method for Evaluating the Performance of Systems that Measure Static, Six
Degrees of Freedom (6DOF), Pose.
The NIST fostered challenges to promote academic advancement of AGV intel-
ligence for factory environments. These challenges were intended to raise the level

of intelligent performance for tasks that may occur in real situations. Two such
challenges were:
• Virtual Manufacturing Automation Competitions (VMACs) 2007–2009 [16]
* These were workshops and virtual/real AGV competitions based on real-world factory scenarios demonstrating accurate path following and docking tasks and requiring that the same application code run on the simulation system and on a real platform.
• Mobility and Task Completion Challenge, International Conference on
Robotics and Automation [17]
* This virtual challenge was designed to address the need for one or more factory AGVs to operate in unstructured environments that include dynamic obstacles. Teams used the Unified System for Automation and Robot Simulation framework [18] to deliver completed pallets throughout a simulated warehouse environment, including loading and unloading of vehicles with a robotic arm.

DEPARTMENT OF DEFENSE PROGRAMS/CHALLENGES


Highly intelligent, military robotic vehicle systems have been developed, accom-
panied by development of tests and evaluation metrics for these systems. Estimat-
ing military vehicle intelligence requires different kinds of tests and evaluation
methods, which are briefly described in the following sections. They are typically
matched to operational needs and not to formal industry standards. However, just
as before, the methods depend upon the vehicle class and the environment where
the vehicle will be used. In our view, these methods are also applicable to AGVs.
An overview of some of the key early developments in defense robotics can be
found in Shoemaker [19]. Highlights are summarized here.
• In 1987, the U.S. Army Laboratory Command (LABCOM) (later part of the
U.S. Army Research Laboratory [ARL]) initiated a program to consolidate the
technologies necessary to develop a test bed unmanned ground vehicle (UGV)
in a cooperative effort among the laboratories in the command. This was the
beginning of the Techbase Enhancement for Autonomous Machines, which
later became the Robotics Test Bed program [20], and it carried out a number
of demonstration programs:
* Demo I (1991–1992)
* Demo A (1993)—Demonstrated a working vehicle and operator workstation infrastructure and early integration of road following and teleoperation capabilities.
* Demo B (1994)—Demonstrated on-road and off-road navigation, obstacle avoidance, and target detection using forward-looking infrared.
* Demo C (1995)—Demonstrated dual cooperating surrogate SSVs, target detection and tracking capabilities, and mission planning and monitoring, and exercised the system in militarily relevant scenarios.
* Demo II (1996)—Developed and matured navigation and automatic target recognition technologies critical for the development of supervised AGVs capable of performing military scout missions with minimal human

oversight. The program culminated with a highly successful series of field exercises with three cooperating SSVs in a military environment at Ft. Hood, TX.
* Demo III (1998)—The Experimental Unmanned Vehicle (XUV) Experiment was conducted to examine the results of adding semiautonomous platforms to the Scout Platoon of the Army After Next.
* Robotics Collaborative Technical Alliance (2009–2020) [21]—The goal of this alliance is to develop highly advanced ground vehicles that act as partners to soldiers. Specific research goals are:
- Perception—Perceive and understand dynamic and unknown environments, including the creation of a comprehensive model of the surrounding world.
- Intelligence—Autonomously plan and execute military missions; readily adapt to changing environments and scenarios; learn from prior experience; share common understanding with team members.
- Human-Robot Interaction—Seamlessly integrate unmanned systems into military and civilian society.
- Dexterous Manipulation and Unique Mobility—Manipulate objects with near-human dexterity and maneuver through three-dimensional environments.
Experiments are conducted regularly that evaluate progress in each of these
areas and for the integrated system.
The ARL series of unmanned vehicle programs listed here has provided
advancements in hardware and algorithms that enable greater autonomy, especially
in navigation and obstacle avoidance. The newer programs are advancing other
technologies that will be of relevance to AGV capabilities, including greater intelli-
gence and better human-robot interaction.
Another collaborative technology alliance being sponsored by ARL is called
Micro Autonomous Systems and Technology (MAST) [22]. It seeks to advance
the onboard intelligence for small vehicles. MAST develops autonomous, multi-
functional, collaborative ensembles of agile, mobile microsystems to enhance
tactical situational awareness in urban and complex terrain for small unit opera-
tions. The scale and missions envisioned for these vehicles are not relevant to
AGVs. However, situational awareness is relevant, and the research may prove
useful for AGVs.
• Defense Advanced Research Projects Agency Robotics Programs and Challenges [23]
* The Defense Advanced Research Projects Agency’s (DARPA) UGV Program [24] developed key technologies of autonomous navigation and automatic target recognition prior to transitioning them to the Department of Defense (DoD).
* 2004/2005 Grand Challenge: This was a long-distance competition for driverless cars that were tasked to autonomously drive 240 km (150 miles) through the Mojave Desert from Los Angeles to Las Vegas.


* 2004–2008 DARPA Learning Applied to Ground Robots (LAGR) Program [25]: The LAGR program had the goal of accelerating progress in autonomous, perception-based, off-road navigation in robotic UGVs. The program had a novel approach of providing all teams with the same baseline platform and software, which they would augment with their specific advancements. Regular trials were held to compare each team’s results against the baseline and the other teams.
* 2007 Urban Challenge: This was a competition on a 96-km (60 mile) urban area course at George Air Force Base, CA. It required driving autonomously while obeying all traffic regulations and negotiating other traffic and obstacles and merging into traffic.
* 2012–2015 DARPA Robotics Challenge: In this competition, the goal was to develop ground robots capable of executing complex tasks in dangerous, degraded, human-engineered environments.
The DARPA programs can be leveraged by AGVs where advanced sensor,
mapping, and behavior generation challenges can carry over into intelligent
industrial vehicle situation awareness/avoidance and controls. The LAGR Pro-
gram’s evaluation approach provided regular trials that measured progress against
a baseline, similar to how performance test methods should be considered
for AGVs.
TEST AND EVALUATION OF MILITARY VEHICLES
By providing funding for research programs and challenge problems, the military has
stimulated the advancement of technologies for autonomous, intelligent vehicles. The
programs and challenges also serve as a good model for test and evaluation of vehicles
that are to be fielded, as well as AGVs. During the Science of Autonomy Technology
Focus Team study in 2008 and 2009 [26], the panel identified test and evaluation/ver-
ification and validation as being a major roadblock to the fielding of future autono-
mous military vehicle system capabilities. It is believed that the situation has not
changed much in the interim.
The DoD test and evaluation community has recognized this roadblock and,
as a result, the Autonomous Systems Test and Evaluation Requirements Study
(ASTERS) was set up through the Army Test and Evaluation Command (ATEC)
[27]. ATEC looks for potentially useful new equipment for soldiers in addition
to verifying the safety of that equipment and lists the “technical or operational
limitations” of technologies that require further investigation and testing.
ASTERS will assess the current state of emerging (military) AGV technologies,
emerging AGV requirements, and current test and evaluation capabilities.
It is clear from the previous sections that “task complexity and adaptability
to the environment” are critical to improving performance of unmanned and
autonomous systems [28]. Both of these challenges, however, involve human/AGV
awareness/interaction (cognition) and autonomous control levels (autonomy). Com-
plexity at the systems level is increased when these systems are aggregated with other
AGVs and manned systems in manufacturing and warehouse scenarios.

Example Intelligence-Level Efforts

The ALFUS [29] Ad Hoc Working Group has formulated, through consensus, a
framework within which the different levels of autonomy can be described. The
initial version of the framework was presented at the 2004 ASME International
Mechanical Engineering Congress [26]. Significant progress has been made since
then [28]. However, the complexity of the autonomy-level issue forced the group to
identify additional technical challenges—many of which are active issues in the
research communities [10]. The group agreed that the autonomy levels for
unmanned systems must be characterized using three dimensions: mission com-
plexity, environmental difficulty, and human-robot interaction. The group devised
a three-axis representation for those dimensions. Fig. 1a shows this representation
applied to industrial AGVs, Fig. 1b shows the levels of autonomy, and Fig. 1c shows
a summary score card on which to enter the autonomy level along each axis. We
believe this same model could be used to identify contextual autonomy for AGVs.

FIG. 1 ALFUS model applied to AGVs (a), autonomy levels (b), and autonomy level summary score graph that incorporates the autonomy level along each axis (c).
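The three ALFUS axes lend themselves to a simple contextual-autonomy record. The following Python sketch is a hypothetical illustration of how a summary score card such as Fig. 1c might be tallied; the 0–10 scale and the unweighted average are our assumptions, as ALFUS permits task-specific weighting:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class AlfusScore:
    """Contextual autonomy along the three ALFUS axes (0 = lowest, 10 = highest)."""
    mission_complexity: float        # difficulty of tasks performed unaided
    environmental_difficulty: float  # structure, clutter, and dynamics of the facility
    human_independence: float        # inverse of required human-robot interaction

    def summary(self) -> float:
        # Unweighted aggregate for the summary score card.
        return mean([self.mission_complexity,
                     self.environmental_difficulty,
                     self.human_independence])

# Example: a unit-load AGV following fixed paths in a structured warehouse.
agv = AlfusScore(mission_complexity=3.0,
                 environmental_difficulty=2.0,
                 human_independence=8.0)
print(f"Contextual autonomy summary: {agv.summary():.1f}")
```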


FIG. 2 Emergency Response Robots Performance Evaluation Standard Test Method Development Process Model. (Adapted from an original version from the Department of Homeland Security Standards Office.)

Inspired by ALFUS, Performance Measures for Unmanned Systems (PerMFUS) [30] is a framework to measure performance of unmanned systems or their compo-
nents in the context of missions/tasks and environments. Under PerMFUS, tests
would be conducted, data analyzed, and test methods developed according to
associated metrics. An example process model of PerMFUS applied to emergency
response robots’ performance evaluation standard test method development is
shown in Fig. 2.
The PerMFUS example applied to AGVs would include a set of requirements
to define the AGV performance from the points of view of users or potential users,
manufacturers, and researchers. Associated standard test methods would be gener-
ated and AGV performance evaluated accordingly. Output from the effort would
drive AGV performance to higher levels and guide users’ decisions on AGV pur-
chases and implementation.

AGV Performance Standard Criteria


The goal of an AGV intelligence level performance standard is to determine, for
each vehicle class, the edge-cases or limits (e.g., resolution, accuracy, speed, slope,
etc.) for mobility, obstacle detection, and navigation in the industrial setting. The
specific criteria expected to be included in a performance standard are AGV classes and capabilities, among other relevant information, as listed in the following
subsections. The following classes, criteria, and other areas’ lists include additions
from the AGV industry through brief interviews with manufacturers and users and

provide areas to consider in generic test method development. For example, vehicle
classes have particular loading, type, guidance, and so on and pose questions to
future standards task groups as to how best to consider the variety of AGVs. Simi-
larly, AGV applications in docking, palletizing, and so forth may also provide
situations for performance test methods that may or may not fit all vehicles,
and perhaps more than one test method for each application may need to be
considered.
VEHICLE CLASSES
1. Loading
a. Light weight/capacity
b. Medium weight/capacity
c. Heavy weight/capacity
2. Type
a. Unit load
b. Tugger
c. Forklift
d. Other (e.g., hybrid, mobot, etc.)
3. Guidance
a. Wire
b. Laser triangulation
c. Ceiling bar code
d. Magnetic
e. Markers
f. Chemical/paint stripe
g. Simultaneous Localization and Mapping
h. Hybrid (combinations of guidance methods)
i. Position resolution and accuracy
4. Teach Modes
a. Offline
b. Human-led in situ
5. Cognition/Autonomy Level
a. Fully Autonomous—operator never intervenes
b. Semiautonomous—operator intervenes:
i. For each new maneuver
ii. To manually clear the path and let mobility continue
c. Human-Machine Interface Control—jog or pendant control
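A standard that reports results per vehicle class needs an unambiguous encoding of that class. The following Python sketch is a hypothetical encoding of the classes listed above (the enumeration names are ours, not the standard’s), for example, for tagging test reports so that results are compared only within a class:

```python
from dataclasses import dataclass
from enum import Enum

class Loading(Enum):
    LIGHT = "light weight/capacity"
    MEDIUM = "medium weight/capacity"
    HEAVY = "heavy weight/capacity"

class VehicleType(Enum):
    UNIT_LOAD = "unit load"
    TUGGER = "tugger"
    FORKLIFT = "forklift"
    OTHER = "other"  # e.g., hybrid, mobot

class Guidance(Enum):
    WIRE = "wire"
    LASER_TRIANGULATION = "laser triangulation"
    CEILING_BAR_CODE = "ceiling bar code"
    MAGNETIC = "magnetic"
    MARKERS = "markers"
    STRIPE = "chemical/paint stripe"
    SLAM = "simultaneous localization and mapping"
    HYBRID = "hybrid"

class AutonomyLevel(Enum):
    FULL = "fully autonomous"        # operator never intervenes
    SEMI = "semiautonomous"          # operator intervenes per maneuver or to clear path
    HMI = "human-machine interface"  # jog or pendant control

@dataclass
class AgvClass:
    loading: Loading
    vehicle_type: VehicleType
    guidance: Guidance
    autonomy: AutonomyLevel

# Example: a heavy unit-load AGV with laser triangulation, semiautonomous.
example = AgvClass(Loading.HEAVY, VehicleType.UNIT_LOAD,
                   Guidance.LASER_TRIANGULATION, AutonomyLevel.SEMI)
```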
APPLICATION-SPECIFIC PERFORMANCE CRITERIA
For each criterion listed, consider task complexity, adaptability to the environment,
and verification of performance.
• Docking with tray tables, conveyers
• Palletizing
* Known/unknown locations
* Finding the pallet openings
* Loaded/unloaded pallets


* Space to insert pallets


* Truck loading/unloading
• Obstacle detection/avoidance
* Avoid or overcome obstacles [2]

• Human detection
* Represented by test pieces
* Represented by mannequins
* Actual humans
* Coverings (e.g., clothes worn)

• Interaction with manual equipment and operations (e.g., forklifts)


• Environments
* Indoor
- Temperature (e.g., freezer [1])
- Lighting
* Outdoor
- Fog, smoke, humidity/precipitation, light
* Surfaces
- Smooth concrete through rough terrain
- Floor gaps
* Surface slope
- Level
- Slope angle ≠ 0
* Areas
- Confined
- Open
• Synchronization among vehicles [1]
* Wait to pick up load
* Not cause congestion

• Capacities
* Speed
* Vehicle weight versus payload
* Lift height

• X/Y movement
* Ackerman
* Omnidirectional
* Skid steer

• Open source
* Plug and play
* Underlying architecture or operating system (e.g., Robot Operating System)
• Intelligence
* Autonomy level
* Situational awareness

• Mean time between failures


• Mean time between charging
* Power


OTHER POSSIBLE AREAS


• Human interaction burden
• Use of external enablers for AGV capabilities
* Sensors (e.g., radio-frequency identification) worn by workers
* Factory clothes worn by workers

Potential Model for AGV Performance Standard


To address AGV performance standards, a committee has been formed by ASTM
International, called the ASTM F45 Driverless Automatic Guided Industrial
Vehicles Committee. Working documents for Docking, Navigation (F45.02), and
Terminology (F45.91) have been initiated based on ASTM E54 mobility standards.
The table of contents for ASTM F45.02 is shown in Table 1 and represents the
layout of information that the AGV performance standards might follow.


The task of developing test methods for ASTM F45 falls within the four
subcommittee areas: ASTM F45.01, Environmental Effects; ASTM F45.02, Docking
and Navigation; ASTM F45.03, Object Detection and Protection; and ASTM
F45.04, Communication and Integration. These are complemented by a fifth com-
mittee, ASTM F45.91, which covers terminology standards.
ASTM International provides participating, organizational, and informational
memberships where member types are producers (i.e., equipment manufacturers),
users (i.e., equipment users), consumers, or general interest (i.e., researchers, public,
etc.). ASTM uses a consensus process to ensure all committee members have a say
in the development process, and balance in representation among the three partici-
pating groups is required. An ASTM standard can be one of six types: test methods,
specifications, classification, practice, guide, or terminology. We chose to develop
test method and terminology standards as a starting point to support the AGV and
mobile robot industries. Test methods are definitive procedures that produce test
results and, as shown in Table 1, precision and bias and measurement and
uncertainty. Therefore, scientific reference must be provided within each test method
standard. Additionally, replicable and propagatable artifacts are expected to be
developed and used as relatively accurate and inexpensive test method support
devices, similar to, for example, the test pieces used for noncontact sensing evaluation
within ANSI/ITSDF B56.5. Standards with simple, relatively inexpensive artifacts
would, therefore, not require vehicle vendors and users to procure expensive
measurement systems to ensure their vehicles conform to standards or to conduct
in-house testing. As suggested in previous sections of this chapter, metrics, test
methods, and terminology for the AGV and mobile robot industries may be
adopted from those associated with autonomous and intelligent capabilities
evaluations in other domains. A summary of potential AGV relevance for each
non-AGV standard described previously is listed in Table 2, and a summary of
potential AGV relevance for each challenge and program described previously is
listed in Table 3.

TABLE 1 ASTM F45.02 working document table of contents.

1 Scope
2 Referenced Documents
3 Terminology
4 Summary of Test Method
5 Significance of Use
6 Apparatus
7 Hazards
8 Calibration and Standardization
9 Procedure
10 Report
11 Precision and Bias
12 Measurement and Uncertainty
13 Keywords

TABLE 2 Potential AGV relevance for each non-AGV standard.

Standard | Potential AGV Relevance
Standard Terminology for Urban Search and Rescue Robotic Operations (ASTM E2521-07a) | Terminology
Standard Test Method for Evaluating Emergency Response Robot Capabilities: Mobility | Environmental effects, obstacle detection and avoidance, and terminology
Standard Test Method for Evaluating Emergency Response Robot Capabilities: Energy/Power | Energy/power measurement of vehicle functionality on inclines and in other situations
Standard Test Method for Evaluating Emergency Response Robot Capabilities: Radio Communication | Vehicle communications with the master or warehouse management systems
Standard Test Method for Evaluating Emergency Response Robot Capabilities: Human-System Interaction (HSI) | Evaluation of human factors in controlling and interacting with AGVs
Standard Test Method for Evaluating Emergency Response Robot Capabilities: Sensors | AGV sensor capability, control algorithm mapping quality, behavior generation for obstacle detection and avoidance
Ontologies for Robotics and Automation Working Group (ORA WG) | Knowledge representation and transfer between vehicles and systems
IEEE-RAS Robot Map Data Representation for Navigation Working Group (MDR WG) | Representation of vehicle control maps
SAE AS4—Autonomy Levels for Unmanned Systems (ALFUS) | Definition of autonomy levels for AGVs
International Organization for Standardization (ISO) Performance Standards for Service Robots (draft ISO/DIS 18646-1) | Standard development process
International Electrotechnical Commission (IEC) Standards for Service Robots (IEC 60312) | Standard development process


TABLE 3 Potential AGV relevance for each challenge and program.

Challenge or Program | Potential AGV Relevance
RoboCup Rescue, RoboCup@Work, RoboCup@Home | Human-robot interaction, navigation and mapping in dynamic environments, object recognition, object manipulation, and adaptive behaviors
IEEE Solution in Perception Challenge | AGV perception, particularly for object recognition and pose determination
Association for Unmanned Vehicle Systems International (AUVSI) | Advanced control theory, machine vision, vehicular electronics, and mobile platform fundamentals
Virtual Manufacturing Automation Competitions (VMAC) | Path following and docking
Mobility and Task Completion Challenge | Multiple AGVs in unstructured environments with dynamic obstacles
Defense Advanced Research Project Agency (DARPA) Unmanned Ground Vehicle (UGV) Program | Autonomous navigation technologies
U.S. Army Laboratory Command (LABCOM) | Autonomous navigation technologies
Micro-Autonomous Science and Technology (MAST) | Situational awareness
DARPA Robotics Programs and Challenges | Situation awareness/avoidance and controls, vehicle evaluation compared to baseline
Autonomous Systems Test and Evaluation Requirements Study (ASTERS) | Technical or operational limitations
Performance Measures for Unmanned Systems (PerMFUS) | Framework to measure performance of AGVs

Conclusions
Currently, there are no performance measurement standards for AGVs, only safety
standards. However, that situation is likely to change. In developing such standards, it
is important to realize that much of the mobile robot research from different organiza-
tions and application areas is applicable to AGVs. Furthermore, performance stand-
ards do exist for mobile robots. As such, performance standards for AGVs can be
based on these performance standards for mobile robot capabilities. In this chapter,
we described a number of areas where overlaps are possible. The next steps are to
proceed with the development of performance measurement standards for AGVs and
mobile robots with input from mobile robot, sensors, and other supporting industries.
ACKNOWLEDGMENTS
The authors would like to thank several key individuals for their help with this chap-
ter, including:
• Andrew Moore of the Southwest Research Institute, San Antonio, TX
• Ann Virts of the National Institute of Standards and Technology, Gaithers-
burg, MD
• Satoshi Tadokoro of the International Rescue System Institute/Tohoku University, Japan
• Jon Bornstein, Kelly Swinson, and Marshal Childers of the Army Research
Laboratory, Aberdeen Proving Grounds, MD

References
[1] Egemin, “History of AGVs,” Egemin Automation, Inc., Holland, MI, 2011, www.egeminusa.
com/pages/agv_education/education_agv_history.html (accessed April 4, 2016).
[2] ANSI/ITSDF B56.5:2012, Safety Standard for Driverless, Automatic Guided Industrial
Vehicles and Automated Functions of Manned Industrial Vehicles, Industrial Truck Stand-
ards Development Foundation, Washington, DC, 2012, www.itsdf.org
[3] Albus, J. S., “Outline for a Theory of Intelligence,” IEEE Transactions on Systems, Man,
and Cybernetics, Vol. 21, No. 3, 1991, pp. 473–509.
[4] Bostelman, R., Shackleford, W., Cheok, G., and Saidi, K., “Safe Control of Manufacturing
Vehicles Research Towards Standard Test Methods,” Proceedings of the International
Material Handling Research Colloquium, Gardanne, France, June 25–28, 2012.
[5] ASTM International, West Conshohocken, PA, 2012, www.astm.org
[6] Huang, H.-M, Messina, E., English, R., Wade, R., Albus, J., and Novak, B., “Autonomy
Measures for Robots,” Proceedings of the ASME 2004 International Mechanical Engi-
neering Congress and Exposition , American Society of Mechanical Engineers, Anaheim,
CA, November 13–19, 2004.
[7] ISO/DIS 18646-1, Robots and Robotic Devices—Performance Criteria and Related Test
Methods for Service Robot, International Organization for Standardization, Geneva,
Switzerland, 2015, www.iso.org
[8] Huang, H. and Messina, E., Autonomy Levels for Unmanned Systems (ALFUS) Framework
Volume II: Framework Models Initial Version , NIST Special Publication 1011-II-1.0, National
Institute of Standards and Technology (NIST), Gaithersburg, MD, 2007.
[9] IEC SC 59F, Surface Cleaning Appliances, International Electrotechnical Commission,
Geneva, Switzerland, 2015, http://www.iec.ch/dyn/www/f?p=103:7:0::::FSP_ORG_ID,
FSP_LANG_ID:1395,25 (accessed April 4, 2016).
[10] The RoboCup Federation, 2012, www.robocup.org (accessed April 4, 2016).
[11] “Welcome to RoboCup@Work,” 2015, www.robocupatwork.org/index.html (accessed
April 4, 2016).
[12] “The RoboCup@Home League,” 2015, www.robocupathome.org (accessed April 4, 2016).
[13] Marvel, J. A., Hong, T., and Messina, E., “Solutions in Perception Challenge Performance
Metrics and Results,” Proceedings of the Workshop on Performance Metrics for
Intelligent Systems (PerMIS '12), Association for Computing Machinery, New York, NY,
March 20–22, 2012, pp. 59–63.


[14] “About AUVSI,” Association for Unmanned Vehicle Systems International, Arlington, VA,
2012, www.auvsi.org (accessed April 4, 2016).
[15] Sheh, R., Jacoff, A., Virts, A. M., Kimura, T., Pellenz, J., Schwertfeger, S., and
Suthakorn, J., “Advancing the State of Urban Search and Rescue Robotics
Through the RoboCupRescue Robot League Competition,” Field and Service
Robotics, K. Yoshida and S. Tadokoro, Eds., Springer, Berlin, Heidelberg, 2014,
pp. 127–142.
[16] Balakirsky, S., Chitta, S., Dimitoglou, G., Gorman, J., Kim, K., and Yim, M., “Robot
Challenge,” Robotics and Automation , December 2012, pp. 9–11.
[17] Balakirsky, S. and Madhavan, R., “Advancing Manufacturing Research Through
Competitions,” Proceedings of SPIE Defense Security and Sensing, Orlando, FL,
April 13–17, 2009.
[18] Balakirsky, S., Scrapper, C., Carpin, S., and Lewis, M., “USARSim: Providing a Framework
for Multi-Robot Performance Evaluation,” Proceedings of the Performance Metrics for
Intelligent Systems Workshop, NIST, Gaithersburg, MD, August 21–23, 2006.
[19] Shoemaker, C., “Development of Autonomous Robotic Ground Vehicles: DoD’s Ground
Robotics Research Programs: Demo I through Demo III,” Intelligent Vehicle Systems:
A 4D/RCS Approach , R. Madhavan, E. R. Messina, and J. S. Albus, Eds., Nova Publishers,
New York, 2006, pp. 283–315.
[20] Haas, G. A., David, P., and Haug, B. T., “Target Acquisition and Engagement from an
Unmanned Ground Vehicle: The Robotics Test Bed of Demo 1,” Technical Report
ARL-TR-1063, Army Research Laboratory, Adelphi, MD, March 1996.
[21] Army Research Laboratory, “Robotics Collaborative Technology Alliance (RCTA), FY
2011 Annual Program Plan,” March 2011, http://www.arl.army.mil/www/pages/392/
rcta.fy11.ann.prog.plan.pdf (accessed April 4, 2016).
[22] “Micro Autonomous Systems and Technology (MAST),” GRASP Laboratory, University of
Pennsylvania, Philadelphia, PA, 2015, www.mast-cta.org
[23] Wikipedia, “DARPA Grand Challenge,” Defense Advanced Research Project Agency
Challenges, 2012, http://en.wikipedia.org/wiki/DARPA_Grand_Challenge (accessed
April 4, 2016).
[24] Spofford, J. R., Rimey, R. D., and Munkeby, S. H., “Overview of the UGV/Demo II
Program,” Lockheed Martin Astronautics, Denver, CO, 1996.
[25] Wikipedia, “Defense Advanced Research Project Agency Learning Applied to Ground
Robots (LAGR) Program,” 2009, http://en.wikipedia.org/wiki/DARPA_LAGR_Program
(accessed April 4, 2016).
[26] Ratcliff, A., “OSD Manufacturing Technology Overview,” NDIA Quarterly meeting,
Department of Defense Manufacturing Technology Program presentation, May 14, 2009.
[27] Swinson, K., “Test and Evaluation of Autonomous Ground Robots,” NDIA Ground
Robotics Capabilities Conference and Exhibition, Aberdeen, MD, March 23, 2012, http://
www.dtic.mil/ndia/2012grcce/Swinson.pdf (accessed April 4, 2016).


[28] Macias, F., “The Test and Evaluation of Unmanned and Autonomous Systems,” Interna-
tional Test and Evaluation Association Journal, Vol. 29, No. 4, 2008, pp. 388–395.
[29] Huang, H., Pavek, K., Albus, J., and Messina, E., “Autonomy Levels for Unmanned
Systems (ALFUS) Framework: An Update, 2005,” Proceedings of the SPIE Defense
and Security Symposium, Orlando, FL, March 28–April 1, 2005.
[30] Huang, H., “Performance Measures for Unmanned Systems,” presented at the SAE AS4D
Meeting, San Diego, CA, October 18, 2010.


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150059

Adam Norton 1 and Holly Yanco 2

Preliminary Development of
a Test Method for Obstacle
Detection and Avoidance
in Industrial Environments
Citation
Norton, A. and Yanco, H., “Preliminary Development of a Test Method for Obstacle Detection
and Avoidance in Industrial Environments,” Autonomous Industrial Vehicles: From the
Laboratory to the Factory Floor, ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM
International, West Conshohocken, PA, 2016, pp. 23–40, doi:10.1520/STP159420150059

ABSTRACT
There is currently no standard method for comparing autonomous capabilities
among systems. We propose a test method for evaluating an automated
mobile system’s ability to detect and avoid obstacles, specifically those in an
industrial environment. To this end, a taxonomy is being generated to
determine the relevant physical characteristics of obstacles so that they can be
accurately represented in the test method. Our preliminary development
includes the design of an apparatus, props, procedures, and metrics. We have
fabricated a series of obstacle test props that reflect a variety of physical
characteristics and have performed a series of tests with a small mobile robot
toward validation of the test method. Future work includes expanding the
taxonomy, designing more obstacle test props, collecting test data with more
automatically guided vehicles and robots, and formalizing our work as a
potential standard test method through the ASTM F45 Committee on
Driverless Automatic Guided Industrial Vehicles, specifically ASTM F45.03
Object Detection and Protection.

Manuscript received July 1, 2015; accepted for publication August 28, 2015.
1 New England Robotics Validation and Experimentation (NERVE) Center, University of Massachusetts Lowell,
1001 Pawtucket Blvd., Lowell, MA 01854
2 Department of Computer Science, University of Massachusetts Lowell, 1 University Ave., Lowell, MA 01854
3 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor on
May 26–30, 2015 in Seattle, Washington.

Copyright © 2016 by ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959.


Keywords
obstacle detection, obstacle avoidance, standard test method, mobile robot,
automatically guided vehicle (AGV)

Introduction
Automatically guided vehicles (AGVs) have become very common in industrial
manufacturing environments. The use of autonomous mobile robots in this
domain is also on the rise. A necessary capability of both systems is obstacle
detection and avoidance. Obstacles in this domain range between static objects
(e.g., tables, pallets, barrels) and moving agents (e.g., forklifts, people). The loca-
tion of static objects in some environments is fixed, while in others it changes
very frequently when a job requires a different work flow and layout. If a system
is capable of detecting and avoiding obstacles, it creates a safer work environment
and allows for faster integration into a facility because less a priori knowledge of
the environment is needed [1].
Currently, there is no standard for comparing this capability between systems.
The Committee on Driverless Automatic Guided Industrial Vehicles (ASTM F45
[2]) has been formed to achieve this goal, specifically ASTM F45.03, which is
focused on object detection and protection. We propose a test method design that
can aid in this effort by accurately simulating the relevant physical characteristics
of an industrial manufacturing environment. In particular, the test method will
replicate the physical qualities of common obstacles and objects that can affect a
system’s ability to detect them with its sensors and avoid collisions.

Related Work
There are a variety of efforts working toward standardized performance metrics and
test methods for robotic systems. The National Institute of Standards and Technology
(NIST) has been leading an effort for the development of standard test methods for
response robots [3] for well over a decade through the Committee on Homeland
Security Applications; Operational Equipment; Robots (ASTM E54.08.01 [4]). Those
test methods focus on different capabilities of mobility, manipulation, sensors, and
human-system interaction, most prominently for teleoperated robots.
For AGVs, there is a safety standard test method specified in American
National Standards Institute/Industrial Truck Safety Development Foundation
(ANSI/ITSDF) B56.5 [5] that verifies whether or not a system’s safety sensor(s)
are able to detect a potential obstacle in its path. Within that test method, two test
pieces that must be detected and avoided are placed at varying orientations and
distances from the system. The surface of the test pieces are either black, because
that can cause issues for optical sensors, or highly reflective, because that can
cause issues for ultrasonic sensors. That test is primarily focused on dynamic

agents (e.g., a person) that temporarily enter a vehicle’s path. A similar standard
for autonomous robots is ISO 13482 [6], which includes appropriate distances
between the system and an agent/object that enters its space and emergency stop
functions. It is aimed at personal care robots (not those in industrial environ-
ments) and is largely focused on the system’s safety when co-located with people.
Although it specifies standard performance, it does not specify a standard test
method for determining performance.
The complexity levels of environment and obstacles (CLEO) and prediction in
dynamic environments framework efforts outlined in Madhavan et al. [7] are aimed
at measuring the performance of autonomous systems. They note that the capabil-
ity of a system to work in unstructured, dynamic environments is a “critical enabler
for next-generation industrial mobile robots.” The CLEO framework provides a
method for characterizing an autonomous system’s ability to navigate through
increasingly complex environments and obstacles as both aspects become more
dynamic. The metrics used include geometric correctness, dynamic map update
methods, and amount of time to update for environment changes.
The development of standard test methods for AGVs is also prevalent at
NIST [8], focusing on collaborative workspaces among humans, unmanned
vehicles, and manned vehicles. That work focuses on the detection of objects and
agents that either enter the path or stop zone of a vehicle or that are beyond it.
The test pieces from ANSI/ITSDF B56.5 are used for obstacles as well as for an
alternative to ground truth measurement called the grid-video method. This
method involves placing a grid on the ground and computing ground truth loca-
tions from recorded video of a test.

Scope
Obstacle detection and avoidance is a common capability of any mobile, autono-
mous system. Given the existing audience and effort for ASTM F45, the develop-
ment is initially focused on automated mobile systems used in indoor industrial
environments, particularly for manufacturing applications. Mobile systems in this
domain include AGVs, which can take the form of traditionally human-operated
vehicles that are instead automated (e.g., Seegrid Vision Guided Vehicles [9]) and
autonomous robots (e.g., Adept MobileRobots [10]).
All of these systems are required to detect and avoid objects and agents that
either enter their path or that form the edges of their path. This includes any entity
in an industrial environment that, if a mobile system were to collide with it, could
cause damage to the system, its payload, or to the entity itself. For developmental
purposes, these are what we refer to as obstacles. Obstacles can sit on the ground
or protrude from a wall or ceiling in the environment.
Depending on a variety of factors, including the system’s capabilities and the
layout of the environment, the manner in which an obstacle is avoided varies.
Avoidance can mean stopping in place until the obstacle is no longer in the way or

driving around the obstacle to continue on a path. The metric of performance in
this test method will have to take into account what the system is able to do, given
the environment constraints and its own mobility capabilities. Also, depending on
the sensors used by the system to sense the obstacle, avoidance does not necessarily
mean noncontact. The amount of allowable contact in the test method should not
cause any damage to the system or to the obstacle that is being hit.
Every industrial environment serves a different purpose and uses different
brands of furniture, machinery, and so on, resulting in a vast number of possible
characteristics. However, all of these obstacles can be grouped into types. The
physical characteristics of an obstacle are what determine if an automated system
is able to detect it properly. To distill the physical characteristics of each type
into those that are relevant to obstacle avoidance, the creation of a taxonomy to
guide development is proposed. A simple Internet image search of manufacturing
environments was performed to determine the common types of obstacles (e.g.,
tables, shelving units, carts, pallets, barrels, ladders, etc.) to be included in the
taxonomy.
When testing an autonomous or automated capability, a dynamic test is more
appropriate than a static one to ensure that the system being tested is sensing and
reacting to obstacles in situ, eliminating any possibility of an operator gaming the test
to exhibit better performance. To achieve this, external characteristics (e.g., location,
orientation) and internal characteristics (e.g., overall and individual feature dimen-
sions) of the obstacles in the environment should vary during a test session. Given
the many possible real-world objects, variability of internal characteristics will allow
for a wide array of these objects to be properly simulated. A downselection process
can also be performed to determine the appropriate obstacle types to test depending
on the system’s dimensions, sensor types, sensor placement, and so on.
For some mobile systems, their implementation in an industrial environment
requires the areas where the system can drive and landmark locations to be defined.
We refer to the edges that define such spaces as boundaries. They can be defined in
a variety of ways. Some systems require physical augmentation of a space (e.g., the
laying of magnetic tape) in order for them to be implemented. Others make their
own map of the space, and the boundaries are virtually augmented in software. The
identification of locations also varies among systems; some read markers in the
environment to know when a location is reached and others have locations defined
in software. All of these features (if applicable to a system) must be possible within
the test method, allowing for holistic testing of a system.
An important aspect of the response robot test methods specified through
ASTM E54.08.01 is the availability of the materials used to fabricate the test appara-
tuses and props. All of those test methods are made with lumber, wood panels,
polyvinyl chloride (PVC) pipes, and other common building materials. This has
allowed them to be easily implemented, disseminated, and used by the first
responder and robotics community. This test method should follow the same
convention.

REQUIREMENTS
All of these factors form a set of requirements that define the scope of the test method
so that it meets the needs of the manufacturing robotics domain, allows both tradi-
tional AGVs and mobile robots to be tested, and can be fabricated by anyone:
• R1 : The relevant characteristics of common obstacles in an industrial environ-
ment must be physically represented.
• R2: Obstacle test props must have variable settings to allow for a variety of
real-world objects to be represented.
• R3: Obstacle test prop settings, both internal and external qualities, must be varied
during a test session to prevent gaming and to test the flexibility of the autonomy.
• R4: The semipermanent boundaries of the environment and locations within
it must be represented in the test apparatus such that they are appropriately
detectable by the system being tested.
• R5 : The test apparatus and props must be fabricated using readily available
building materials that are inexpensive.
TAXONOMY OF RELEVANT CHARACTERISTICS
In order to accurately simulate an appropriate level of detail in manufacturing
environments, a taxonomy of relevant characteristics is being developed. The tax-
onomy will guide the design of the obstacle test props and will provide a unified
language to describe their purpose. The characteristics that are to be included are
distilled through the following process:
• T1 : Identify types of real-world obstacles and features found in an industrial
environment.
• T2: Break down each obstacle into its physical components.
• T3 : Outline the constant and variable physical relationships for each obstacle
component.
• T4: Identify overlaps in physical characteristics among real-world obstacles
(this is performed to limit the number of unique obstacle test props that will
need to be developed).
• T5 : Design obstacle test props that capture the physical characteristics while
reducing overlap among other obstacle test props.
The use of the taxonomy development process ensures that the requirements that
pertain to the obstacle test props are met. Specifically, R1 is satisfied by T1 and T2, R2 is
satisfied by T3, and R3 and R5 will help guide T5. This process has been used to develop
an initial set of example test prop designs, which are detailed in “Test Method Design.”
A snapshot of T1–T3 can be seen in Table 1, using a table and shelving unit
as examples. Other obstacles included in the taxonomy are conveyor belts,
chairs, pallets, ladders, railings, bollards, columns, and barrels. Surface quality is not
listed as a variable characteristic because the flat black and reflective surface
qualities used on the test pieces described in ANSI/ITSDF B56.5 can be used for all
obstacle test props to represent edge cases for sensors. While performing T4 across
the obstacles, a set of higher level qualities emerged as relevant characteristics.

TABLE 1 A snapshot of T1–T3 of the obstacle taxonomy development process, using a table and
shelving unit as examples.

T1: Real-World Obstacles; T2: Physical Components of Each | T3: Constant and Variable Physical Relationships Among Obstacle Components

Table
Components (T2): Leg (vertical column extending from ground); tabletop (horizontal
plane above ground); feet (horizontal or vertical features extending from leg on
ground); bracer (horizontal plane extending between legs above or on ground)
Constants (T3): At least one vertical leg that extends between the ground and the
tabletop; a tabletop that sits above the leg(s) with empty space between it and the
ground
Variables (T3): Number of legs; horizontal distance between legs; horizontal distance
between legs and tabletop edge; vertical distance between ground and tabletop;
tabletop dimensions; feet type (e.g., perpendicular extensions, wheels); vertical
distance between horizontal bracers and ground (if any); type of horizontal bracers
(e.g., solid plane between ground and tabletop, bar)

Shelving unit
Components (T2): Shelf (horizontal plane above or on the ground); side support or
leg (vertical plane or column extending from ground or between shelves, or both);
back support (vertical plane extending from ground between shelves); feet
(horizontal or vertical features extending from support on ground)
Constants (T3): At least one shelf that sits above the ground with back support
Variables (T3): Number of shelves; vertical distance between shelves; width and
depth of shelves; horizontal distance between back support and shelf edge; side
support type (e.g., posts, solid planes that extend from front to back of shelves);
horizontal distance between side supports (if posts); back support type (e.g.,
environment wall, solid plane that spans shelf width); shelf, side, and back support
material density (e.g., solid, wire frame); feet type (e.g., perpendicular extensions,
wheels)


• Volume:
* Closed: All of the obstacle’s components are contained within its volume
or the volume cannot be entered by part of the system (e.g., a solid block),
or both.
* Open: The volume of the object can be entered by part of the system (e.g.,
a desk).
• Spatial characteristics:
* Ground: A component touches or is attached to the ground and can be
sensed when on a side of the system.
* Elevated: A component overhangs the ground without another component
directly underneath it and can be sensed when above or on a side of the
system (or both).
* Inset: A component is set into the volume a distance from another
component that sits above it (e.g., a table whose legs do not touch the
tabletop edges) and can be sensed when on a side or above the system
(or both).
• Surface density:
* Solid: The outer surface of the obstacle is solid.
* Porous: The outer surface has many holes (e.g., a wire mesh shelving unit).
* Empty: The side-facing outer edges of the obstacle are empty.
• Location:
* Static, fixed: The obstacle is fixed in place and cannot be moved.
* Static, could be moved: The obstacle is not explicitly fixed and can be
moved if hit with enough force.
* Dynamic, moving: The obstacle moves on its own (e.g., a human, a forklift).
* Dynamic, component-enabled movement: The obstacle is able to move when
hit due to a component that enables it to do so (e.g., wheels).
These characteristics can be used to design a set of obstacle test props that
capture the relevant physical properties of an industrial environment. An initial
set is discussed in the next section.
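To make the taxonomy axes concrete, the characteristics above can be encoded
directly in a small data structure. The Python sketch below is illustrative only; the
names mirror the prose but are not part of the proposed test method, and the
example instance corresponds to the table analog (obstacle I in Table 2).

from dataclasses import dataclass
from enum import Enum

class Volume(Enum):
    CLOSED = "closed"
    OPEN = "open"

class Spatial(Enum):
    GROUND = "ground"
    ELEVATED = "elevated"
    INSET = "inset"

class SurfaceDensity(Enum):
    SOLID = "solid"
    POROUS = "porous"
    EMPTY = "empty"

class Location(Enum):
    STATIC_FIXED = "static, fixed"
    STATIC_MOVABLE = "static, could be moved"
    DYNAMIC_MOVING = "dynamic, moving"
    DYNAMIC_COMPONENT = "dynamic, component-enabled movement"

@dataclass(frozen=True)
class ObstacleCharacteristics:
    volume: Volume
    spatial: tuple   # one or more Spatial values, e.g., ground legs plus an elevated top
    surfaces: tuple  # one or more SurfaceDensity values
    location: Location

# A table (obstacle I in Table 2): open volume, empty side-facing edges with a
# solid top, ground and elevated components, movable if hit with enough force.
table = ObstacleCharacteristics(
    Volume.OPEN,
    (Spatial.GROUND, Spatial.ELEVATED),
    (SurfaceDensity.EMPTY, SurfaceDensity.SOLID),
    Location.STATIC_MOVABLE,
)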

Test Method Design


NIST [3] states that standard test methods specify apparatuses and props (repro-
ducible representations of tasks), procedures (a script for the test administrator to
follow), and metrics (quantitative measures of performance). They do not specify
standard performance, only a standard way to measure it. For this test method, the
apparatus is a representation of the environment or of the boundaries that deter-
mine the area where the robot can plan its path. The test props are the variable
obstacles that can block the robot’s path.

TEST APPARATUS
The size of the system, the obstacles, and the environment determine if an obstacle
can be avoided by the system stopping or by navigating around it. Many AGVs

are not currently equipped with the functionality to navigate around an obstacle,
but some existing autonomous mobile robots can. The size of the apparatus will
also dictate where the obstacles can be placed within it and the direction of
approach by the system.
Preliminary development of this test method is not concerned with confined
space but rather if an obstacle can be detected and avoided successfully. Therefore,
the dimensions of the apparatus should be sized such that there is sufficient space
for the system to pass on at least one side of the obstacle, which will depend on the
dimensions, locomotion type, and turning radius of the system. An exact formula
for determining this has not yet been developed.
As specified in R4, the boundaries of an environment and the locations within
it can be interpreted differently by each system. Regardless of how they are sensed,
the apparatus should be built such that any people near it are protected from
potentially unsafe system behavior (e.g., exiting the boundaries). Thus, a barrier is
implemented outside of all physical or virtual augmentations. To meet R5, the
barrier is made using wood posts and sheets of wood, which are most commonly
available measuring 2.4 by 1.2 m (96 by 48 in). The wall panels can also be used to
define the system’s path if no additional augmentation is required.
The apparatus is built to define a space for testing wherein the system is
instructed to drive from location A to B and back continuously. A dead end after
both locations forces the system to turn around and traverse its path again,
approaching the obstacle from the opposite direction. If the system is unable to
change travel directions within the dead zone, then an additional path can be added
to allow for continuous, looped travel. The interior measurements can vary, most
easily in multiples of 2.4 m (96 in), or 1.2 m (48 in), for simple fabrication. Any
required physical or virtual augmentation for boundaries and location definition
can occur within this space. A diagram of the apparatus can be seen in Fig. 1 . A wall
in the middle of the apparatus is designed for mounting obstacles to obstruct the
path. To adjust the location of the obstacle test props within the space (per R3),
bars of 80/20 Holey Tubing are attached to the wall to allow for precise attachment.
The apparatus can be seen in Fig. 1 and Fig. 2.

FIG. 1 Diagram of the test apparatus environment. The optional loop zone can be
added for systems that cannot change travel directions in the allotted space (or at
all). [Figure: plan view of the path between travel locations A and B, showing the
obstacle mounting wall, possible obstacle locations, wall panel barriers, and the
boundaries of the robot path, implemented either physically or virtually; x and y
denote the width and length of the path such that the system can traverse and turn
around (if possible); z denotes the width of the path such that the system can
traverse around an obstacle obstructing the path 122 cm from the mounting wall.]

TEST PROPS
The test props represent a variety of obstacles whose characteristics are specified
through the taxonomy. Their locations and orientations within a space can vary,
each introducing a new challenge of detection for the system being tested. To meet
R3, the variable characteristics of each obstacle defined in T3 must be changed
during a test session. Rather than designing many obstacle test props that capture
all of the possible variations of a single obstacle type, malleable props with adjusta-
ble settings can be used. To change the settings of an obstacle test prop, minimal
tools should be required so as to not add excessive length to a test session.
To meet R5, we have opted to use a common set of building materials to fabri-
cate the obstacle test props. For flat horizontal or vertical solid planes, wooden
panels are used, 122 cm along at least one dimension. For vertical or horizontal
columns, aluminum square tubing (specifically, 80/20 Holey Tubing) is used, which
comes predrilled with holes that are separated by 3.8 cm, which allows for a very
precise granular scale for adjusting attachment dimensions. Each of these items can
be painted flat black, or metal sheets can be attached to match the surface qualities
of ANSI/ITSDF B56.5. Other features such as porous surfaces and elevated
obstacles are achieved by wire mesh panels and ropes with pulleys, respectively.
All obstacle test props are fabricated using hand-tightened hardware such as bolts
and wingnuts for easy assembly and adjustment. Additional pieces of aluminum
square tubing can be used as infrastructure on the horizontal plane to attach the
obstacle test props to the apparatus along the mounting wall. Holes are cut in the
horizontal plane to allow square tubing to pass through perpendicularly and to
serve as inset features. The common building materials can be seen in Fig. 3.

FIG. 2 The test apparatus environment set up at the UMass Lowell NERVE Center. Left:
Three-dimensional rendering. Right: Photo of the obstacle mounting wall.
A series of obstacle test props have been designed, each satisfying a different
combination of the higher-level qualities outlined in “Taxonomy of Relevant Char-
acteristics.” See Table 2 for images of each obstacle test prop and their corresponding
characteristics. Obstacles A and B can be used as qualifiers before advancing to
obstacles that use multiple surfaces of that type. The “infinite” height characteristic
means that the implementation of the obstacle is not considering where the top of
the obstacle is; generally, an AGV or mobile robot system does not detect obstacles
from the sky down but rather from the ground up. The default size for wide obsta-
cle features is 122 cm, given the availability of 122 cm by 244 cm wood panels.
Elevated obstacles can have their components raised to a height that allows part of
the system (or the entire system) to drive underneath it, potentially causing it to
collide with the horizontal plane, unless the obstacle has components on the ground
so that the system may detect the obstacle before a collision occurs.

PROCEDURE
Before conducting a test, the apparatus must be set to the appropriate dimen-
sions. Wall panels and boundaries must be set at dimensions that allow for the
system to drive comfortably with enough space for it to detect and navigate
around an obstacle (if applicable). The optional loop zone can be added if necessary.

FIG. 3 The common building materials used to fabricate the obstacle test props. Left,
top to bottom: Aluminum square tubing, black square tubing, horizontal plane
with holes, and hardware for mounting additional components. Right, top to
bottom: Thin solid panels, tall solid panels, mesh side panels.


TABLE 2 A set of example obstacle test props using the common set of building materials.

[Photographs of the ten obstacle test props, labeled (a) through (j), are not reproduced here.]

Volume | Surface density | Width | Height | Elevation | Spatial characteristics | Analog example
A. Flat | Solid | Thin | Infinite* | n/a | Ground | Wall
B. Flat | Porous | Thin | Infinite* | n/a | Ground | Mesh partition
C. Closed | Solid | Wide | Short | n/a | Ground | Pallet
D. Closed | Solid | Wide | Infinite* | n/a | Ground | Column
E. Closed | Solid | Wide | Infinite* | Variable | Elevated | Elevated load on forklift
F. Closed | Solid | Wide | Short | Variable | Elevated | Shelf
G. Open | Solid | Wide | Variable | Variable | Ground, Elevated | Desk
H. Open | Porous, Solid | Wide | Variable | Variable | Ground, Elevated | Mesh shelving unit
I. Open | Empty, Solid | Wide | Short | Variable | Ground, Elevated | Table
J. Open | Empty, Solid | Wide | Short | Variable | Ground, Elevated, Inset | Conference table

Note: All example images use a flat black surface quality on all outward facing planes (e.g., side
panels, underside of horizontal plane, etc.). The same designs are also possible using reflective material.
*All “infinite” heights are depicted at 61 cm because those are the specific settings used for validation
testing in “Validation Testing.”

Physical or virtual augmentation (or both) should occur to define the system’s path
as needed. The travel locations (A and B) must also be defined to the system, which
can occur virtually in software, accompanied by physical augmentations such as
quick response codes or reflectors, and so on.
A downselection of obstacle test prop types and a threshold for their
variable settings should also be performed to prevent exhaustive testing. Depend-
ing on the system’s dimensions and the sensors it has available, some obstacle
settings will have larger impacts than others. For instance, if only a forward
facing, two-dimensional lidar is used and is mounted low to the ground, then
obstacles that are elevated completely above the system will not be detectable and
therefore do not need to be tested. A proper downselection process is still in
development.
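Although the formal downselection process remains open, the geometric screen
described above can be sketched. The function below is an illustrative assumption
rather than part of the proposed method: it checks only whether a horizontally
mounted, single-plane lidar's scan height intersects an obstacle's vertical extent,
ignoring range, reflectivity, and porosity effects.

def plane_lidar_can_detect(scan_height_cm, ground_clearance_cm, top_cm):
    """True if a horizontal 2-D scan plane at scan_height_cm intersects an
    obstacle spanning [ground_clearance_cm, top_cm] vertically."""
    return ground_clearance_cm <= scan_height_cm <= top_cm

# With the scan plane at 30 cm (as on the Pioneer used later in this chapter):
plane_lidar_can_detect(30, 0, 61)    # True: 61-cm side panels on the ground
plane_lidar_can_detect(30, 61, 100)  # False: components elevated 61 cm or higher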
The system is instructed to traverse from location A to B, then B to A. If a
specific path can be commanded, then it should fall right through the center of
the space defined by the boundaries. One instance of this action performed by the
system is referred to as a lap. During each lap, the system will interact with the
obstacle twice, or once if the loop zone is used. After each lap, the obstacle’s
settings are adjusted as necessary, such as its location and orientation along the
mounting wall. This process should be repeated as many times as necessary to
achieve a statistically significant measure of successful detection and avoidance
of the obstacle type(s).
If the system does not properly avoid the obstacle, then that lap will be noted as
such. This would require a reset to the last travel location. If too many faults occur,
the settings of the test should be adjusted to an easier difficulty, which has not yet
been determined.
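For clarity, this procedure can be summarized as a session loop. The following
Python sketch assumes placeholder interfaces (obstacle.apply, system.traverse, and
system.reset_to are hypothetical names, not a standardized API):

def run_test_session(system, obstacle, settings_schedule):
    """Run one session: one lap (A to B and back) per obstacle setting,
    logging whether the obstacle was successfully avoided on each lap."""
    results = []
    for lap, settings in enumerate(settings_schedule, start=1):
        obstacle.apply(settings)             # e.g., shift along the mounting wall
        avoided = system.traverse("A", "B")  # first interaction with the obstacle
        avoided = system.traverse("B", "A") and avoided  # second interaction
        if not avoided:
            system.reset_to("A")             # reset to the last travel location
        results.append({"lap": lap, "settings": settings, "avoided": avoided})
    return results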

METRICS
The most important metric of performance is whether or not the obstacle
was avoided. If noncontact sensors are used by the system being tested, then
avoidance means that the system did not collide with any part of the obstacle.
If contact sensors are used (e.g., a bumper), then avoidance means that, upon
contact, the obstacle did not move enough to cause any damage or to create
a safety hazard. This can be determined by observing the system as it per-
forms within the test method. The test can also be video recorded for review
afterwards.
A more detailed metric of performance is the distance between the obstacle and
the system after it has avoided the obstacle. The distance depends on the speed the
system is traveling, at what distance it detects/reacts to the obstacle, and how
quickly it stops moving toward the obstacle. For this type of measurement, a
motion capture system can be used, although this would not satisfy R5. An inex-
pensive way to do this is to draw a grid on the ground and calculate the distance by
processing images from the recorded video of the test (as is done in Bostelman,
Norcross, Falco, and Marvel [8]).
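As a sketch of that grid-video calculation, assume a planar homography has already
been fitted to the painted grid's corner correspondences (OpenCV's cv2.findHomography
is one way to estimate it); the function names below are illustrative, not part of the
test method.

import math

def pixel_to_floor(px, py, H):
    """Map an image pixel to floor coordinates (cm) through a 3-by-3 planar
    homography H (nested-list form) fitted to the painted grid."""
    x = H[0][0] * px + H[0][1] * py + H[0][2]
    y = H[1][0] * px + H[1][1] * py + H[1][2]
    w = H[2][0] * px + H[2][1] * py + H[2][2]
    return x / w, y / w

def stop_distance_cm(system_px, obstacle_px, H):
    """Floor-plane distance between the stopped system and the obstacle,
    both marked as pixel coordinates in a recorded video frame."""
    sx, sy = pixel_to_floor(system_px[0], system_px[1], H)
    ox, oy = pixel_to_floor(obstacle_px[0], obstacle_px[1], H)
    return math.hypot(sx - ox, sy - oy)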

Validation Testing
To aid in validating the design of the test method, a series of tests was conducted
at the UMass Lowell NERVE Center. Ten tests were conducted using a mobile
robot programmed to traverse within the apparatus from A to B and back for five
laps. For each test, a different obstacle was used; those used can be seen in Table 2
(where their settings are detailed). In between each lap, the location of the obstacle
along the mounting wall was altered, varying between 0 cm from the left edge,
61 cm from the left edge, center, 61 cm from the right edge, and 0 cm from the right
edge. All tests were recorded using a multi-angle camera system, depicting the
obstacle from all observable sides (see Fig. 4).

ROBOT CONFIGURATION AND OBSTACLE SETTINGS


An Adept MobileRobots Pioneer 3 was used for this testing, which is a small
research platform with a differential drive and motor encoders, and it was aug-
mented with two Hokuyo URG-04LX-UG01 sensors (one on the front, one on the
back) for 360-degree local obstacle detection, as well as a Microsoft Kinect v1
(although it was not active). The Pioneer was programmed using Robot Operating
System (ROS) [11], specifically the navigation stack, which is a generic suite of tools
for differential or holonomic navigation, with parameters specifically configured for
the robot. Before conducting the tests, the robot required virtual augmentation of
the space—the robot was manually driven through the apparatus with no obstacles
present, building a map of the space using simultaneous localization and mapping
(see Fig. 5). The A and B locations were then programmed within this map. The
robot uses the boundaries of the apparatus to determine its path (and will attempt
to avoid any obstacles that obstruct its path) and to localize within it. With the
additional augmentations, the Pioneer measures 50 cm by 38 cm by 40 cm. The
Hokuyo URG is located approximately 30 cm above the ground, offering a two-
dimensional LIDAR view around the body of the robot. An image of the robot can
be seen in Fig. 6.

FIG. 4 A still frame from the multi-angle camera system used to record the test
sessions.

FIG. 5 Map of the apparatus generated by the robot using the robot visualization
package in ROS.
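For reference, commanding the A-to-B laps through the ROS navigation stack takes
only a few lines. The sketch below uses the standard move_base action interface; the
map coordinates for A and B are placeholders, since the actual values depend on the
map generated for the apparatus.

#!/usr/bin/env python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def make_goal(x, y):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # heading is arbitrary for this test
    return goal

rospy.init_node("lap_driver")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()
A, B = (0.0, 0.0), (6.0, 0.0)  # placeholder coordinates for travel locations A and B
for _ in range(5):             # five laps, as in the validation tests
    for x, y in (B, A):
        client.send_goal(make_goal(x, y))
        client.wait_for_result()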
One instance of each obstacle type listed in Table 2 was used with specific set-
tings tuned for the Pioneer. The robot’s two-dimensional sensors were located
30 cm above the ground, meaning all ground obstacles of “infinite” height did
not need to be taller than that height because they would be detected by the
robot regardless. For this reason, 61-cm-tall side panels were used for obstacles
A, B, D, and E. Obstacle C used 8-cm-tall side panels such that its thin ground
components were below the sensors. No overhead components of obstacles
could be detected; if they were elevated less than 40 cm high, the robot would
have collided with them (unless they had accompanying ground features that
were otherwise detectable). In order to reduce potential damage to the robot,
obstacles E–J were elevated 61 cm high. The robot could also possibly enter the
volume of obstacles G–J; the default 122-cm size was used for all wide obstacles,
as well as insets of 30 cm, to allow for this possibility. Hokuyo URG sensors have
been shown to have issues with dark surfaces [12], so all obstacle test props used
black surfaces.

FIG. 6 The Adept MobileRobots Pioneer 3 with additional sensors.

RESULTS AND DISCUSSION


Based on the robot’s dimensions and sensing capabilities, we hypothesized it would
successfully detect and avoid solid obstacles with ground features that were within
its detectable horizontal field of view (30 cm high, parallel to ground; obstacles A,
D, and G) and drive under all closed volume elevated or open volume obstacles (or
both; obstacles E, F, G, I, and J). The prediction of success for detecting and avoid-
ing obstacles with porous surface densities was questionable. In general, our
hypotheses were correct; all obstacles with tall solid side panels on the ground were
both detected and avoided, and all closed volume elevated obstacles were avoided
but not detected. All open-volume obstacles were detected but not all avoided. See
Table 3 for test results.
Porous surface densities were not reliably avoided (obstacles B and H); the
robot needed to be noticeably closer to detect the wire mesh properly. During the
second repetition using obstacle H, the mesh panel on the right side was detected
and avoided, but due to the open volume, the robot then traversed underneath the
horizontal plane. The mesh panel on the left side was detected while the robot was
inside of the obstacle, but its driving settings (e.g., turning radius, acceleration,
braking speed) prevented it from detecting the panel in time to avoid collision.
A second trial was attempted with obstacle H but resulted in no successful
repetitions.
Although the settings for open-volume obstacles with empty surface densities
(I and J) were adjusted such that the robot would traverse through them, this was
not the case. The robot detected the obstacles’ ground features but got stuck during
its decision-making process and did not reach the next location. This is due to the
parameters in the robot’s ROS navigation stack. A second trial for obstacle J was
performed after one of these parameters was reprogrammed, which resulted in
five successful repetitions. Given the change in performance, it follows that a
software change like this constitutes a new robot configuration, and any perform-
ance exhibited using this configuration is not comparable to those with different
configurations.

TABLE 3 Testing results of obstacle detection and avoidance test method with the Pioneer.

Obstacle (see Table 2) | Unique settings | Duration | Successful reps (out of 5) | Detection | Avoidance
A | 61-cm side panels | 13:30 | 5 | Yes | Yes
B | 61-cm side panels | 10:00 | 5 | Yes | Yes
C | 8-cm side panels | 0:30 | 0 | No | No
D | 61-cm side panels | 9:00 | 5 | Yes | Yes
E | 61-cm side panels, 61-cm elevation | 14:30 | 5 | No | Yes*
F | 61-cm elevation | 12:30 | 5 | No | Yes*
G | 61-cm elevation | 18:30 | 5 | Yes | Yes
H | 61-cm elevation | 2:30 | 1 | Yes | No
H (Trial 2) | Same as above | 1:30 | 0 | Yes | No
I | 61-cm elevation, 144 cm between columns | 1:30 | 0 | Yes | Yes, but did not reach other location
J | 61-cm elevation, 114 cm between columns, 30 cm between columns and horizontal plane edge | 1:30 | 0 | Yes | No
J (Trial 2) | Same as above | 8:20 | 5 | Yes | Yes

Note: When performing Trial 2 with Obstacle J, the robot’s navigation parameters were changed.
*These obstacles were not detected because their components were elevated above the robot’s sensor
field of view but, technically, they were avoided because no part of the robot collided with the
obstacles.

Conclusion and Future Work


We have developed a preliminary design for a test method to evaluate an autono-
mous vehicle’s ability to detect and avoid obstacles. The taxonomy of relevant physi-
cal characteristics will continue to develop to ensure that all obstacles in industrial
environments can be accurately represented within the test method. Currently, all of
the common building materials are made up of right angles with no curves. The
use of circular vertical columns for legs (possibly PVC pipes) would also better match
the test pieces used in ANSI/ITSDF B56.5. Common building materials can also be
used to fabricate standard test pieces for other test methods for vehicles in industrial
environments within ASTM F45. The test method will later be expanded to capture
environmental settings such as lighting, ground types, and temperature.
A further specified process for downselecting appropriate obstacle types and
settings to be used during a test session is needed. For our testing with the Pioneer,
settings were chosen based on the robot’s physical configuration with respect to its
dimensions and sensors. To formalize this, a proper definition of a system’s config-
uration (both hardware and software) is also needed. The configuration of the sys-
tem will then inform this process. Some elements of a configuration may be
variable, such as whether or not the system is carrying a load behind it, on top of
it, or in front of it. For instance, if the Pioneer was carrying a box on top of it and
traversed through the open-volume obstacles, then the payload may have collided
with the horizontal plane, unperceived by the robot.
This work will be leveraged while developing standards and test methods
through the ASTM F45.03 Object Detection and Protection subcommittee. In order
to do so, more validation testing needs to be performed using real-world AGVs and
mobile robots to ensure that the test method accurately captures their capabilities
of detecting and avoiding obstacles. The manner in which test results are presented
must also be refined such that they are usable by industry.

ACKNOWLEDGMENTS
This research has been supported in part by NIST under 70NANB14H235. The
authors would like to thank Jordan Allspaw, Brian Carlson, James Dalphond, and
Alexandra Derderian of the UMass Lowell NERVE Center for their assistance in
fabricating the test method and in validation testing.

Referen ces

[1] Martínez-Barberá, H. and Herrero-Pérez, D., “Autonomous Navigation of an Automated
Guided Vehicle in Industrial Environments,” Robotics and Computer-Integrated
Manufacturing, Vol. 26, No. 4, 2010, pp. 296–311.
[2] ASTM Committee F45 on Driverless Automatic Guided Industrial Vehicles, ASTM
International, West Conshohocken, PA, 2015, www.astm.org
[3] National Institute of Standards and Technology, “Standard Test Methods for
Response Robots,” National Institute of Standards and Technology, Gaithersburg, MD,
2014, http://www.nist.gov/el/isd/ms/robottestmethods.cfm (accessed July 2015).
[4] ASTM, Subcommittee E54 on Operational Equipment, ASTM International, West Con-
shohocken, PA, 2015, www.astm.org
[5] ANSI/ITSDF B56.5-2012, Safety Standard for Driverless, Automatic Guided Industrial
Vehicles and Automated Functions of Manned Industrial Vehicles, Industrial Truck Stand-
ards Development Foundation, Washington, DC, 2012, www.itsdf.org
[6] ISO 13482:2014, Robots and Robotic Devices—Safety Requirements for Personal Care Robots,
International Organization for Standardization, Geneva, Switzerland, 2014, www.iso.org
[7] Madhavan, R., Bostelman, R., Kootbally, Z., Lakaemper, R., Gupta, S. K., and Balakirsky, S.,
“Smart, Flexible, and Safe Industrial Mobile Robots: Performance Evaluation & Standardi-
zation Efforts,” Proceedings of the International Test and Evaluation Association Technical
Review Conference, Annapolis, MD, July 19–21, 2011.

[8] Bostelman, R., Norcross, R., Falco, J., and Marvel, J., “Development of Standard Test
Methods for Unmanned and Manned Industrial Vehicles Used Near Humans,” Proceed-
ings of the SPIE Defense, Security, and Sensing Conference, International Society for
Optics and Photonics, Baltimore, MD, April 29–May 3, 2013.
[9] Seegrid, “Vision Guided Vehicles,” Seegrid Corp., Pittsburgh, PA, 2014, http://www.
seegrid.com/products.php (accessed July 2015).
[10] Adept Technology, “Adept Lynx,” Adept Technology, Inc., San Ramon, CA, 2015,
www.adept.com
[11] Quigley, M., Conley, K., Gerkey, B. P., Faust, J., Foote, T., Leibs, J., Wheeler, R., and Ng,
A. Y., “ROS: An Open-Source Robot Operating System,” ICRA Workshop on Open Source
Software, Vol. 3, No. 3.2, 2009, p. 5.
[12] Kneip, L., Tâche, F., Caprari, G., and Siegwart, R., “Characterization of the Compact
Hokuyo URG-04LX 2D Laser Range Scanner,” Proceedings of the IEEE International
Conference on Robotics and Automation , Kobe, Japan, May 12–17, 2009, pp. 1447–1454.


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150051

Klas Hedenberg 1 and Björn Åstrand 2

3D Sensors on Driverless Trucks for Detection of Overhanging Objects in the Pathway
Citation
Hedenberg, K. and Åstrand, B., “3D Sensors on Driverless Trucks for Detection of Overhanging
Objects in the Pathway,” Autonomous Industrial Vehicles: From the Laboratory to the Factory
Floor, ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM International, West
Conshohocken, PA, 2016, pp. 41–56, doi:10.1520/STP159420150051

ABSTRACT
Human-operated and driverless trucks often collaborate in a mixed work space in
industries and warehouses. This is more efficient and flexible than using only one
kind of truck. However, because driverless trucks need to give way to driven
trucks, a reliable detection system is required. Several challenges exist in the
development of such a system. The first is to select interesting situations and
objects. Overhanging objects are often found in industrial environments (e.g., tines
on a forklift). Second is choosing a system that has the ability to detect those
situations. (The traditional laser scanner situated two decimetres above the floor
does not detect overhanging objects.) Third is to ensure that the perception
system is reliable. A solution used on trucks today is to mount a two-dimensional
laser scanner on top and tilt the scanner toward the floor. However, objects at the
top of the truck will be detected too late, and a collision cannot always be
avoided. Our aim is to replace the upper two-dimensional laser scanner with a
three-dimensional camera, structural light, or time-of-flight (TOF) camera. It is
important to maximize the field of view in the desired detection volume. Hence,
the sensor placement is important. We conducted laboratory experiments to
check and compare the various sensors’ capabilities for different colors, using
tines and a model of a tine in a controlled industrial environment. We also
conducted field experiments in a warehouse. Our conclusion is that both the

Manuscript received June 15, 2015; accepted for publication November 3, 2015.
1 University of Skövde, School of Engineering Science, Portalen, Kaplansgatan 11, SE-541 34 Skövde, Sweden
2 Halmstad University, School of Information Technology, Kristian IV:s väg 3, SE-30118 Halmstad, Sweden
3 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, May 26–30, 2015, Seattle, Washington.


tested structural light and TOF sensors have problems detecting black items that
are non-perpendicular to the sensor. It is important to optimize the light
economy—meaning the illumination power, field of view, and exposure time—in
order to detect as many different objects as possible.

Keywords
mobile robots, safety, obstacle detection

Introduction
The need for an obstacle detection system for driverless forklift trucks is obvious.
However, such a system also may be used on driven trucks with automated func-
tions, such as the ability to stop if an obstacle appears. Hence, developing systems
that can be implemented for driven and driverless forklift trucks will not only create
a safer environment but will also decrease the cost of the obstacle detection system.
The world market for driven forklift trucks is orders of magnitude higher than for
driverless forklift trucks. American companies sold 927 automated guided vehicles
(AGVs) in 2011 [1]. During that period, more than 200,000 forklift trucks were
sold in the United States [2]. This includes electric rider trucks, electric warehouse
rider trucks, electric warehouse pedestrian trucks, and internal combustion trucks.
For the world market, World Industrial Truck Statistics reported order bookings
slightly below one million driven trucks for 2011 [2].
Two different safety standards exist for driverless trucks, one for Europe
(EN1525) and one for the United States (ANSI/ITSDF B56.5-2012), and each has
developed differently. In terms of obstacles, both consider contact with humans and
have two test items that represent parts of a human—a lower leg and a body. How-
ever, an object representing a piece of machinery is added to the U.S. standard,
and the standard also considers different materials for different sensors as well as
more test cases [3,4]. A continuous development of standards for driverless trucks
is important to make use of state-of-the-art sensor technology.
Overall, the motivation for this work is to improve the safety of automated
material handling by proposing better sensor solutions for obstacle detection. The
challenge in developing an obstacle detection system in industrial settings is three-
fold. The first is to select situations to detect that are of special interest. The second
is choosing a perception system that has the ability to detect those situations. The
third is to ensure that the perception system is reliable.
This chapter is organized as follows. First, we discuss related work. This is
followed by a problem definition and then descriptions of the experiments and the
results. Finally, we offer a discussion and conclusions drawn from the results.
Related Work
The National Institute of Standards and Technology has presented work covering
standards for driverless forklift trucks in industrial environments. Bostelman,

Hong, and Madhavan [5] use a time-of-flight (TOF) camera to detect objects
described in the U.S. safety standard [4]; they also test the camera outdoors
and conclude that it shows good results in shady environments. Bostelman
[6] conducts tests with a sonar, a two-dimensional lidar, and a three-dimensional
TOF camera. The sensors are tested against objects in the standards as well as
an additional item, 500 mm by 100 mm, posed at 0° and 45° to the robot's direction
of travel to expose differences in TOF sensor response. The tests also included various
materials (e.g., cardboard, gray plastic, cotton denim, black reflectance paper,
aluminum, and clear glass). Sonar detected all objects in various materials but
had problems with different angles. The two-dimensional lidar had problems with
flat glass at a 45° angle but detected other objects. The three-dimensional TOF
camera showed a notable difference between highly reflective and low-reflective
materials. Bostelman [6] proposed changes to ANSI/ITSDF B56.5 that were later
adopted into the standard [4]. Bostelman, Norcross, Falco, and Marvel proposed
test methods for driverless trucks and gave examples of potential human and
equipment effects on the path of a driverless truck in human/AGV collaborative
work spaces [7].
Hedenberg and Åstrand use a test apparatus (Fig. 1) that includes test items from
the safety standard as well as new items representing objects in an industrial
environment to evaluate three-dimensional sensors [8].

Problem Definition
SENSORS FOR OBSTACLE DETECTION
A driverless truck equipped with a two-dimensional range scanner (e.g., laser scan-
ner) situated less than two decimetres above the floor to detect objects described in
the safety standards (ANSI/ITSDF B56.5-2012, EN1525) does not detect all
obstacles in the desired detection volume—the yellow area in Fig. 2. A solution
used on trucks today is to mount a two-dimensional laser scanner on the top of the
truck and tilt the scanner toward the floor. However, objects on the top of the truck
will be detected too late, and a collision cannot always be avoided. Another solution
proposed by Bostelman, Shackleford, Cheok, and Saidi is to use laser scanners on
each side of the truck to detect all items that enter the contour area of the truck [9].
However, this will dramatically increase sensor costs.
Our aim is to replace the upper two-dimensional laser scanner with a three-
dimensional camera in order to increase the detection volume and thus detect
obstacles earlier compared to the systems used today ( Fig. 2).
For all vision systems, the placement of the cameras is essential for obtaining
a good result. This has to be considered for every new setup [10]. There are
several ways to determine the placement of the camera system on the robot. The
easiest way is simply to choose a placement by intuition. Putting a little more
effort into this judgment will probably increase the precision of the system. It
will also make a discussion of camera placement more unbiased if the system's

FIG. 1 The test apparatus. In one scene, the test rig represents a collection of objects:
Item A—a prone human, Item B—a standing human, Item C—a flat target used by
Bostelman and Shackleford [6], Item D—a ladder, Item E—tines on a forklift, Item
F—hanging cable, Items G and H—vertical bars, Item I—horizontal bar, and Item
J—thin cable. A ladder, Item D, typically has a steeper slope than 45°. However,
objects that have a larger angle may be considered vertical, while objects with a
lower angle can be considered horizontal. The test apparatus measures 1.8 m by
1.8 m and the bars have a thickness of 25 mm. The hanging cable has a diameter
of 13 mm. All items are painted in matt black.


performance has to be increased later. Huang and Krotkov concluded that the
best placement for cameras on mobile robots is at the highest point possible [10].
This is true if the desired detection volume is small in comparison to the available
field of view (FOV). In other cases, maximizing the FOV in the desired detection
volume is necessary.
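As a rough illustration of this trade-off, the floor interval covered by a tilted camera follows directly from its mounting height, downward tilt, and vertical FOV. The sketch below (Python) uses hypothetical mounting values, not the configuration of any sensor tested here:

    import math

    def floor_coverage(height_m, tilt_deg, vfov_deg):
        """Distances (m) from the mast at which the vertical FOV meets the floor.

        tilt_deg is the downward tilt of the optical axis from horizontal.
        Returns (near, far); far is infinite if the upper ray never reaches the floor.
        """
        upper = math.radians(tilt_deg - vfov_deg / 2.0)  # ray above the optical axis
        lower = math.radians(tilt_deg + vfov_deg / 2.0)  # ray below the optical axis
        near = height_m / math.tan(lower)
        far = height_m / math.tan(upper) if upper > 0 else float("inf")
        return near, far

    # Hypothetical example: camera 2.0 m up, tilted 30 degrees down, 45 degree VFOV:
    print(floor_coverage(2.0, 30.0, 45.0))  # -> approx. (1.5 m, 15.2 m)

Raising the camera extends the far coverage but pushes the near edge of the footprint away from the truck; evaluating such numbers for each candidate mount is the extra effort suggested above.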

OBJECTS IN THE ENVIRONMENT


Obstacles used in academic studies about mobile robots in indoor environments
are often box- or cylindrical-shaped (e.g., cans) or are humans. However, a closer
look at industrial environments identifies other obstacles. Detecting these obstacles can be as
important to human safety as direct detection of humans. Four examples of typical
situations found in an industry environment are shown in Fig. 3 and include a pro-
truding bar, hanging tools, blocking tape, and a ladder. None of these shows a
typical box- or cylindrical-shaped form. A common denominator for these objects
is their relatively thin structure and the fact that they overhang.
Forklift tines are other common objects in industrial environments and need to
be considered because both human-operated and driverless trucks collaborate in

FIG. 2 Configuration of the sensor system on a driverless truck. A two-dimensional
laser scanner mounted to detect obstacles below 2 dm above the floor is used
to meet the safety standards. Overhanging objects can be detected by a tilted
two-dimensional laser scanner mounted on the top of the truck. This solution
will not detect all obstacles in the desired detection volume (yellow area). A
three-dimensional camera covers a larger volume.

the same work space, Fig. 4. On a system level, it is straightforward to keep track of
the position and the actions taken by all the driverless trucks, but a similar system for
human-operated trucks is more problematic. A driverless truck must give way to a
human-operated truck, and the lower two-dimensional laser scanner on a driverless
truck can indicate the presence of a human-operated truck. One problem arises
if the human-operated truck delivers goods in front of a driverless truck (Fig. 4).
The laser scanner will not detect the tines, presenting the risk of the driverless truck
running into the human-operated truck.
The problem with the coverage of the desired detection volume and FOV of a
structural light sensor is demonstrated in Fig. 4. The FOV barely covers the floor
close to the truck or the top of the truck's contour area.
Other similar scenarios are turns around corners ( Fig. 5) as described earlier [7].
Mirrors are often used in human work spaces to see around corners. For driverless
trucks, this is a more complex scenario and still is an open issue.
To illustrate the complexity in an industrial environment, Fig. 4b shows how
spilled water may change the environment for sensor systems. The wet spot on the
otherwise matt floor may cause objects to be reflected, which reduces the optical
signal returning to the sensor. This could cause false readings.
In the existing safety standards for driverless trucks, objects commonly
handled are those that represent humans [3,4]. Since 2012, the U.S. standard also
has a 500 mm by 500 mm plate that needs to be detected at 0° and 45° with differ-
ent colors and reflectance depending on the sensor used.

FIG. 3 Objects relevant for obstacle detection in an industrial environment. A common
denominator for these objects is that they are thin and mostly overhanging.
Objects on images (a)—(c) are also overhanging. Images (a)—(c) are published
with permission from Volvo Group Trucks Operations; (d) is published with
permission from Volvo Car Corporation.

Experiments and Results


TEST APPROACH
Our aim is to replace the upper two-dimensional laser scanner with a three-
dimensional camera and to make early calculations on how to maximize the FOV
in the desired detection volume. For many reasons, a high position seems advanta-
geous. However, early tests with different sensor positions also showed that reflec-
tance on untextured regions may have a high impact on the sensor readings.
This motivated initial tests to identify the importance of the color of
the test items and its effect on sensor placement.
We also conducted experiments with different sensors in a controlled industrial
environment and performed field experiments in a warehouse using a combination
of driverless and human-operated trucks.
TIME-OF-FLIGHT AND STRUCTURAL LIGHT SENSORS ON UNTEXTURED
REGIONS WITH DIFFERENT COLORS
An experiment was conducted to investigate whether there are any performance differ-
ences in TOF and structured light cameras on untextured regions with different colors.

FIG. 4 View from a driverless truck. Human-operated and driverless trucks cooperate
in many industrial environments. A human-operated truck delivers goods in
front of a driverless truck. The two-dimensional laser scanner on the driverless
truck indicates the presence of the human-operated truck and stops. Note the
wet spot on the otherwise matt floor behind the truck, which is more visible in
(b). This makes a reflective surface that might cause problems for some sensors.

FIG. 5 An AGV with a bumper turns around a pillar. Note the window to the left in the
left image. Sunlight may have an impact on three-dimensional sensors.
Published with permission from Volvo Car Corporation.


TABLE 1 Results for the structured light camera. The camera has problems with black colors at
almost all angles. The results are rounded off to the closest 10 % value. The results are not
symmetrical due to different distances to the board at different angles.

Structured light camera

Angle    Gray     White    Red      Black
−45°     100 %    100 %    100 %    0 %
−15°     100 %    100 %    100 %    100 %
0°       100 %    100 %    100 %    100 %
+15°     100 %    100 %    100 %    0 %
+45°     100 %    100 %    100 %    0 %

A TOF camera (Fotonic E70) and a structured light camera (Microsoft Kinect) were
used and placed 2.5 m from a piece of board painted in four different colors:
gray, white, red, and black. A white/neutral background was placed 1 m behind the
board. The board was arranged at five different angles: −45°, −15°, 0°, +15°, and
+45°. Obstacles at an angle of −45° are used in the U.S. safety standard [4]. Three
consecutive images were taken by each sensor. The analysis was made by counting the
number of pixels within a given depth value from the sensor. Then the ratio between
the number of detected pixels and the number of maximum possible pixels covered by
the sensor was computed. The average of the results from the three images is given in
Table 1 and Table 2, and a scene is shown in Fig. 6. The results were rounded off to the
closest 10 % value.
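The counting procedure just described can be made concrete with a short sketch. Here we assume depth images are available as NumPy arrays and that a boolean mask of the board's expected image region has been precomputed; the 0.1 m tolerance is illustrative, not the value used in our analysis:

    import numpy as np

    def detection_ratio(depth_image, target_depth_m, board_mask, tol_m=0.1):
        """Fraction of the board's pixels whose depth lies near the known distance.

        depth_image: 2D array of per-pixel ranges in metres (NaN = no return).
        board_mask: boolean array marking the pixels the board should occupy.
        """
        on_board = depth_image[board_mask]
        hits = np.sum(np.abs(on_board - target_depth_m) < tol_m)
        return hits / board_mask.sum()

    # Average over the three images per pose and round to the nearest 10 %:
    # ratios = [detection_ratio(img, 2.5, mask) for img in images]
    # score = round(10 * float(np.mean(ratios))) * 10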
It is clear that TOF and structured light cameras perform well for untextured
regions. However, this depends heavily on the reflectance. Both sensors performed
poorly on black surfaces in this test due to the fact that the black color had poor
reflectance. Usually, black is a good light absorber, but reflectance is important, and
one issue is to define the reflectance for different wavelengths. The structured light
sensor performed better on the perpendicular flat object in black. The board was

TABLE 2 Results for the time-of-flight camera. The camera has problems with black colors at
almost all angles and also with the gray plate at −45°. The results are rounded off to the
closest 10 % value.

TOF camera

Angle    Gray     White    Red      Black
−45°     60 %     100 %    100 %    0 %
−15°     100 %    100 %    100 %    0 %
0°       100 %    100 %    100 %    60 %
+15°     100 %    100 %    100 %    0 %
+45°     100 %    100 %    100 %    0 %


FIG. 6 The plate at +15° rotated about the vertical axis. The time-of-flight (b) and
structured light (c) camera do not detect the black plate. All other colors are
detected by both sensors.

FIG. 7 A time-of-flight and structural light camera were mounted on a driverless truck.
A scenario is to turn around a corner and detect tines on a forklift.


rather diffuse, meaning that the light was scattered in all directions. If the board
had been more specular, we would probably have seen an even larger effect by
tilting it. The results in Table 1 are not symmetric due to the different distances to
the board at different angles.

TIME-OF-FLIGHT AND STRUCTURAL LIGHT SENSORS IN A CONTROLLED
INDUSTRIAL ENVIRONMENT
To investigate the performance of TOF and structural light sensors in detecting
tines, experiments were conducted in an industrial environment. A TOF camera
(Fotonic E70) and a structured light camera (Microsoft Kinect) were mounted on a
driverless truck and programmed to turn around a corner ( Fig. 7). A laser scanner
(lidar) served as a reference. Two different objects were used: the tines of a forklift

FIG. 8 The results from a scenario where tines from a forklift are used as an
obstacle around the corner. In the upper left, an RGB image from the Kinect
is shown. The lower left shows the two-dimensional plot from the laser
scanner in which the red line illustrates where the beam from the upper laser
scanner hits the floor. The upper right image shows the depth image from
the structured light (Kinect) camera. The lower right image shows the depth
image from the TOF camera (Fotonic). The structured light camera (Kinect)
as well as the TOF camera (Fotonic) detect the tines. Note the problems for
the TOF camera with the reflectors mounted on the wall and used for laser
navigation.


and a model of a tine. The approach in the safety standards is to make a model of
an existing object and conduct tests. If the model is detected, the perception system
is approved for that specific situation [3,4] . The model was painted black where
the tine of the forklift had been worn off naturally. We make no claim that either
the model or the real tine represent the average tines used in the industry. Our
investigation showed that it is very hard to find a common denominator for making
one general model of a tine.
The results are shown in Fig. 8 and Fig. 9 . The lower left image shows the
two-dimensional plot from the laser scanner in which the red line illustrates
where the beam from the upper laser scanner hits the floor. The upper right
image shows the depth image from the structured light (Kinect) camera. The
lower right image shows the depth image from the TOF camera (Fotonic). Both
the structured light camera (Kinect) and the TOF camera (Fotonic) detected the

FIG. 9 The results from a scenario where a model represents a tine from a forklift and is
used as an obstacle around a corner. On the upper left, an RGB image from the
Kinect is shown. The lower left image shows the two-dimensional plot from the
laser scanner in which the red line illustrates where the beam from the upper
laser scanner hits the floor. The upper right image shows the depth image from
the structured light (Kinect) camera. The lower right image shows the depth
image from the TOF camera (Fotonic). The structured light camera (Kinect)
detects the model of the tine, while the TOF camera (Fotonic) has difficulty
detecting the model.


tines. It is also notable that the TOF camera had problems with
the reflectors used for navigation (see Fig. 8 ). The test items in the U.S. safety
standard were also used. Both sensors had difficulty detecting the black flat test
item but did detect the cylindrical black test pieces.
It is clear that the structured light sensor detects both objects, whereas the
TOF sensor has problems with the model of the tine. According to the test men-
tioned earlier, this is due to the black color of the model; but another reason
can be a lower resolution (160 by 120 versus 320 by 240) combined with a larger FOV (70°
by 53° versus 58.5° by 46.6°) for the TOF sensor compared to the structural light
sensor. There are not enough pixels to detect objects of this size.
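The pixel-count argument can be checked with a back-of-the-envelope calculation. Assuming a small-angle approximation, and taking an object width of 0.1 m at a 3 m range purely for illustration:

    import math

    def pixels_on_target(width_m, range_m, hfov_deg, image_width_px):
        """Approximate number of horizontal pixels an object subtends."""
        footprint_m = 2.0 * range_m * math.tan(math.radians(hfov_deg) / 2.0)
        return width_m / footprint_m * image_width_px

    print(pixels_on_target(0.1, 3.0, 70.0, 160))   # TOF camera: approx. 3.8 px
    print(pixels_on_target(0.1, 3.0, 58.5, 320))   # structured light: approx. 9.5 px

With roughly two and a half times fewer pixels on the same object, the TOF configuration leaves little margin for detecting thin, low-reflectance structures.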

FIELD TEST IN A WAREHOUSE


We made a field test in a warehouse where driverless and human-operated trucks
collaborate in the same work space. A structural light camera (Kinect) was mounted
on a driverless truck close to the upper laser scanner. The truck was programmed

FIG. 10 Field test from a warehouse where a structured light camera is used.
A wooden broomstick is placed in front of the truck. The thin stick is
detected by the sensor. The dark blue color in the depth image indicates
undefined distances. The lower left image shows the two-dimensional plot
from the laser scanner in which the red line illustrates where the beam from
the upper laser scanner hits the floor. Detections of the truck are shown
at (0, 0).


to make several laps throughout the warehouse. Humans, driverless trucks, and
human-operated trucks occurred as obstacles, as did obstacles placed manually
to test the system's limitations; see Figs. 10–12.
The sensor detected thin structures, such as the broomstick in Fig. 10 and
the metal cage in Fig. 12 , but had problems with the black parts in the black-yellow
pattern of the safety railing in Fig. 11 .

Discussion and Conclusion


We conducted tests to determine if active three-dimensional sensors can be used
for obstacle detection in industrial environments and for replacing an upper two-
dimensional laser scanner on a driverless truck.
The initial tests identified the influence different colors have on flat untextured
regions for active three-dimensional range sensors. The results showed that a black
color with low reflectance on test items used in safety standards and in our earlier

FIG. 11 Field test from a warehouse where a structured light camera is used. The bar
in black and yellow (on the left) is only partly detected by the sensor. The
black part is not detected. The dark blue color in the depth image indicates
undefined distances. The lower left image shows the two-dimensional
plot from the laser scanner in which the red line illustrates where the
beam from the upper laser scanner hits the floor. Detections of the truck
are shown at (0, 0).


proposed test apparatus is relevant. Black plates at nonperpendicular angles only
return a low signal to the sensors, and these objects were difficult for the tested
TOF and structural light sensors to detect. This has an impact on the camera
placement.
The maximum detection range is shorter for low-reflective objects, implying
that the geometry and camera should be optimized for the darkest object that
should be detectable and at the minimum critical distance. The dynamic range
of the sensors is also very important in order to handle different colors at varying
distances. Multi-shutter modes to increase the dynamic range would also be inter-
esting to evaluate. The difficulty for structural light cameras to detect black items
was verified in the field experiments. The TOF camera has problems with the
reflectors for navigation as well as with the black areas on the test items. This may
make the TOF camera more problematic on driverless trucks where laser naviga-
tion with reflectors is used.
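A first-order radiometric model makes this explicit. For an active sensor illuminating a diffuse (Lambertian) target, the returned optical power scales with target reflectivity and falls with the square of range, so the maximum detection range scales with the square root of reflectivity (this is a generic model, not a calibration of the sensors tested here):

    P_r \propto \frac{\rho}{R^2}
    \quad\Longrightarrow\quad
    R_{\max}(\rho) \approx R_{\max}(\rho_{\mathrm{ref}}) \sqrt{\rho / \rho_{\mathrm{ref}}}

Under this model, a black surface with 5 % reflectivity would be detectable at only about a quarter of the range achieved on a 90 % reflective reference (sqrt(5/90) ≈ 0.24), which is why the darkest relevant object should size the illumination power and exposure time.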
We also compared different results from TOF and structural light cameras in a
controlled industrial environment. Tines on a forklift and a model of a forklift

FIG. 12 Field test from a warehouse where a structured light camera is used. A metal
cage is placed in front of the truck. The thin structures are detected by the
sensor. The dark blue color in the depth image indicates undefined distances.
The lower left image shows the two-dimensional plot from the laser scanner in
which the red line illustrates where the beam from the upper laser scanner hits
the floor. Detections of the truck are shown at (0, 0).


tine were used as obstacles. Both sensors detected the tine. However, the TOF cam-
era had problems detecting the black tine model, while the structural light camera
performed better. The results show that a black color with low reflectance is hard
for TOF and structural light cameras to detect. They also show how difficult it is
to make a representative model of tines for a forklift. To our knowledge, making a
tine model (or a representative test of a tine model) to be used in a safety standard
is still an open issue.
To determine a good placement for a three-dimensional sensor, a large FOV
is necessary to cover the desired detection volume. The placement of the sensor
is more important due to the poor sensor performance in detecting black plates at
nonperpendicular angles. Active three-dimensional sensors currently on the market
have a limited FOV. Sensors with a larger FOV are more suitable for this task,
provided they do not lose the resolution needed to detect thin structures. This may require infrared light
with more power. It may also require different structural light patterns more suitable
for detecting obstacles in an industrial environment.
ACKNOWLEDGMENTS
As a part of the Automatic Inventory and Mapping of Stock project, this work is sup-
ported by the Swedish Knowledge Foundation and by industry partners Kollmorgen,
Optronic, and Toyota Material Handling Europe.

Referen ces

[1] Material Handling Institute, “AGVS Quarterly Report,” MHI, Charlotte, NC, Summer 2012.
[2] European Federation of Materials Handling, “World Industrial Truck Statistics,” Informa-
tion Sheet, July 2012, Frankfurt/Main, http://www.fem-eur.com/data/File/N460-WITS_
fact_sheet_2012_FEM_corr2.pdf (accessed April 4, 2016).
[3] European Committee for Standardization (CEN), “Safety of Industrial Trucks—Driverless
Trucks and Their Systems,” CEN, Brussels, Belgium, 1997.
[4] ANSI/ITSDF B56.5, “Safety Standard for Driverless, Automatic Guided Industrial
Vehicles and Automated Functions of Manned Industrial Vehicles,” Industrial Truck
Standards Development Foundation, Washington, DC, 2012.
[5] Bostelman, R., Hong, T., and Madhavan, R., “Towards AGV Safety and Navigation
Advancement Obstacle Detection Using a TOF Range Camera,” Proceedings of the
12th International Conference on Advanced Robotics, Seattle, WA, July 18–20, 2005,
Institute of Electrical and Electronics Engineers (IEEE), New York, 2005, pp. 460–467.
[6] Bostelman, W. and Shackleford, R., “Time of Flight Sensors Experiments Towards Vehicle
Safety Standard Advancements,” Draft submitted to the Computer Vision and Image
Understanding special issue on Time of Flight Sensors, 2010.
[7] Bostelman, R., Norcross, R., Falco, J., and Marvel, J., “Development of Standard
Test Methods for Unmanned and Manned Industrial Vehicles Used Near Humans,” Proceedings
of the SPIE Defense, Security, and Sensing Conference, Baltimore, MD, April
29–May 3, 2013, National Institute of Standards and Technology (NIST), Gaithersburg, MD, 2013.
[8] Hedenberg, K. and Åstrand, B., “Safety Standard for Mobile Robots—A Proposal for 3D
Sensors,” Proceedings of the 5th European Conference on Mobile Robots, Örebro,
Sweden, September 7–9, 2011, Centre for Applied Autonomous Sensor Systems, Örebro
University, Örebro, Sweden, 2011, pp. 235–252.
[9] Bostelman, R., Shackleford, W., Cheok, G., and Saidi, K., “Safe Control of Manufacturing
Vehicles Research Towards Standard Test Methods,” Proceedings of the International
Material Handling Research Colloquium 2012 (IMHRC 2012) , Gardanne, France,
June 25–28, 2012, NIST, Gaithersburg, MD, 2012.
[10] Huang, W. H. and Krotkov, E. P., “Optimal Stereo Mast Configuration for Mobile Robots,”
in Proceedings of the IEEE International Conference on Robotics and Automation , Vol. 3,
Albuquerque, NM, April 20–25, 1997, IEEE, New York, 1997, pp. 1946–1951.


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150052

Lorenzo Sabattini, 1 Elena Cardarelli, 1 Valerio Digani, 1 Cristian Secchi, 1 and Cesare Fantuzzi 1

Multi-AGV Systems in Shared Industrial Environments:
Advanced Sensing and Control Techniques for Enhanced Safety and Improved Efficiency
Citation
Sabattini, L., Cardarelli, E., Digani, V., Secchi, C., and Fantuzzi, C., “Multi-AGV Systems in
Shared Industrial Environments: Advanced Sensing and Control Techniques for Enhanced
Safety and Improved Efficiency,” Autonomous Industrial Vehicles: From the Laboratory to the
Factory Floor, ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM International, West
Conshohocken, PA, 2016, pp. 57–81, doi:10.1520/STP159420150052

ABSTRACT
This chapter describes innovative sensing technologies and control techniques
that aim at improving the performance of groups of automated guided
vehicles (AGVs) used for logistics operations in industrial environments. We
explicitly consider the situation where the environment is shared among AGVs,
manually driven vehicles, and human operators. In this situation, safety is a
major issue that needs always to be guaranteed, while still maximizing the
efficiency of the system. This paper describes some of the main achievements
of the PAN-Robots European project.

Manuscript received June 15, 2015; accepted for publication November 6, 2015.
1 University of Modena and Reggio Emilia, Dept. of Sciences and Methods for Engineering, via Amendola 2, 42122 Reggio Emilia, Italy
2 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, May 26–30, 2015, Seattle, Washington.


Introduction
Production flow of goods in manufacturing plants has been highly automated in
the last decades, primarily to reduce costs and avoid unsafe working conditions.
Manufacturing plants often need warehouses for raw materials and final products
at the beginning and at the end of the production line. Although production often
is automated to a large extent, logistics typically are only marginally automated
and generally require manual operations performed by human workers and hand-
operated forklifts. Therefore, logistics, which are not fully integrated into the manu-
facturing processes, cause inefficiencies as well as high-risk working conditions for
employees [1]. Factory logistics are crucial for overall production flow, and logisti-
cal weaknesses affect production efficiency and the quality of goods delivery, partic-
ularly in terms of product traceability. Bottlenecks and problems in warehouse
logistics heavily impact factory competitiveness in the market.
Warehousing in factories of the future can rely on automated guided vehicles
(AGVs) and integrated systems for the complete handling of logistical operations
(Fig. 1). Nowadays these autonomous systems have a market of only a
few thousand vehicles sold every year, and they are not yet ready for wide-
spread use in manufacturing plants. In fact, safety, efficiency, and plant installa-
tion costs are still ongoing problems, and technology is not mature enough to
fully support a pervasive diffusion of AGVs. Therefore, innovations that address
the weaknesses of AGVs and of automated warehouse systems will boost the capabilities of
these logistical solutions, bringing them toward a pervasive diffusion in modern
factories.
This chapter is organized as follows: First, we describe the system under con-
sideration and introduce the problem to be solved. Related works on multisensor
data fusion are outlined and the proposed centralized data fusion methodology is
discussed. The results of the data fusion methodology are then expanded to define
advanced navigation strategies. This is followed by concluding remarks.

FIG. 1 Automated warehouse with AGVs.


Problem Definition
In this chapter we consider technological issues related to AGV systems used for
factory logistics [2,3]. Several research groups have been working on AGV systems
in the last few decades. Tsumura presents a comprehensive survey of the relevant
literature [4], describing the main technologies adopted for localization
and guidance of AGVs in industrial environments. Stouten and de Graaf [5]
describe the use of multiple AGVs for cooperative transportation of huge and
heavy loads.
Generally speaking, AGV systems are used for moving goods among different
positions in the industrial environment [6,7]. Each movement operation is generally
referred to as a mission . Different kinds of missions can be performed—pallets of
goods can be transported from the end of an automated production line to the
warehouse, from the warehouse to the shipment, or between two locations of the
warehouse. Typically, goods prepared in automated production lines need to be
picked up from a wrapper or from a palletizer.
AGVs are exploited to accomplish missions in an automated manner. For this
purpose, the AGV system is handled by a centralized controller—usually referred
to as a warehouse management system—that is in charge of assigning each mission
to be completed to a specific AGV. Once each mission has been assigned to a
specific AGV, then the centralized controller needs to coordinate the motion of the
AGVs themselves for mission completion. When dealing with a single AGV, several
strategies can be exploited for single-robot path planning (e.g., Martínez-Barberá
and Herrero-Pérez [8]). Conversely, when multiple AGVs share the same environ-
ment, coordination strategies need to be adopted in order to optimize the traffic.
Typically, the central controller is in charge of coordinating the motion of the
AGVs [9–13]. In order to simplify the coordination and enhance the safety of
operations, AGVs often are constrained to move along a predefined set of roads,
referred to as a road map ( Fig. 2).
Next, we will summarize the main characteristics of AGV systems typically
adopted in modern automated warehouses.

SYSTEM INSTALLATION
Automated motion of AGVs requires precise and constantly updated knowledge of
their current position. This is typically obtained by using laser-based technologies [14]
that provide very reliable results. In particular, laser-based localization is obtained by
each vehicle, computing its relative position with respect to opportunely placed
artificial landmarks (i.e., reflectors). A precise knowledge of the landmarks map is
mandatory for obtaining a highly precise localization. Moreover, the position of the
landmarks themselves has a great influence on the localization accuracy; optimal land-
mark placement is addressed, for instance, by Beinhofer, Muller, and Burgard [15].
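As a minimal illustration of reflector-based localization, the vehicle position can be recovered from range measurements to three or more reflectors at known positions by linearizing the range equations (real systems typically fuse range and bearing with odometry; this sketch and its example values are ours):

    import numpy as np

    def trilaterate(landmarks, ranges):
        """2D position from ranges to >= 3 known reflectors (linear least squares).

        Subtracting the first range equation from the others removes the
        quadratic terms, leaving a linear system in the unknown position (x, y).
        """
        L = np.asarray(landmarks, dtype=float)
        r = np.asarray(ranges, dtype=float)
        A = 2.0 * (L[1:] - L[0])
        b = (r[0]**2 - r[1:]**2
             + L[1:, 0]**2 - L[0, 0]**2 + L[1:, 1]**2 - L[0, 1]**2)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    # Three reflectors at known positions; the true vehicle position is (3, 4):
    print(trilaterate([(0, 0), (10, 0), (0, 10)], [5.0, 8.062, 6.708]))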
For large plants, hundreds to thousands of reflectors are necessary for obtaining
accurate localization. Hence, one of the first phases of plant installation consists

FIG. 2 Portion of the road map of a plant.

of installing a large number of reflectors, placing them in order to avoid unwanted
symmetries and to cover all the working space. This is a time-consuming operation
that is generally performed manually by a team of specialized technicians. More-
over, it is often necessary to achieve the final result by step-by-step modification
and verification of the new reflector layout design.
Subsequently, the plant can be prepared to enable AGVs to complete missions.
As mentioned before, missions consist of moving goods among different locations in
the environment. Locations where loading and unloading operations generally are
performed are referred to as operation points. The position and characteristics of each
operation point need then to be mapped with high precision in order to ensure that
loading and unloading operations can be performed in an effective manner.
Finally, a set of roads needs to be defined that connect each pair of operation
points. This set of roads is typically referred to as a road map. The design of the
road map usually is performed manually with computer-aided design software
and requires a highly specialized operator. The road map design impacts AGV traf-
fic and defines the system efficiency.
MOTION COORDINATION
The motion of the AGVs is coordinated inside the warehouse in order to ensure
completion of all missions in an efficient manner. This entails reducing the overall
completion time while avoiding collisions and deadlocks.
Centralized path planning and traffic coordination is a very complex problem,
the complexity of which grows exponentially with the number of vehicles. In order
to reduce the complexity of the coordination algorithm, a commonly adopted
methodology consists of constraining the motion of the vehicles along the road
maps. It is worth noting that, in general, the design of the road map and the design

of the coordination algorithm both contribute to the overall efficiency of the system.
Typically, specific features of the road map are handled by adding exceptions to the
coordination algorithm in the form of traffic rules [16].
Moreover, although a road map is a very effective manner of reducing the
computational resources needed for traffic management, constraining the motion
of the AGVs on a finite set of roads severely reduces the flexibility of the system.
For instance, if an obstacle appears on an AGV’s road, it is necessary to replan the
path to circumvent the obstacle. When an alternative path is not available, traffic
jams might occur that can be solved only with the intervention of an operator who
would manually remove the obstacle.

SENSING AND PERCEPTION FOR AGVs IN SHARED ENVIRONMENTS


Human workers and autonomous machines usually share the environment in
warehouses, so safety is the main issue that must be fully addressed (Fig. 3). Safety sys-
tems always need to be reliable and robust and commonly rely on safety-certified laser
scanners. These sensors are unable to distinguish among different kinds of obstacles.

FIG. 3 Human operators sharing the environment with AGVs.


Moreover, they have a limited field of view and thus do not provide a complete view
of the surrounding environment; sensors are limited to predefined areas. Therefore,
AGVs need to greatly reduce their speed in critical zones in order to guarantee safe
behavior in response to unpredictable situations.
Along the same lines, although road maps are a very effective manner of reduc-
ing the computational resources needed for traffic management, constraining the
motion of the AGVs on a finite set of roads severely reduces the flexibility of
the system. In particular, this reduced flexibility clearly affects the performance of
the system in the presence of unforeseen obstacles. In fact, if an obstacle suddenly
appears in front of an AGV, it is necessary to replan the AGV’s path in order
to avoid collisions with the obstacle. If AGVs are constrained on the road map,
replanning means finding an alternative path, which is not always feasible.
Consider, for instance, the frequent case of monodirectional roads. In this case, if
an alternative path cannot be found on the road map, the AGV is stuck in one spot
until the obstacle has been removed.
Two main issues prevent the application of advanced control strategies that
would greatly increase the performance of the system.
First, laser scanners are the most common sensing devices that typically are
mounted on each AGV. Although these devices are very effective in guaranteeing
safety, they are not suitable for obtaining a reliable classification of the acquired
object. In particular, it is not possible to distinguish between humans and other
kinds of obstacles. This is very relevant because humans act in an unpredictable
manner; therefore, for safety reasons, it is not possible to assume any knowledge
about the intentions of the humans themselves. Hence, if a human is within
the sensing range of an AGV, the only safe procedure is to avoid any movement.
Conversely, static obstacles do not make any unpredictable movement; hence, in
principle, they could be safely passed using a local detour. However, the impossibil-
ity of reliably distinguishing between humans and other kinds of obstacles prevents
the implementation of this kind of advanced control technique.
Second, sensor systems installed on board each AGV are not able to acquire
global information about the surrounding environment. Roughly speaking, they
cannot look around corners. Hence, when approaching an intersection, it is neces-
sary for the AGV to slow down in order to ensure safety in the presence of unex-
pected moving objects (or humans).

CONTRIBUTION
Several studies address issues related to system installation [17–19], where methodolo-
gies are described for obtaining (in a semi-automated manner) a three-dimensional
map of an industrial environment, which can be subsequently exploited for automati-
cally designing a road map.
Subsequent motion coordination has also been considered [20–23], with
methodologies proposed for optimizing the coordination of the vehicles along the
road map, taking into account the model of the traffic.

Advanced sensing technologies have been proposed based on advanced
laser scanners and omnidirectional cameras [24–27]. Advanced sensing systems
are used for contour-based localization [28] and for obstacle detection and
classification.
In this paper, we address the problem of multisensor data fusion. Considering
data acquired by multiple and heterogeneous sensing systems in particular, we
develop a data fusion strategy that provides a coherent global view of the environ-
ment, which is constantly updated. Here, we extend the preliminary results presented
in other studies [29,30], providing a more complete description and assessment of
the proposed methodology.

Related Works
In this section, we briefly analyze the most relevant literature related to multisensor
data fusion. Multisensor data fusion deals with the combination of information
coming from multiple sources in order to provide a robust and accurate represen-
tation of an environment or process of interest. A review and discussion of several
data fusion definitions is presented by Boström et al. [31].
The Joint Directors of Laboratories [32] define data fusion as “a process dealing
with the association, correlation, and combination of data and information from
single and multiple sources to achieve refined position and identity estimates, and
complete and timely assessments of situations and threats, and their significance.
The process is characterized by continuous refinements of its estimates and assess-
ments, and the evaluation of the need for additional sources, or modification of the
process itself, to achieve improved results.”
Multisensor data fusion is a multidisciplinary technology that involves several
application domains, such as robotics [33,34], military applications [35], biomedi-
cine [36,37], wireless sensor networks [38], and video and image processing [39].
Significant attention has been dedicated to the field in recent years; a review of
contemporary data fusion methodologies, as well as the most recent developments
and emerging trends in the research field, is presented by Khaleghi et al. [40].
Different criteria can be used for the classification of data fusion techniques, as
discussed by Castanedo [41]. Considering the characteristics of the utilized data,
Luo, Yih, and Su [42] propose four types of abstraction: signal level, pixel level,
characteristic (based on features extracted from images or signals), and symbols (or
decision). More generally, we address three main levels of abstraction: measure-
ments, features, and decisions.
Another possible classification relative to the data abstraction level concerns
the following:
• Low-level fusion: This level deals directly with raw data to improve the
accuracy of the individual sources.
• Medium-level fusion: This is based on the processing of features or characteristics
(dimension, shape, position). This level is also known as the feature or
characteristic level.
• High-level fusion: Also known as decision fusion, this level addresses symbolic
representations, such as object classes.
• Multiple-level fusion: This level is based on the processing of data provided
at different levels of abstraction.
Sensor fusion can be also characterized, as introduced by Durrant-Whyte [43],
based on the relationship among the fused data, namely:
• Complementary, where each sensor provides incomplete information about
the world, and the objective of data fusion is combining these different parts
to achieve a more complete and accurate representation.
• Competitive, where information about the same target is provided by two or
more sources, and data fusion is used to increase reliability and accuracy,
reducing conflicts as well as noisy and erroneous measurements.
• Cooperative, in which the information provided by different sources is com-
bined into new and, typically, more complex information.
Finally, considering the implementation architecture, it is possible to distinguish
three main types of data fusion categories: centralized, distributed, and hierarchical:
• In a centralized architecture, a single module collects information from all the
input sources and makes decisions according to the received raw data. The
principal drawbacks of this solution are the possibility of a communication
bottleneck and the large amount of bandwidth required to transmit raw data
over the network.
• In a distributed architecture, source nodes process raw data independently
and provide an estimation of the object status based on only their local views;
this information is the input to the multisensor fusion, which provides a fused
global view.
• Hierarchical architectures are combinations of decentralized and distributed
nodes in which the data fusion process is performed at different levels in the
hierarchy.
GLOBAL ENVIRONMENT REPRESENTATION
In this section, we present the main methodologies that can be found in the
literature for obtaining a constantly updated global environment representation as
a result of multisensor data fusion.
The focus is to obtain a global live view of the environment that contains both
structural elements and dynamic entities acquired by sensors. In other words, this
will define a global map that mainly will contain information about the static and
dynamic obstacles detected in the sensors’ surrounding area. Hence, it is necessary
to analyze methodologies for multisensor data fusion that are focused on obstacle
detection. Sensor fusion methods are particularly common in the obstacle detection
field to achieve improved accuracies that could not be guaranteed by the use of a
single sensor [44–47].

Occupancy grids [48,49] are among the most commonly utilized low-level
sensor fusion strategies. They are often used in robotics to detect and track moving
objects and for simultaneous localization and mapping as well as path planning.
They allow automatic generation of a discrete map of the environment, represent-
ing the area of interest as a grid of two- or three-dimensional cells of equal size.
Originally designed for sonar data fusion, occupancy grid approaches have been
extended for fusion of stereo and optical flow data [50] and, under certain circum-
stances, for fusion of monocular camera data [51]. Compared with feature-based
approaches [52], grid maps are particularly flexible and robust for the fusion of
noisy information. In fact, they allow the integration of different kinds of input
data in the same framework, while considering the inherent uncertainty of each
input sensor. Fast inverse models [49,53] or, alternatively, more accurate forward
models [54], can be utilized as occupancy mapping algorithms for the update of the
grid cells. The major drawback of fixed grid structures is their large memory
requirement, especially during their initialization phase. Moreover, the extent of
the mapped environment needs to be known beforehand; otherwise, every time the
map is expanded, high-cost operations must be performed to increase the size of
the utilized memory.
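As a concrete example of the inverse-model update mentioned above, occupancy grids are commonly maintained in log-odds form so that each measurement reduces to an addition per cell. The sketch below is generic (Python; the increment and clamping values are illustrative tuning parameters, not values from any cited system):

    import numpy as np

    L_OCC, L_FREE = 0.85, -0.4    # log-odds increments of an inverse sensor model
    L_MIN, L_MAX = -4.0, 4.0      # clamping keeps cells able to change state later

    def update_cell(grid, idx, hit):
        """Bayesian log-odds update of one cell from a single range measurement."""
        grid[idx] = np.clip(grid[idx] + (L_OCC if hit else L_FREE), L_MIN, L_MAX)

    def probability(grid):
        """Convert log-odds back to occupancy probabilities in [0, 1]."""
        return 1.0 - 1.0 / (1.0 + np.exp(grid))

    grid = np.zeros((100, 100))             # unknown cells start at p = 0.5
    update_cell(grid, (10, 42), hit=True)   # a laser return falls in this cell
    print(probability(grid)[10, 42])        # approx. 0.70

Because the update is a bounded addition per cell, fusing several heterogeneous sensors amounts to running each sensor's inverse model over the same grid.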
Octrees [55] are a means of coping with these limitations; they are hierarchical
tree-based representations that delay the initialization of map volumes until meas-
urements need to be integrated. Thus, the map is populated only with volumes that
have been measured, and the hierarchical structure of the trees also can be used as a
multiresolution representation.
An alternative to grid-based methods is the sensor fusion strategy presented by
Jung et al. [46] for obstacle detection/classification in an active pedestrian protec-
tion system. Range data provided by a laser scanner are fused with images coming
from a camera, obtaining a set of images representing vehicle and pedestrian candi-
dates. These images are used as input for two pattern classifiers (one for vehicle
detection and the other for pedestrian detection) implemented by a support vector
machine with a Gabor filter bank [56].
It is worth noting that these approaches require the processing of low-level
information (images, three-dimensional point clouds, laser raw data) in the data
fusion level. Therefore, despite the provided accuracy and robustness, they are not
suitable for global live view implementation.
Conversely, in order to optimize the data transmission time and reduce the
network overhead, we consider a hierarchical data fusion strategy that processes
only medium-level features (identification [ID], age, position, orientation, velocity,
and size) and high-level features (class and classification quality).

Medium Level

Generally speaking, medium-level data fusion entails processing the object meas-
urements (ID, age, position, orientation, velocity, and size) estimated—possibly
with uncertainty—by the different sensing sources available in the system.

Thus, from a medium-level point of view, the data fusion problem can be dealt
with as a target tracking process focused on maintaining the state estimates of one
or several objects over a period of time.
In a multisensor framework, because object tracking is performed by different
sensing sources, distributed track fusion methodologies can be utilized for the
medium-level data fusion implementation, including both maximum likelihood
[57] and minimum mean square error solutions.
When considering reliability, survivability, communication bandwidth, and
computational resources, distributed processing architectures are more practical
solutions than the centralized ones. As highlighted by Kalandros et al. [45], in a
distributed architecture, the sources transmit only the target tracks instead of all
measurements; this reduces the cost in computational demand as well as in com-
munication bandwidth requirements. The drawback of dividing the tracking task
among multiple processors is the possible introduction of correlated errors among
the tracks [58]. While measurements of a target from different sensors generally are
uncorrelated, different local track estimates for a common target are correlated,
requiring additional processing. To cope with the introduction of correlation in the
estimation, the cross covariances among track estimates must be computed.
Because this calculation is computationally expensive, it is possible to utilize meth-
odologies based on direct track-to-track fusion [45] or, alternatively, to treat the
decorrelation of the state estimates [59]. In both cases, local source processors must
send additional information to the global level, such as covariances and correspond-
ing measurement matrices.
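As a concrete illustration, the following minimal Python sketch fuses two local
track estimates by covariance intersection, a standard technique for combining
estimates whose cross covariance is unknown. It is named here explicitly as a
substitute illustration; the methodologies discussed above [45,57–59] handle the
correlation differently.

    import numpy as np

    def covariance_intersection(x1, P1, x2, P2, omega=0.5):
        """Fuse two state estimates (x1, P1) and (x2, P2) with unknown
        cross covariance; omega in [0, 1] weights the two sources and, in
        practice, is chosen, e.g., to minimize the trace of the result."""
        P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
        Pf = np.linalg.inv(omega * P1i + (1.0 - omega) * P2i)
        xf = Pf @ (omega * (P1i @ x1) + (1.0 - omega) * (P2i @ x2))
        return xf, Pf

For example, two position tracks of the same pallet, one from an onboard laser
scanner and one from an infrastructure sensor, can be fused in this way without
sending the full measurement history to the global level.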
An alternative approach for the implementation of medium-level data fusion is
the use of a heuristic based on the evaluation of the obstacles’ occupational area.
Starting from the bounding boxes delimiting the obstacles detected by the source
sensors, the algorithm considers their positions, orientations, and occupational
overlapping in order to reconstruct a two-dimensional/three-dimensional map con-
taining the set of blobs corresponding to the region covered by each candidate.
Integrating the information about the velocities and directions estimated for
each tracked obstacle, it is possible to discriminate between static and dynamic
obstacles. Then, split and merge techniques [60] can be utilized to resolve conflicts
in the discrimination among blobs that may represent different views of the same
object or, alternatively, separated elements.
If necessary, the information representing the fused obstacles can subsequently
be integrated in a grid map on which free space and unknown regions are modeled,
supporting the implementation of path planning and navigation functions.

High Level

Generally speaking, high-level data fusion entails combining the classes estimated by
each sensing source; hence, these methodologies solve classifier combination problems.
In particular, each sensing source can be represented as a classifier. Then, in a
multisensor framework, we have a set of classifiers that, given an input pattern,
provide an output score for each possible class of the system (e.g., human, manual
forklift, AGV, other dynamic and static objects). This value represents a confidence
measure for the class to be the correct one for the input pattern. Therefore, accord-
ing to the type of classifiers’ outputs, methods for classifier combinations
at measurement level (or Type III [61]) can be utilized.
As discussed by Tulyakov et al. [62], it is possible to distinguish between two
main categories of combination methods: score combination functions and combi-
nation decisions. In the first approach, a function is used to combine the classifiers’
scores in a predetermined manner. Conversely, in the second method, the classi-
fiers’ outputs are used as inputs to a secondary classifier that is trained to determine
the combination function.
Simple aggregation schemes at the measurement level, such as the sum rule,
product rule, average rule, and max rule, are all examples of score combination
functions. Despite their simplicity, these elementary combination rules compete
with the most sophisticated combination methods, as highlighted by Kuncheva,
Bezdek, and Duin [63]. Although they demonstrate high recognition rates, the
simple aggregation schemes do not allow one to determine a priori which rule is
best for a particular data set.
Alternatively, when dealing with high-level data fusion, it is possible to utilize
complex combination decision methodologies, such as neural networks [64], naive
Bayes [61], Dempster-Shafer theory [65], and classifier ensembles, such as bagging
[66] or boosting [67].
In general, a drawback of these techniques concerns the complexity of the
training scheme. Moreover, when an obstacle does not appear in the field of view
of a perception system, no classification hypotheses are provided by that system;
in the data fusion process, a missing response from a classifier does not mean unre-
liability. For these reasons, simple aggregation rules may be a better solution for the
implementation of high-level data fusion.
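As an illustration of such rules, the following Python sketch combines the
per-class scores reported by several sources and simply skips sources that
returned no response, so a missing classification is not treated as unreliability.
The class list follows the example given earlier; the function name and score
format are assumptions made for illustration.

    import numpy as np

    CLASSES = ["human", "manual forklift", "AGV", "dynamic object", "static object"]

    def combine_scores(score_vectors, rule="sum"):
        """Combine per-source class scores; None entries (no detection by
        that source) contribute nothing to the combination."""
        available = [np.asarray(s, dtype=float) for s in score_vectors if s is not None]
        if not available:
            return None               # no source saw the obstacle
        stacked = np.vstack(available)
        if rule == "sum":
            combined = stacked.sum(axis=0)
        elif rule == "product":
            combined = stacked.prod(axis=0)
        elif rule == "average":
            combined = stacked.mean(axis=0)
        elif rule == "max":
            combined = stacked.max(axis=0)
        else:
            raise ValueError("unknown rule: " + rule)
        return CLASSES[int(np.argmax(combined))], combined

For instance, combine_scores([[0.7, 0.1, 0.1, 0.05, 0.05], None], rule="sum")
yields the label "human" even though the second source provided no hypothesis.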

Advanced Sensing Systems and Centralized Data Fusion

In this section, we describe the advanced sensing system and centralized data fusion
technology proposed for enhancing the performance of AGV systems. We consider
a sensing system composed of two main elements: onboard sensors and infrastruc-
ture sensors.
In addition to safety laser scanners, it is necessary to equip each AGV with a
reliable environment perception system that is capable of monitoring the entire
360° region around the vehicle. In particular, the onboard perception system is
composed of multiple laser scanners positioned around the AGV, together
with an omnidirectional stereo vision system consisting of two omnidirectional
lenses and two cameras mounted on the top of the AGV. Implementation details
can be found in the study by Drulea et al. [25].

Onboard sensors are complemented by additional sensing systems installed on
the infrastructure. The idea is similar to the use of hemispherical mirrors mounted
above the intersections that are used by the workers to look around the corners
(Fig. 4). Along these lines, we consider the presence of laser scanners installed at
specific locations in the environment in order to provide efficient monitoring of the
black spots, namely areas that are occluded by infrastructural elements. Details on
the implementation of such systems are outlined by Boehning [24].
Thus, sensing data are simultaneously acquired by different systems; it is
therefore necessary to make them available to the AGV control center, which can
then include sensing data into the planning and control strategy. Therefore, we
introduce a centralized system that is in charge of receiving data from different
sources, appropriately merging them, and making them available to the AGV
control system. This centralized system defines a global live view of the environ-
ment that contains constantly updated information regarding all the entities that
populate the industrial environment [68]. The described system architecture is
represented in Fig. 5.
In particular, it is possible to distinguish between two main classes of objects
that are perceived by sensing systems: static environmental objects and dynamic
objects. We consider the case where the static environmental objects, which are all
the infrastructural elements of the plant (e.g., racks, walls, doors), are stored in
a three-dimensional map that can be constructed exploiting a plant exploration
system as described by Beinschob and Reinke [18].
Conversely, dynamic objects are perceived, at run time, by infrastructure and
on-board sensors; in particular, those systems provide object detection, tracking,
and classification capabilities.

FIG. 4 Hemispherical mirror mounted above an intersection.


FIG. 5 The general system architecture designed for obstacle data detection, tracking,
classification, and fusion.

Thus, in the proposed architecture, the information about the obstacles in the
scene may be provided by several sources, involving the possibility of data redun-
dancy, inconsistency, ambiguity, noise, and incompleteness. To overcome these
problems, the global live view is introduced as a module that collects all data
acquired by the sensors and combines them in a unique, complete, and coherent
representation of the overall system, including the static and dynamic entities that
act inside it. In particular, the global live view gathers higher-quality information
(with respect to information based on local sensing only) and provides a global
updated map representing the static entities (the three-dimensional map of the
plant—the road map), the dynamic entities (the current position and velocity of
the AGVs, the position and velocity of currently identified objects), the congestion
zones, and the status of the monitored intersections.
In general, the information acquired by the infrastructure and on-board
perception systems consists of tracked and classified objects that are identified with
a unique ID. In detail, data regarding each object are:
• Position, orientation, velocity, size
• Class of the objects: human, manual forklift, AGV, other dynamic object, static
object
• An assessment regarding the quality and reliability of the classification
The global live view is then updated with the information acquired during
the operation and a real-time global map is generated. This output is shared with
the AGV fleet in order to improve their local on-board navigation capabilities and
to support safe operations.
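For illustration, the per-object information listed above can be represented by a
record such as the following Python sketch; the field names and types are
assumptions made here for clarity, not the schema actually used by the system.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class TrackedObject:
        uid: int                      # unique ID assigned by the perceiving system
        age: float                    # time since first detection, in seconds
        position: Tuple[float, ...]   # (x, y) or (x, y, z) in plant coordinates
        orientation: float            # heading, in radians
        velocity: Tuple[float, ...]   # (vx, vy) in meters per second
        size: Tuple[float, ...]       # bounding-box dimensions
        obj_class: str                # human, manual forklift, AGV, ...
        class_quality: float          # reliability of the classification, in [0, 1]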
It is important to guarantee consistency with respect to the real world; each
virtual object represented on the map must correspond to a real-world object.
Therefore, the global live view performs data fusion to merge data acquired from
the different sensors, reducing information redundancy and verifying the presence
of data inconsistency and ambiguity. In particular, we propose a two-level method-
ology that separately implements medium- and high-level data fusion.

MEDIUM LEVEL
In the described architecture, dealing with data fusion at the medium level means
processing the object measurements (ID, age, position, orientation, velocity, and
size), estimated with uncertainty by the on-board and infrastructure systems, as
well as the elements inside the static map of the environment.
Thus, from a medium-level point of view, we introduce a heuristic based on the
evaluation of the obstacle’s occupational area; the main steps of this solution are
represented in Fig. 6. Starting from the bounding boxes delimiting the obstacles
detected by the source sensors, the algorithm considers their positions, orientations,
and occupational overlapping in order to reconstruct a two-dimensional/three-
dimensional map containing the set of blobs corresponding to the region covered
by each candidate. Integrating the information about the velocities and directions
estimated for each tracked obstacle, it is possible to discriminate between static and
dynamic obstacles. Then, split and merge techniques [60] are utilized to resolve
conflicts in the discrimination between blobs that may represent different views
of the same object or, alternatively, separated elements. The information represent-
ing the fused obstacles is then integrated in a grid map on which free space and
unknown regions are modeled, supporting the implementation of path planning
and navigation functions. (Details are provided in the text section on advanced
navigation strategies.)
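A minimal Python sketch of the bounding-box grouping at the core of this
heuristic is given below: overlapping two-dimensional boxes are clustered into
blobs with a union-find pass, and each blob is replaced by its enclosing box.
The split step, the velocity integration, and the three-dimensional case are
omitted; all names are illustrative.

    def boxes_overlap(a, b):
        """Axis-aligned overlap test; a box is (xmin, ymin, xmax, ymax)."""
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    def merge_boxes_into_blobs(boxes):
        """Group overlapping boxes into blobs and return each blob's
        enclosing bounding box."""
        parent = list(range(len(boxes)))

        def find(i):                      # union-find with path halving
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                if boxes_overlap(boxes[i], boxes[j]):
                    parent[find(i)] = find(j)

        blobs = {}
        for i, box in enumerate(boxes):
            root = find(i)
            if root not in blobs:
                blobs[root] = list(box)
            else:
                m = blobs[root]
                m[0], m[1] = min(m[0], box[0]), min(m[1], box[1])
                m[2], m[3] = max(m[2], box[2]), max(m[3], box[3])
        return [tuple(m) for m in blobs.values()]

For example, merge_boxes_into_blobs([(0, 0, 2, 2), (1, 1, 3, 3), (10, 10, 12, 12)])
yields [(0, 0, 3, 3), (10, 10, 12, 12)]: the first two detections are treated as
different views of the same object, while the third is a separate element.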

HIGH LEVEL
The choice of the data fusion strategies for the implementation of the global live
view can be considered, from a high level point of view, as a classifier combination
problem. According to this problem formulation, the static three-dimensional map

FIG. 6 Principal steps of the heuristic for the global live view implementation.


of the environment, the on-board sensor systems, and the infrastructure perception
systems represent a set of classifiers that, given an input pattern, provide an output
score for each possible class of the system (human, manual forklift, AGV, other
dynamic object, static object). This value represents a confidence measure for the
class to be the correct one for the input pattern.
Several methods can be found in the literature for solving the problem of classi-
fier combination at the measurement level (or Type III [69]). Among these meth-
ods, we propose to exploit simple aggregation schemes at the measurement level,
such as sum rule, product rule, average rule, and max rule. Despite their simplicity,
these elementary combination rules compete with the more sophisticated combina-
tion methods, as is highlighted by Kuncheva, Bezdek, and Duin [70]. Moreover,
these methodologies are well suited for real-time implementation, which is manda-
tory in this kind of application.

Advanced Navigation Strategies


The presence of a constantly updated centralized system that collects informa-
tion about all the objects in the industrial environment makes it possible to
implement advanced techniques for optimizing the navigation performance of
the AGVs.
In particular, the problem is that of assigning missions to each AGV, and sub-
sequently planning the path to be traveled for mission completion, in an optimized
manner. The proposed mission assignment methodology relies on the well-known
Hungarian algorithm, which solves the assignment problem optimally. Generally
speaking, the Hungarian algorithm solves the problem of
assigning a certain number of activities to a certain number of agents. This assign-
ment is based on a matrix of weights, whose element ( i, j) corresponds to the cost
of assigning the j-th activity to the i-th agent. The optimal assignment obtained
after applying the Hungarian algorithm has the minimum total cost among all
possible choices.
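As a concrete sketch, the Hungarian-style solver available in SciPy can be applied
directly; the cost values below are invented for illustration and stand in for
estimated mission completion times.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # cost[i, j]: estimated time for AGV i to complete mission j
    cost = np.array([[4.0, 9.0, 6.0],
                     [7.0, 3.0, 8.0],
                     [5.0, 8.0, 2.0]])

    agvs, missions = linear_sum_assignment(cost)  # optimal assignment
    for i, j in zip(agvs, missions):
        print("AGV", i, "-> mission", j, "cost", cost[i, j])
    print("total cost:", cost[agvs, missions].sum())

Here the optimal assignment is AGV 0 to mission 0, AGV 1 to mission 1, and
AGV 2 to mission 2, with total cost 9.0; any other assignment costs more.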
In the scenario considered in this paper, activities are represented by missions
to be accomplished, and agents are represented by AGVs. It is worth remarking that
the objective is to increase the overall efficiency of the system, which implies
reducing the overall completion time for all the missions.
Therefore, the cost for assigning each AGV to a particular mission should be
proportional to the time spent by that AGV in completing that mission. Currently
utilized solutions translate this idea by defining the cost as a quantity that is propor-
tional to the distance between each AGV and each mission location. In fact, assum-
ing constant speed, travel distance is proportional to completion time. However,
this assumption is unrealistic for multi-AGV systems in shared industrial environ-
ments. In fact, the presence of unforeseen obstacles, as well as the presence of traffic
jams, can significantly slow down AGVs; this leads to a completion time that is no
longer proportional to the travel distance.

The coordination of the AGVs along the road map can be performed exploit-
ing the strategy presented by Digani et al. [20]. In particular, this coordination
strategy consists of a hierarchical control architecture composed of two layers.
The higher level performs the coordination over macro areas of the environment
called sectors, while the lower level considers the coordination within each sector.
A portion of the road map divided into sectors is depicted in Fig. 7.
Based on the hierarchical division of the road map, it is possible to introduce
a definition for a traffic model that takes into account both the number of vehicles
and the presence of obstacles within each sector. Mission assignment and motion
coordination are then performed taking into account an appropriately weighted
road map. In particular, as described in detail by Sabattini et al. [71], we consid-
ered the following model: Each sector is characterized by a certain value of
capacity C that represents the maximum number of allowed vehicles. Let n_k(t) be
the number of vehicles in the k-th sector at time t. Then, the weight was defined
as follows:

    w_k(t) = (1 - n_k(t)/C)^(-1)                                    (1)
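In code, Eq. (1) and its use as an edge weight might look as follows. The blending
of path length and congestion through the gains k1 and k2 is an assumption here,
inferred from the k1 and k2 values reported with the simulation results below
rather than stated explicitly in the model.

    def sector_weight(n_k, capacity):
        """Eq. (1): the weight grows without bound as n_k approaches C."""
        if n_k >= capacity:
            return float("inf")       # sector at capacity: effectively closed
        return 1.0 / (1.0 - n_k / capacity)

    def edge_cost(length, n_k, capacity, k1=1.0, k2=10.0):
        # hypothetical combination of travel distance and congestion
        return k1 * length + k2 * sector_weight(n_k, capacity)

An empty sector has weight 1, so the cost reduces to (roughly) the travel
distance, while a nearly full sector makes every route through it expensive.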

The proposed methodology has been implemented in a simulated environment
where a real industrial plant was represented. For reasons of confidentiality, it is not
possible to provide details about the industrial plant used for evaluation. Generally
speaking, AGV plants can be grouped into three classes based on their size.
In particular:
• Small plants include up to 10 AGVs.
• Medium plants include 10 to 30 AGVs.
• Big plants include more than 30 AGVs.

FIG. 7 Portion of a road map divided into sectors.


FIG. 8 Average number of missions accomplished per hour: percentage increase with
respect to the nominal case (i.e., k2 = 0), for different values of the capacity C.

We validated the proposed methodology with simulations performed based on
a medium-size plant characterized by:
• Number of AGVs: 30
• Number of sectors in the road map: 18

FIG. 9 Local deviation from the road map for obstacle avoidance. Obstacles are identified
with numbers. For obstacle number 1, which needs to be avoided, the bounding
box is depicted as well. Furthermore, the picture shows the original (straight line)
portion of the road map and the computed local deviation (curved line).


Simulation results are reported hereafter for k1 = 1, k2 = 10, for different values
of the capacity C ∈ [2, …, 30]. In particular, simulations were performed that
randomly generated missions to be accomplished by the AGVs. Then, we measured
the average number of missions accomplished per hour, and we compared it
with the current result—namely, computing the weights based on path length
only, which is obtained with k2 = 0. Fig. 8 depicts the percentage increase. It is worth
noting that the choice of sector capacity is a critical design parameter because this
value heavily influences the performance of the system. For the case under consid-
eration, different capacity values were considered, ranging from C = 2 to C = 30.
Very promising results were obtained for medium capacity values, namely
C ∈ [10, 15]; in these cases, the number of missions accomplished per hour
increased by more than 10 %.


Global knowledge of the obstacles in the environment makes it possible
to implement, in a safe manner, obstacle avoidance maneuvers. In particular,

FIG. 10 Snapshots of the obstacle avoidance procedure.



exploiting the strategy introduced by Digani et al. [72], it is possible to compute
local deviations from the road map. These deviations are computed locally by each
AGV, relying on information acquired by means of on-board sensors comple-
mented by a global centralized knowledge of the environment. Local deviations
from the road map are exploited by the AGVs for avoiding collisions with obstacles,
while still moving toward their goal. In particular, the proposed algorithm:
• Computes a path that, given the shape of the AGV, guarantees avoidance of
collisions with the obstacles and with any infrastructural element.
• Defines a path that is admissible with respect to the kinematic constraints of
the AGVs. In particular, the curvature radius is limited (a simple admissibility
check is sketched after this list).
• Guarantees that, once the obstacle has been passed, the AGV returns to the
road map, thus carrying on the original path plan and fulfilling its objective.
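The curvature constraint can be checked, for a candidate path given as a list of
closely spaced waypoints, by bounding the circumradius of consecutive point
triplets. The following sketch is only an illustrative admissibility test under that
assumption, not the deviation-generation algorithm of Digani et al. [72].

    import math

    def curvature_admissible(path, r_min):
        """Return True if no triplet of consecutive waypoints implies a
        turning radius tighter than r_min."""
        for (x1, y1), (x2, y2), (x3, y3) in zip(path, path[1:], path[2:]):
            a = math.dist((x1, y1), (x2, y2))
            b = math.dist((x2, y2), (x3, y3))
            c = math.dist((x1, y1), (x3, y3))
            cross2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
            if cross2 == 0.0:
                continue              # collinear points: infinite radius
            circumradius = (a * b * c) / (2.0 * cross2)
            if circumradius < r_min:
                return False
        return True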
An example of a local deviation maneuver performed in a real industrial
environment is depicted in Fig. 9. In this figure, green boxes represent the acquired
obstacles, and the bounding box of the obstacle to be overcome is represented in
blue. A portion of the original road map is represented by a pink straight line
segment, while a curved red line represents the local deviation.
Snapshots of the experiments are shown in Fig. 10; as expected, the AGV is able
to locally deviate from the road map to pass an obstacle on its path.

Conclusions
Advanced sensing technologies, together with centralized data fusion systems,
represent a very effective tool for improving the efficiency of multi-AGV systems
that share the environment with human operators, where safety is a primary issue.
Despite the availability of several technological solutions that exhibit good per-
formance in a laboratory environment, a significant effort is necessary to bring
those technologies to real working environments. The results obtained within the
PAN-Robots project represent a significant step in this direction, bringing to-
gether researchers from academia and industry to develop reliable solutions and
validate them in real factory environments.
Real-world implementation and validation, performed in cooperation with
industry, represents a fundamental milestone toward the definition of new safety
and technological regulations and standards that take into account state-of-
the-art technology. The definition of regulations and standards will lead to the
possibility of a massive deployment of advanced sensing solutions in industrial
environments.

ACKNOWLEDGMENTS
This paper was written within the PAN-Robots project. The research leading to these
results has received funding from the European Union Seventh Framework Pro-
gramme (FP7/2007-2013) under grant agreement no. 314193.

References
[1] European Commission, “Eurostat,” European Union, Brussels, Belgium, 2016, http://
ec.europa.eu/eurostat (accessed January 30, 2015).
[2] Sabattini, L., Digani, V., Secchi, C., Cotena, G., Ronzoni, D., Foppoli, M., and Oleari, F.,
“Technological Roadmap to Boost the Introduction of AGVs in Industrial Applications,”
Proceedings of the IEEE International Conference on Intelligent Computer Communi-
cation and Processing (ICCP) , Cluj-Napoca, Romania, September 5–7, 2013, Institute of
Electrical and Electronics Engineers (IEEE), New York, 2013, pp. 203–208.
[3] Oleari, F., Magnani, M., Ronzoni, D., and Sabattini, L., “Industrial AGVs: Toward a
Pervasive Diffusion in Modern Factory Warehouses,” Proceedings of the 2014 IEEE
International Conference on Intelligent Computer Communication and Processing
(ICCP) , Cluj-Napoca, Romania, September 4–6, 2014, Institute of Electrical and Electron-
ics Engineers, New York, 2014, pp. 233–238.
[4] Tsumura, T., “AGV in Japan—Recent Trends of Advanced Research, Development, and
Industrial Applications,” Proceedings of the IEEE/RSJ/GI International Conference on
Intelligent Robots and Systems ’94. “Advanced Robotic Systems and the Real World,”
IROS ’94, Vol. 3, Munich, September 12–16, 1994, IEEE, New York, 1994, pp. 1477–1484.
[5] Stouten, B. and de Graaf, A. J., “Cooperative Transportation of a Large Object-
Development of an Industrial Application,” in Proceedings of the 2004 IEEE International
Conference on Robotics and Automation, ICRA ’04, Vol. 3, Barcelona, Spain, April
26–May 1, 2004, IEEE, New York, 2004, pp. 2450–2455.
[6] Mahadevan, B. and Narendran, T. T., “Design of an Automated Guided Vehicle-Based
Material Handling System for a Flexible Manufacturing System,” The International Jour-
nal of Production Research, Vol. 28, No. 9, 1990, pp. 1611–1622.
[7] Berman, S. and Edan, Y., “Decentralized Autonomous AGV System for Material
Handling,” International Journal of Production Research, Vol. 40, No. 15, 2002,
pp. 3995–4006.
[8] Martínez-Barberá, H. and Herrero-Pérez, D., “Autonomous Navigation of an Automated
Guided Vehicle in Industrial Environments,” Robotics and Computer-Integrated Manu-
facturing, Vol. 26, No. 4, 2010, pp. 296–311.
[9] Wurman, P. R., D’Andrea, R., and Mountz, M., “Coordinating Hundreds of Cooperative,
Autonomous Vehicles in Warehouses,” AI Magazine, Vol. 29, No. 1, 2008, p. 9.
[10] Olmi, R., Secchi, C., and Fantuzzi, C., “Coordination of Multiple AGVs in an Industrial
Application,” Proceedings of the 2008 IEEE International Conference on Robotics and
Automation , Pasadena, CA, May 19–23, 2008, IEEE, New York, 2008, pp. 1916–1921.
[11] Olmi, R., Secchi, C., and Fantuzzi, C., “Coordination of Industrial AGVs,” International
Journal of Vehicle Autonomous Systems, Vol. 9, No. 1, 2011, pp. 5–25.
[12] Herrero-Perez, D. and Martinez-Barbera, H., “Decentralized Coordination of Autonomous
AGVs in Flexible Manufacturing Systems,” Proceedings of the IEEE/RSJ International
Conference on Intelligent Robots and Systems, Nice, France, September 22–26, 2008,
IEEE, New York, 2008, pp. 3674–3679.


[13] Herrero-Pérez, D. and Martínez-Barberá, H., “Decentralized Traffic Control for
Non-Holonomic Flexible Automated Guided Vehicles in Industrial Environments,”
Advanced Robotics, Vol. 25, No. 6–7, 2011, pp. 739–763.
[14] Ronzoni, D., Olmi, R., Secchi, C., and Fantuzzi, C., “AGV Global Localization Using
Indistinguishable Artificial Landmarks,” Proceedings of the 2011 IEEE International
Conference on Robotics and Automation , Shanghai, China, May 9–13, 2011, IEEE, New
York, 2011, pp. 287–292.
[15] Beinhofer, M., Muller, J., and Burgard, W., “Near-Optimal Landmark Selection for
Mobile Robot Navigation,” in Proceedings of the 2011 IEEE International Conference on
Robotics and Automation , Shanghai, China, May 9–13, 2011, IEEE, New York, 2011,
pp. 4744–4749.
[16] Pallottino, L., Scordio, V. G., Bicchi, A., and Frazzoli, E., “Decentralized Cooperative Policy
for Conflict Resolution in Multivehicle Systems,” IEEE Transactions on Robotics, Vol. 23,
No. 6, 2007, pp. 1170–1183.
[17] Beinschob, P. and Reinke, C., “Strategies for 3D Data Acquisition and Mapping in Large-
Scale Modern Warehouses,” in Proceedings of the 2013 IEEE International Conference on
Intelligent Computer Communication and Processing, Cluj-Napoca, Romania, September
5–7, 2013, IEEE, New York, 2013, pp. 229–234.
[18] Beinschob, P. and Reinke, C., “Advances in 3D Data Acquisition, Mapping, and Localiza-
tion in Modern Large-Scale Warehouses,” Proceedings of the 2014 IEEE International
Conference on Intelligent Computer Communication and Processing, Cluj-Napoca,
Romania, September 4–6, 2014, IEEE, New York, 2014, pp. 265–271.
[19] Digani, V., Sabattini, L., Secchi, C., and Fantuzzi, C., “An Automatic Approach for
the Generation of the Roadmap for Multi AGV Systems in an Industrial Environ-
ment,” Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent
Robots and Systems, Chicago, IL, September 14–18, 2014, IEEE, New York, 2014,
pp. 1736–1741.
[20] Digani, V., Sabattini, L., Secchi, C., and Fantuzzi, C., “Hierarchical Traffic Control for
Partially Decentralized Coordination of Multi AGV Systems in Industrial Environments,”
in Proceedings of the IEEE International Conference on Robotics and Automation , Hong
Kong, May 31–June 7, 2014, IEEE, New York, 2014, pp. 6144–6149.
[21] Digani, V., Sabattini, L., Secchi, C., and Fantuzzi, C., “Towards Decentralized Coordination
of Multi Robot Systems in Industrial Environments: A Hierarchical Traffic Control
Strategy,” Proceedings of the IEEE International Conference on Intelligent Computer
Communication and Processing, Cluj-Napoca, Romania, September 5–7, 2013, IEEE,
New York, 2013, pp. 209–215.
[22] Digani, V., Sabattini, L., Secchi, C., and Fantuzzi, C., “Ensemble Coordination Approach in
Multi AGV Systems Applied to Industrial Warehouses,” IEEE Transactions on Automation
Science and Engineering, Vol. 12, No. 3, 2015, pp. 922–934.
[23] Digani, V., Hsieh, M. A., Sabattini, L., and Secchi, C., “A Quadratic Programming Approach
for Coordinating Multi AGV Systems,” Proceedings of the 2015 IEEE International
Conference on Automation Science and Engineering, Gothenburg, Sweden, August
24–28, 2015, IEEE, New York, 2015, pp. 600–605.


[24] Boehning, M., “Improving Safety and Efficiency of AGVs at Warehouse Black Spots,”
Proceedings of the 2014 IEEE International Conference on Intelligent Computer
Communication and Processing, Cluj Napoca, Romania, September 4–6, 2014, IEEE,
New York, 2014, pp. 245–249.
[25] Drulea, M., Szakats, I., Vatavu, A., and Nedevschi, S., “Omnidirectional Stereo Vision
Using Fisheye Lenses,” Proceedings of the 2014 IEEE International Conference on
Intelligent Computer Communication and Processing, Cluj Napoca, Romania, September
4–6, 2014, IEEE, New York, 2014, pp. 251–258.
[26] Nagy, A. E., Szakats, I., Marita, T., and Nedevschi, S., “Development of an Omnidirectional
Stereo Vision System,” Proceedings of the IEEE International Conference on Intelligent
Computer Communication and Processing, Cluj-Napoca, Romania, September 5–7, 2013,
IEEE, New York, 2013, pp. 235–242.
[27] Aikio, M., Makinen, J. T., and Yang, B., “Omnidirectional Camera,” Proceedings of the
IEEE International Conference on Intelligent Computer Communication and Processing
(ICCP) , Cluj-Napoca, Romania, September 5–7, 2013, IEEE, New York, 2013, pp. 217–221.
[28] Reinke, C. and Beinschob, P., “Strategies for Contour-Based Self-Localization in
Large-Scale Modern Warehouses,” Proceedings of the IEEE International Conference on
Intelligent Computer Communication and Processing, Cluj-Napoca, Romania, September
5–7, 2013, IEEE, New York, 2013, pp. 223–227.
[29] Cardarelli, E., Sabattini, L., Secchi, C., and Fantuzzi, C., “Cloud Robotics Paradigm for
Enhanced Navigation of Autonomous Vehicles in Real World Industrial Applications,” Pro-
ceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems,
Hamburg, Germany, September 28–October 2, 2015, IEEE, New York, pp. 4518–4523.
[30] Sabattini, L., Cardarelli, E., Digani, V., Secchi, C., Fantuzzi, C., and Fuerstenberg, K.,
“Advanced Sensing and Control Techniques for Multi AGV Systems in Shared Industrial
Environments,” in Proceedings of the 2015 IEEE International Conference on Emerging
Technologies and Factory Automation , Luxembourg, September 8–11, 2015, IEEE, New
York, 2015, pp. 1–7.
[31] Boström, H., Andler, S. F., Brohede, M., Johansson, R., Karlsson, A., van Laere, J.,
Niklasson, L., Nilsson, M., Persson, A., and Ziemke, T., “On the Definition of Information
Fusion as a Field of Research,” University of Skövde, School of Humanities and
Informatics, Tech. Rep. HS-IKI-TR-07-006, 2007.
[32] White, F. E., Data Fusion Lexicon , Joint Directors of Laboratories, Technical Panel for C3,
Data Fusion Sub-Panel, Naval Ocean Systems Center, San Diego, CA, 1986.
[33] Bellotto, N. and Hu, H., “Vision and Laser Data Fusion for Tracking People with a Mobile
Robot,” Proceedings of the IEEE International Conference on Robotics and Biomimetics
(ROBIO 2006), Kunming, China, December 17–20, 2006, IEEE, New York, 2006, pp. 7–12.
[34] Haijun, W. and Yimin, C., “Sensor Data Fusion Using Rough Set for Mobile Robots
System,” Proceedings of the 2nd IEEE/ASME International Conference on Mechatronic
and Embedded Systems and Applications, Beijing, China, August 13–16, 2006, IEEE,
New York, 2006, pp. 1–5.
[35] Bossé, E., Valin, P., Boury-Brisset, A., and Grenier, D., “Exploitation of A Priori Knowledge
for Information Fusion,” Information Fusion , Vol. 7, No. 2, 2006, pp. 161–175.


[36] Bracio, B., Horn, W., and Moller, D., “Sensor Fusion in Biomedical Systems,” Proceedings
of the 19th Annual International Conference of the IEEE Engineering in Medicine and
Biology Society, Vol. 3, Chicago, IL, October 30–November 2, 1997, IEEE, New York,
1997, pp. 1387–1390.
[37] Tulyakov, S. and Govindaraju, V., “Classifier Combination Types for Biometric
Applications,” IEEE Computer Society Conference on Computer Vision and Pattern
Recognition Workshop, New York, June 17–22, 2006, IEEE, New York, 2006, p. 58.
[38] Krishnamachari, B., Estrin, D., and Wicker, S. B., “The Impact of Data Aggregation in
Wireless Sensor Networks,” Proceedings of the 22nd International Conference on Dis-
tributed Computing Systems, Vienna, Austria, July 2–5, 2002, IEEE Computer Society,
Washington, DC, 2002, pp. 575–578.
[39] Joo, S. and Chellappa, R., “A Multiple-Hypothesis Approach for Multiobject Visual
Tracking,” IEEE Transactions on Image Processing, Vol. 16, No. 11, 2007, pp. 2849–2854.
[40] Khaleghi, B., Khamis, A., Karray, F. O., and Razavi, S., “Multisensor Data Fusion: A Review
of the State-of-the-Art,” Information Fusion , Vol. 14, No. 1, 2013, pp. 28–44.
[41] Castanedo, F., “A Review of Data Fusion Techniques,” The Scientific World Journal,
Vol. 2013, No. 6, 2013, doi:10.1155/2013/704504
[42] Luo, R., Yih, C.-C., and Su, K.-L., “Multisensor Fusion and Integration: Approaches,
Applications, and Future Research Directions,” IEEE Sensors Journal, Vol. 2, No. 2, 2002,
pp. 107–119.
[43] Durrant-Whyte, H. F., “Sensor Models and Multisensor Integration,” International Journal
of Robotics Research, Vol. 7, No. 6, 1988, pp. 97–113.
[44] Dima, C., Vandapel, N., and Hebert, M., “Sensor and Classifier Fusion for Outdoor
Obstacle Detection: An Application of Data Fusion to Autonomous Off-Road Navi-
gation,” The 32nd Applied Imagery Recognition Workshop (AIPR2003), Washington,
DC, October 15–17, 2003, IEEE Computer Society, Washington, DC, 2003, pp. 255–262.
[45] Kalandros, M., Trailovic, L., Pao, L. Y., and Bar-Shalom, Y., “Tutorial on Multisensor
Management and Fusion Algorithms for Target Tracking,” Proceedings of the 2004
American Control Conference, Vol. 5, Boston, MA, June 30–July 2, 2004, IEEE,
New York, pp. 4734–4748.
[46] Jung, H., Lee, Y., Yoon, P., Hwang, I., and Kim, J., “Sensor Fusion Based Obstacle
Detection/Classification for Active Pedestrian Protection System,” Advances in
Visual Computing, G. Bebis, B. Parvin, D. Koracin, A. Nefian, G. Meenakshisundaram,
V. Pascucci, J. Zara, J. Molineros, H. Theisel, and T. Malzbender, Eds., Springer, Berlin
Heidelberg, 2006, vol. 4292, pp. 294–305.
[47] Schueler, K., Weiherer, T., Bouzouraa, E., and Hofmann, U., “360 Degree Multi Sensor
Fusion for Static and Dynamic Obstacles,” 2012 IEEE Intelligent Vehicles Symposium (IV),
Madrid, June 3–7, 2012, IEEE, New York, 2012, pp. 692–697.
[48] Elfes, A., “Using Occupancy Grids for Mobile Robot Perception and Navigation,”
Computer, Vol. 22, No. 6, 1989, pp. 46–57.
[49] Moravec, H., “Sensor Fusion in Certainty Grids for Mobile Robots,” AI Magazine, Vol. 9,
No. 2, 1988, pp. 61–74.


[50] Braillon, C., Usher, K., Pradalier, C., Crowley, J., and Laugier, C., “Fusion of Stereo and
Optical Flow Data Using Occupancy Grids,” in Intelligent Transportation Systems
Conference, 2006, ITSC ’06, Toronto, September 17–20, 2006, IEEE, New York, 2006,
pp. 1240–1245.
[51] Stepan, P., Kulich, M., and Preucil, L., “Robust Data Fusion with Occupancy Grid,” IEEE
Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews,
Vol. 35, No. 1, 2005, pp. 106–115.
[52] Leonard, J. and Durrant-Whyte, H., “Simultaneous Map Building and Localization for an
Autonomous Mobile Robot,” Proceedings of IROS’91, IEEE/RSJ International Workshop
on Intelligent Robots and Systems ’91, Intelligence for Mechanical Systems, Vol. 3, Osaka,
November 3–5, 1991, IEEE, New York, 1991, pp. 1442–1447.
[53] Pietzsch, S., Vu, T. D., Burlet, J., Aycard, O., Hackbarth, T., Appenrodt, N., Dickmann, J.,
and Radig, B., “Results of a Precrash Application Based on Laser Scanner and Short-
Range Radars,” IEEE Transactions on Intelligent Transportation Systems, Vol. 10, No. 4,
2009, pp. 584–593.
[54] Thrun, S., “Learning Occupancy Grids with Forward Models,” Proceedings of the 2001
IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, Maui, HI,
October 29–November 3, 2001, IEEE, New York, 2001, pp. 1676–1681.
[55] Hornung, A., Wurm, K., Bennewitz, M., Stachniss, C., and Burgard, W., “Octomap: An
Efficient Probabilistic 3D Mapping Framework Based on Octrees,” Autonomous Robots,
Vol. 34, No. 3, 2013, pp. 189–206.
[56] Sun, Z., Bebis, G., and Miller, R., “On-Road Vehicle Detection Using Evolutionary Gabor
Filter Optimization,” IEEE Transactions on Intelligent Transportation Systems, Vol. 6,
No. 2, 2005, pp. 125–137.
[57] Bar-Shalom, Y. and Campo, L., “The Effect of the Common Process Noise on the
Two-Sensor Fused-Track Covariance,” IEEE Transactions on Aerospace and Electronic
Systems, Vol. AES-22, No. 6, 1986, pp. 803–805.
[58] Bar-Shalom, Y., “On the Track-to-Track Correlation Problem,” IEEE Transactions on
Automatic Control, Vol. 26, No. 2, 1981, pp. 571–572.
[59] Khawsuk, W. and Pao, L. Y., “Decorrelated State Estimation for Distributed Tracking
of Interacting Targets in Cluttered Environments,” Proceedings of the 2002 American
Control Conference, Vol. 2, Anchorage, May 8–10, 2002, IEEE, New York, 2002,
pp. 899–904.
[60] Parvin, B. and Medioni, G., “Segmentation of Range Images into Planar Surfaces by Split
and Merge,” Proceedings of International Conference on Computer Vision and Pattern
Recognition (CVPR 86) , Miami Beach, June 22–26, 1986, IEEE Computer Society Press,
Washington, DC, pp. 415–417.
[61] Xu, L., Krzyzak, A., and Suen, C., “Methods of Combining Multiple Classifiers and
Their Applications to Handwriting Recognition,” IEEE Transactions on Systems, Man, and
Cybernetics, Vol. 22, No. 3, 1992, pp. 418–435.
[62] Tulyakov, S., Jaeger, S., Govindaraju, V., and Doermann, D., “Review of Classifier
Combination Methods,” Machine Learning in Document Analysis and Recognition ,
S. Marinai and H. Fujisawa, Eds., Springer, Berlin Heidelberg, 2008, pp. 361–386.

[63] Kuncheva, L. I., Bezdek, J. C., and Duin, R. P., “Decision Templates for Multiple Classifier
Fusion: An Experimental Comparison,” Pattern Recognition , Vol. 34, No. 2, 2001,
pp. 299–314.
[64] Lee, D.-S. and Srihari, S., “A Theory of Classifier Combination: The Neural Network
Approach,” Proceedings of the Third International Conference on Document Analysis
and Recognition , 1995, Vol. 1, Montreal, August 14–16, 1995, IEEE, New York, pp. 42–45.
[65] Dempster, A., “A Generalization of Bayesian Inference,” Classic Works of the Dempster-
Shafer Theory of Belief Functions, R. Yager, and L. Liu, Eds., Springer, Berlin Heidelberg,
2008, pp. 73–104.
[66] Breiman, L., “Bagging Predictors,” Machine Learning, Vol. 24, No. 2, 1996, pp. 123–140.
[67] Schapire, R. E., “A Brief Introduction to Boosting,” Proceedings of the 16th International
Joint Conference on Artificial Intelligence, IJCAI ’99, Vol. 2, Stockholm, July 31–August 6,
1999, Morgan Kaufmann, San Francisco, CA, 1999, pp. 1401–1406.
[68] Cardarelli, E., Sabattini, L., Secchi, C., and Fantuzzi, C., “Multisensor Data Fusion for
Obstacle Detection in Automated Factory Logistics,” Proceedings of the IEEE
International Conference on Intelligent Computer Communication and Processing, Cluj-
Napoca, Romania, September 4–6, 2014, IEEE, New York, 2014, pp. 221–226.
[69] Xu, L., Krzyzak, A., and Suen, C., “Methods of Combining Multiple Classifiers and
Their Applications to Handwriting Recognition,” IEEE Transactions on Systems, Man, and
Cybernetics, Vol. 22, No. 3, 1992, pp. 418–435.
[70] Kuncheva, L. I., Bezdek, J. C., and Duin, R. P., “Decision Templates for Multiple
Classifier Fusion: An Experimental Comparison,” Pattern Recognition , Vol. 34, No. 2,
2001, pp. 299–314.
[71] Sabattini, L., Digani, V., Lucchi, M., Secchi, C., and Fantuzzi, C., “Mission Assignment for
Multi-Vehicle Systems in Industrial Environments,” Proceedings of the IFAC Symposium
on Robot Control (SYROCO) , Salvador, Brazil, August 26–28, 2015, IFAC–PapersOnLine,
Vol. 48, No. 19, 2015, pp. 268–273.
[72] Digani, V., Caramaschi, F., Sabattini, L., Secchi, C., and Fantuzzi, C., “Obstacle Avoidance
for Industrial AGVs,” Proceedings of the IEEE International Conference on Intelligent
Computer Communication and Processing (ICCP) , Cluj-Napoca, Romania, September
4–6, 2014, IEEE, New York, 2014, pp. 227–232.


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150053

Daniel Theobald 1 and Frederik Heger1

The Safety-to-Autonomy Curve: An Incremental Approach to Introducing
Automation to the Workforce
Citation
Theobald, D. and Heger, F., “The Safety-to-Autonomy Curve: An Incremental Approach to
Introducing Automation to the Workforce,” Autonomous Industrial Vehicles: From the
Laboratory to the Factory Floor, ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM
International, West Conshohocken, PA, 2016, pp. 82–90, doi:10.1520/STP159420150053

ABSTRACT
An explicit focus on safety first can help accelerate the adoption of robotics
and intelligent automation technology across many sectors of the economy.
The safety-to-autonomy approach to introducing automation in the workplace
provides a glide path that allows businesses to leverage the benefits of new
technology at the pace they are comfortable with and that they can sustain,
while realizing return on investment from day one. The transition to large-scale
adoption of intelligent automation starts with sensorized industrial equipment
that provides active safety features to immediately curb accident rates. They
continuously collect data and model how humans use them in existing processes.
As businesses become comfortable with these augmented machines, additional
autonomous functionality can be enabled incrementally, supported by the
information learned from observing human operators over time. Focusing on
safety can provide the glide path for businesses to embrace the robot revolution
across all sectors.

Keywords
safety, automation, machine vision

Manuscript received June 15, 2015; accepted for publication August 12, 2015.
1 Vecna Technologies, Inc., 36 Cambridge Park Dr., Cambridge, MA 02140
2 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor on
May 26–30, 2015 in Seattle, Washington.



Dispelling the Big Bang Theory of Robotics


Robots are cropping up in every industry, from consumer goods to manufacturing
to hospitality, and everywhere in between. It is no secret that robots can change
the way we live and work, but the adoption of automated solutions has been slow
going. According to a recent study by Pricewaterhouse Coopers [1], 41 % of manu-
facturing companies surveyed said they do not currently use robotics technology,
and they do not intend to in the next three years. This study cites both the
perceived lack of cost effectiveness of robots (that robots displace workers and
lower morale) and the insufficient resources and expertise to operate the robots as
the reasons for companies’ reluctance to adopt them into the workforce.
These perceptions are based on the “big bang” theory of automation: that
robots can be introduced to a workplace suddenly and instantly achieve complete
automation for some processes. However, by introducing robots to the workplace
incrementally, a potentially negative impact on worker morale is mitigated, and
expertise can be acquired through on-the-job introduction to new tools. This
approach ultimately improves productivity.
How is it possible to introduce a robot over time? Through paying keen
attention to safety first.

The Safety-to-Autonomy Curve


When introducing automation and autonomy, safety must be the top priority. In
2013, the National Safety Council estimated that the total cost of wage and produc-
tivity losses due to safety-related death and injury was $188.9 billion in 2011 [2].
For example, 90 % of all forklifts will be involved in some type of accident during
their useful life, causing one-sixth of all workplace deaths, at tremendous cost to the
industry. So what does safety of human-operated machines have to do with robotics?
Robots use many sensors to understand and make sense of their environment
and act appropriately. The sensors that ultimately may give a robot its autonomous
capabilities can at first be used for basic safety monitoring features aboard equip-
ment already used by human operators.
For example, a robot is initially deployed as a safety-enabled operator-
controlled unit—much like a car is outfitted today with forward collision avoidance
and blind spot monitoring. All its sensors are enabled and collecting data, but its
ability to “act” on its environment is limited to alerting the human operator or to
bringing the vehicle to a safe stop. For example, a forklift that is fitted with sensors
for safety could perceive a human in its path and come to a full stop before the
operator could even react, potentially curbing the 110,000 major forklift accidents
that occur in the United States every year [3].
Fig. 1 shows results from work Vecna has done to identify humans in various
poses using monocular cameras. Adding this information as an overlay to existing
rear view or other assistive camera systems available for industrial equipment would
be a simple first step. Making the same information available to a safety system

FIG. 1 Pedestrian detection from monocular camera. For human-operated machines,


this information can be used to monitor blind spots and to warn the driver or
even to stop the machine. Robotic vehicles can behave appropriately and
predictably if they know that pedestrians are present.

installed on the equipment can do even better by removing the need for the
operator to continually interpret a marked-up video feed or other interface. That
augmented system could be configured to audibly alert the operator to the presence
of humans in the vicinity and also limit the equipment’s velocity in directions that
bring it close to detected pedestrians.
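A deliberately simplified sketch of that kind of velocity limiting is shown below.
Pedestrian positions are assumed to be expressed in the vehicle frame with +x the
direction of travel, and the distance thresholds are illustrative values only, not
figures taken from any safety standard.

    import math

    def speed_limit(pedestrians, v_max, stop_dist=1.0, slow_dist=5.0, cone_deg=60.0):
        """Cap the commanded speed based on detected pedestrians that lie
        inside a frontal cone around the direction of travel."""
        limit = v_max
        for px, py in pedestrians:
            dist = math.hypot(px, py)
            bearing = abs(math.degrees(math.atan2(py, px)))
            if bearing > cone_deg / 2.0:
                continue              # pedestrian not in the travel cone
            if dist <= stop_dist:
                return 0.0            # too close: bring the vehicle to a stop
            if dist < slow_dist:      # linear ramp between slow_dist and stop_dist
                limit = min(limit, v_max * (dist - stop_dist) / (slow_dist - stop_dist))
        return limit

For example, with v_max = 2.0 m/s, a pedestrian 3 m straight ahead limits the
command to 1.0 m/s, while one behind the vehicle leaves it unchanged.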
The simple presence of a pedestrian near industrial equipment is not necessar-
ily cause for concern. Much depends on what that person is doing, where she is
looking, and whether or not she is aware of the machine close by. Understanding a
detected person’s activity and gaze from sensor data can help to determine what
kind of warning or action is appropriate. Vecna has demonstrated technology for
detecting individuals’ and groups’ activities in imagery and video data. Fig. 2 shows
the system recognizing air marshalling gestures. This technology could be used
as a means for a pedestrian to command or interact with a piece of automated
equipment—whether the person is actively involved in the robot’s activity or whether
she is pursuing an unrelated activity that happens to bring her close to the machine.
Fig. 3 shows how a person’s activities and focus can be identified by combining
person detection with context information from the images.
Vecna’s robotic logistics solutions make use of a variety of sensors (including
Omni-Eye [see Fig. 4], which is currently under development) to support its safety-
first philosophy through maximum situational awareness (see Fig. 5).
From day one, a robot’s active safety features protect the human workforce and
reduce the cost of accidents. There is no risk of inadequate or undesirable autono-
mous behavior because human operators continue to be in control of the vehicle.
By sensorizing a piece of industrial equipment for safety, all sectors can prevent
death and injury, damage, and delays and can save billions of dollars annually.
In addition, intelligently safe machines with capabilities such as those described

FIG. 2 The ability to detect gestures of people in the environment allows anyone to
command or interact with the robot.

here directly increase productivity in their facilities by allowing humans and robots
to work in close proximity. This approach avoids process waste that is introduced
when human and robot operations need to be physically separated.

Growing Return on Investment Beyond Safety


Sensorizing industrial equipment for safety implicitly includes computing capabil-
ities to process the sensor data. The processing power already available can provide

FIG. 3 Automated activity recognition augments the knowledge that humans are
nearby with additional information about their current activities. With this, the
system can evaluate whether they are aware of the machine operating near
them, or if they are occupied otherwise, and adjust its operation accordingly.


FIG. 4 OmniEye, a high-definition omnidirectional camera. Its high-definition,
ultra-wide imaging system provides 360-degree field-of-view for maximum
situational awareness around human-operated and autonomous machines alike.

more capabilities than just operational aids and awareness; additional features
can be easily and incrementally introduced to make vehicles more fully autono-
mous. The equipment is not just sensing the surroundings, it is making sense of
its environment ( Fig. 5).
Just like an office customer relationship management software suite, the con-
sumer invests in the basic package but, after realizing initial return on investment,
takes those cost savings to upgrade and turn on additional features—the same can
be achieved with sensorized industrial equipment. For example, forklift accidents
cost U.S. businesses $135 million annually. With the cost savings from reduced
accidents through a safety-enabled sensorized forklift, businesses can fund addi-
tional features toward full autonomy.

FIG. 5 Information from many different sensors is combined to understand the
environment around the machine.


In addition to monitoring the vehicle for safety reasons, collecting all that
sensor data also enables the system to learn how humans operate the machines in a
particular environment. Learning from demonstration or other supervised machine
learning approaches can then be used to provide tailored autonomy behavior for
specific environments. This concept is particularly powerful because robotics engi-
neers and developers of robot systems generally do not have firsthand experience
working in their target domains. Important and potentially unintuitive aspects
of tasks that expert operators perform naturally without even thinking about them
can be picked up and incorporated implicitly into the autonomous system. The
more “human-like” autonomous vehicles perform key aspects of their tasks, such
as picking up pallets or getting out of the way for emergency personnel, the more
they will be trusted and the easier it will be to incorporate them into existing
human-centric processes to boost productivity.
Vecna’s QC Bot® hospital logistics platform navigates autonomously through
spaces shared with humans, many of whom are not trained to interact with the
robot (see Fig. 6). By analyzing a human’s behavior and reactions as it travels
through the environment, the robot can modify its behavior to be as predictable as
possible in all situations.
With the safety of automated equipment ensured, advanced autonomy
features can be introduced incrementally. A business that has gradually become
accustomed to robots in its workforce and that has built up trust in the robots’
capabilities and functions will be much more willing to consider process modifi-
cations that allow even larger benefits and return on investment to be realized.
The robotic “chess game” shown in Fig. 7 is an example demonstrating such a

FIG. 6 Vecna’s QC Bot hospital logistics platform navigating autonomously and
reacting to pedestrians in its environment.


process change. Inputs to the system were desired moves of pieces around the
board. Humans provided the high-level input, and the system decomposed these
inputs as necessary into actionable parts for execution. Robots perform their
tasks while the operator is free to work on other tasks that require her cognitive
capabilities. She receives a notification when one move is complete and can then
choose the next one. Users of Vecna’s system move from having specific robots
perform tasks for them to assigning missions to the overall system and allowing
software to decide how the available resources should be applied to meet the
requests—thus the safety-to-autonomy curve, the glide path on which an indus-
try incrementally adopts robotic solutions.

Preparing for the New Robotic Economy


Naturally, robotics does not exist in a vacuum. Just as rules and regulations needed
to be created around sharing digital media online, building skyscrapers, and driving
cars, the next wave of policy will be around the use of automation.
In fact, the Federal Aviation Administration has proposed rules about flying
drones—and struggles with developing regulations for businesses that plan to use

FIG. 7 High-level planning and optimization can decompose abstract goals such as
“move the black king to square E5” into actions the robot(s) can execute. The
robot monitors execution and recovers autonomously or escalates to request
expert operator assistance.


drones as part of their business model [4]. This is just one example of the regulatory
realities our new robotic economy will face.
Policy needs time to catch up to the technology we already have at our finger-
tips. A recent report by the McKinsey Global Institute explains how policy makers
need time to understand economic trends and nuances in regions, innovations, and
exports in order to create the right policies that support manufacturing industries
and the role of automation within them [5]. In addition, human workers also need
time to adjust to sharing their work space with intelligent machines, and business
owners need time to innovate their work flows and processes with the help of
automation.
In addition, industry needs time to evolve as well as to establish and adopt
standards for both safety and interoperability. Robots from different vendors need
to be able to coexist in shared environments. They need to be able to integrate
seamlessly with existing infrastructure, such as elevators and automatic doors,
and to interact effectively with humans. This is only possible within a healthy
ecosystem of vendors, suppliers, and manufacturers who solve common problems
once, thus reducing barriers to adoption. MassRobotics (www.massrobotics.org)
is one initiative that aims to bring together vendors, researchers, and government
and commercial interests within the growing robotics sector to meet current
and future needs.

Safety-to-Autonomy as the Path to Rapid Adoption

The safety-to-autonomy approach ultimately improves productivity. It allows
policy makers, business owners, and the workforce to prepare for the new robotic
economy. In addition, it saves businesses from having to front investment for
capital expenditures associated with acquiring fully autonomous machines, for
training personnel on new equipment, for reconfiguring and refining new proc-
esses, and for learning and understanding new policy all at once. Instead, the
safety-to-autonomy approach allows businesses to move existing equipment and
personnel toward safely innovating processes in a slow but steady way that mini-
mizes disruption in delivering quality goods and services to customers while
saving money. Rather than waiting for the big bang theory of robotics to radically
and instantaneously change the status quo of how business is conducted, safety
can provide the glide path for businesses to embrace the robotic revolution across
all sectors.

ACKNOWLEDGMENTS
Vecna’s research and development efforts toward ensuring that its robot solutions
support the smooth safety-to-autonomy transition described in this paper were
supported by various research grants, including projects from the Office of Naval
Research, the Defense Advanced Research Projects Agency, and NASA.

References
[1] Pricewaterhouse Coopers, “The New Hire: How a New Generation of Robots is Trans-
forming Manufacturing,” PwC, New York, 2014, http://www.pwc.com/us/en/industrial-
products/assets/industrial-robot-trends-in-manufacturing-report.pdf (accessed April 30,
2016).
[2] National Safety Council, “Injury Facts,” National Safety Council, Itasca, IL, 2013, http://
www.mhi.org/downloads/industrygroups/ease/technicalpapers/2013-National-Safety-
Council-Injury-Facts.pdf (accessed April 30, 2016).
[3] Frane, D., “Top 10 Forklift Accidents,” Tools of the Trade, July 2013, http://www.
toolsofthetrade.net/jobsite-equipment/top-10-forklift-accidents.aspx (accessed April 30,
2016).
[4] Pilkington, E., “US Experts Join Companies Protesting FAA Commercial Drones
Proposals,” The Guardian , February 22, 2015, http://www.theguardian.com/world/2015/
feb/22/experts-companies-protest-faa-commercial-drones-proposals (accessed April 30,
2016).
[5] Manyika, J., Sinclair, J., Dobbs, R., Strube, G., Rassey, L., Mischke, J., Remes, J., Roxburgh,
C., George, K., O’Halloran, D., and Ramaswamy, S., “Manufacturing the Future: The
Next Era of Global Growth and Innovation,” McKinsey Global Institute, McKinsey & Com-
pany, Philadelphia, PA, 2012, http://www.mckinsey.com/insights/manufacturing/the_
future_of_manufacturing (accessed April 30, 2016).


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150056

Roger Bostelman, 1,2 Joseph Falco, 1 Mili Shah, 1,3 and Tsai-Hong Hong 1

Dynamic Metrology Performance Measurement of a Six Degrees-of-Freedom
Tracking System Used in Smart Manufacturing
Citation
Bostelman, R., Falco, J., Shah, M., and Hong, T.-H., “Dynamic Metrology Performance
Measurement of a Six Degrees-of-Freedom Tracking System Used in Smart Manufacturing,”
Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, ASTM STP1594,
R. Bostelman and E. Messina, Eds., ASTM International, West Conshohocken, PA, 2016,
pp. 91–105, doi:10.1520/STP159420150056

ABSTRACT
Multi-camera motion capture systems are commercially available and typically are
used in the entertainment industry to track human motions for video gaming and
movies. These systems are proving useful as ground truth measurement systems to
assess the performance of robots, autonomous ground vehicles, and assembly
tasks in smart manufacturing. In order to be used as ground truth, the accuracy of
the motion capture system must be at least ten times better than a given system
under test. This chapter presents an innovative artifact and test method to measure
the accuracy of a given motion capture system. These measurements will then be
used to assess the performance of the motion capture system and validate that it
can be used as ground truth. The motion capture system will then serve as ground
truth for evaluating the performance of an automatic guided vehicle (AGV) with an
onboard robot arm (mobile manipulator) and for evaluating the performance of
robotic workstation assembly tasks that utilize robot arms and hands.

Manuscript received June 19, 2015; accepted for publication October 13, 2015.
1 National Institute of Standards and Technology, 100 Bureau Dr., Gaithersburg, MD 20899-8230
2 IEM, Le2i, Université de Bourgogne, BP 47870, 21078 Dijon, France
3 Department of Mathematics and Statistics, Loyola University Maryland, 4501 N. Charles St., Baltimore, MD 21210-2699
4 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor on May 26–30, 2015 in Seattle, Washington.


Keywords
dynamic, performance measurement, robot, tracking system

Introduction
Numerous optical tracking systems, including motion capture systems, have been
developed in research centers and commercialized. Over the past several years,
these systems have gained enormous market share [1] in the entertainment indus-
try, neuroscience, biomechanics, flight and medical training, and in simulations
[2–8]. As a result, there have been several advances in improving the accuracy of
such human motion capture systems, as documented in two surveys: the first [2]
analyzes research up to the year 2000, and the second [3] analyzes research from 2000
to 2006; a 2013 overview covers the history of motion capture systems [4]. These
surveys cite more than 350 articles with topics such as novel methodologies for
automatic initialization, reliable tracking and pose estimation in natural scenes, and
movement recognition.
Tracking systems also have been used in the field of robotics [5–10]. Specific
applications have included programming by demonstration, imitation, tele-operation,
activity or context recognition, and humanoid designs. This chapter presents yet
another use for these systems in robotics: to provide ground truth for assessing the
performance of robot and robot vehicle motion. Specifically, this chapter focuses on a
test method to validate the accuracy of a tracking system within the work volume of a
given robotic system under test by using a novel metrology bar artifact. This method
will ensure that the tracking system is capable of providing the necessary measure-
ment uncertainty to be used as ground truth by guaranteeing the tracking system is at
least an order of magnitude better than the expected performance of the given robotic
system under test.
As the field of robotics advances and expands to new application spaces, such
as assembly, performance measures are needed to fully understand robot capabil-
ities. Tracking systems that can provide ground truth measurement for dynamic
robots are critical for supporting robot performance evaluation. The National
Institute of Standards and Technology (NIST) conducts research on the safety and
performance of robot arms and hands, automatic guided vehicles (AGVs), and inte-
grated systems such as those comprised of arm, hand, and perception components,
as well as collaborating robots, in support of standards development. The International
Organization for Standardization (ISO) 9283:1998 [11] and the American National
Standards Institute/Robotic Industries Association (ANSI/RIA) 15.05 [12] are available
standards used to assess the performance of an industrial robotic arm as an
individual unit. Standards from the recently formed ASTM Committee F45 on Driverless
Automatic Guided Industrial Vehicles [13] will be used to assess the performance of AGVs.
It is predicted that future smart manufacturing systems will include robot arms
performing high-tolerance assembly tasks, AGVs making fine adjustment of docking


positions, and complex coordination of mobile manipulators (i.e., robot arms


mounted on mobile bases) that offer the combination of high mobility and manipu-
lability. For example, an ideal utilization of the kinematic redundancy in the mobile
manipulator is to perform assembly tasks on a moving vehicle body [6,7,14]. It is
also predicted that next-generation robot systems will be more flexible with multiple
degree-of-freedom “robotic hands” and will provide levels of versatility and control
closer to that of a human. This flexibility will enable much more rapid retasking,
making robotics a viable alternative to support small- and medium-sized manufac-
turers. As such, ground truth measurement systems that can capture system-level
robot performance will aid researchers in evaluating robot designs, developments,
capabilities, and standard task (pose and motion) performance.
This chapter discusses a test method for six degree-of-freedom (6DOF) track-
ing systems and a new design for relative pose, error/uncertainty artifacts used to
compare systems under test. Experiments using the artifacts in two test spaces (i.e.,
a robot arm space and a larger space used for AGV testing) and using two different
tracking systems are then discussed. Data analysis and results follow.

Test Method for Tracking Systems


We present a test method to validate tracking system measurement errors/
uncertainties (standard deviation). Currently, manufacturer calibration methods
leave the system uncertainties within the operational space of a robotic
work volume unknown. One of the goals of the method is to provide assurance that
measurements made by the tracking system are at least an order of magnitude better than
the expected performance of a given robotic system under test. If measures do not
meet these expectations, then the tracking system must be reconfigured and
recalibrated to satisfy the intended benchmarking requirements in order to be used as
ground truth.
The test method looks for errors, in terms of position and angle as measured by the
tracking system, in the fixed configuration of two marker clusters mounted on
opposing ends of a metrology bar, called the artifact. Two measurement types are
performed within the test method: the static measure reports errors with the artifact
placed at fixed locations within the measurement space, and the dynamic measure
reports errors as the artifact is moved about the entire measurement space.
This test method was implemented in two NIST robot testbeds, each retrofitted
with a passive marker optical tracking system for ground truth measures. The first
implementation is for an AGV with an onboard robot arm (mobile manipulator) to
develop performance measures for AGVs and mobile manipulation. Safety and per-
formance test method developments are frequently reported to the AGV industry
and are used as reference to propose revisions to the ANSI/Industrial Truck
Standards Development Foundation B56.5 AGV safety standard [15] and the
ASTM Committee F45 AGV performance standard. The second is for a tabletop
robotic work cell being used to develop performance measures for perception,
grasping, and assembly.

Two metrology bars, 620 mm and 320 mm in length, were used, each having
five reflective markers attached to prongs on each end (see Fig. 1 ). The metrology
bars were used to measure the tracking system measurement uncertainties within
the vehicle lab and robotic work cell, respectively. The metrology bar markers on
each end form two planes perpendicular to the bar that define the bar length. The
bar length was shortened for the robotic work cell in an attempt to maximize
metrology bar movement. Carbon fiber bars were chosen based on a combination
of cost and reduction of the effects of thermal expansion on the position uncer-
tainty. The latter is defined by using the standard metrics that were developed in
ASTM E2919, Standard Test Method for Evaluating the Performance of Systems that
Measure Static, Six Degrees of Freedom (6DOF) Pose [16].
Actual positions and motions were only approximated because the metrology
bar was randomly held and moved by a person throughout the test spaces. For the
vehicle lab/static case experiments, the 620-mm metrology bar initially was placed
in the center of the space, approximately 1.5 m above the floor. For the vehicle lab/
dynamic case experiments, the bar was carried by a researcher at a height of
approximately 2.5 m above the floor (i.e., overhead) and walked in a raster scan pat-
tern throughout the space to maximize coverage. Note that the approximate height
of the AGV navigation sensor is 2.1 m above the floor. Similarly, for the robotic
work cell/static case experiments, the 320-mm bar was placed approximately 0.2 m
above a table. For the robotic work cell/dynamic case experiments, the bar was
moved by a researcher throughout the volume created by the camera fields of view,
a volume reachable by a robot arm to be mounted within the space. Velocities of bar

FIG. 1 NIST metrology bars, (a) 620 mm long and (b) 320 mm long, used to measure
static and dynamic ground truth system uncertainty. The bars are sitting on a
holder that is on the NIST reconfigurable mobile manipulator apparatus.


motion also were not measured and are approximated at a slow walk, perhaps
0.5 m/s, for the vehicle lab and between 0.5 m/s and 1 m/s for the robotic work
cell. Future measurements of the vehicle lab will include programmed vehicle
movement of the metrology bar throughout the space.

Measurement of a Tracking System


This section describes a test method that provides a set of statistically based perform-
ance metrics and test procedures to quantitatively compute the performance of a given
6DOF optical tracking measurement system. Specifically, the test method looks for
variations in the measurements from an optical tracking system of two marker sets rig-
idly attached to opposing ends of a metrology bar as shown in Fig. 1 . These variations
are then decomposed into uncertainties (position and angle errors) as outlined here.
Two measurement types, static and dynamic, are performed within this test method.
The static measurement reports system errors as the artifact is statically placed in dif-
ferent locations within the entire workspace, and the dynamic measurement reports
system errors as the artifact is moved about the entire workspace. The resulting system
errors are used to calculate measurement statistics outlined as follows.

POSITION AND ANGLE ERRORS/UNCERTAINTY


The position and angle errors are defined in the following way: For each instance of
time t, the optical tracking system outputs the left object pose and the right object
pose of the metrology artifact. The pose error is then defined as the difference
between the left object pose and the right object pose at time t and represented as
the homogeneous matrix:
\hat{H}(t) = \begin{bmatrix} \hat{R}(t) & \hat{T}(t) \\ 0 & 1 \end{bmatrix} \qquad (1)

where \hat{R}(t) is a 3 by 3 rotation matrix representing the orientation of the object at
time t and \hat{T}(t) is a 3 by 1 translation vector representing the position of the object

at time t. The ground truth of the relative pose is assumed to be known and meas-
ured by a coordinate measuring machine and represented as the homogeneous
matrix:
H(t) = \begin{bmatrix} R(t) & T(t) \\ 0 & 1 \end{bmatrix} \qquad (2)

where R(t) is the 3 by 3 rotation matrix representing the known orientation of the
relative pose and T(t) is the 3 by 1 translation vector representing the known position
of the relative pose.
The position error, e_T, can then be computed as follows:

e_T = \left| \, \lVert T(t) \rVert - \lVert \hat{T}(t) \rVert \, \right| = \left| \, \text{Length of } T - \text{Length of } \hat{T} \, \right| \qquad (3)


The rotation error matrix can be computed as follows:


\Delta R = R^{-1} \cdot \hat{R}(t) = R^{T} \cdot \hat{R}(t) \qquad (4)

The angle error of \Delta R can then be computed as [12]:

e_{\mathrm{RelAngle}}(t) = \cos^{-1}\left( \frac{\operatorname{trace}(\Delta R(t)) - 1}{2} \right), \qquad 0 \le e_{\mathrm{RelAngle}}(t) < \pi \qquad (5)

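For concreteness, Eqs 1–5 reduce to a few lines of code. The following is a minimal
sketch, not the authors' software, assuming each relative pose is available as a 4 by 4
homogeneous matrix (a NumPy array):

import numpy as np

def pose_errors(H_gt, H_meas):
    # H_gt: ground truth relative pose H(t) from the coordinate measuring
    # machine; H_meas: the tracking system's estimate at the same instant.
    R, T = H_gt[:3, :3], H_gt[:3, 3]
    R_hat, T_hat = H_meas[:3, :3], H_meas[:3, 3]
    # Eq 3: position error as the difference of translation norms (bar lengths)
    e_T = abs(np.linalg.norm(T) - np.linalg.norm(T_hat))
    # Eq 4: rotation error matrix; R is orthonormal, so R^-1 = R^T
    dR = R.T @ R_hat
    # Eq 5: residual rotation angle; the clip guards against floating-point
    # round-off pushing the cosine marginally outside [-1, 1]
    e_angle = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
    return e_T, e_angle  # angle in radians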
UNCERTAINTY STATISTICS
The error statistics from the position error and angle error can be calculated as:
1. Computing the average error:
\bar{e} = \frac{1}{N} \sum_{k=1}^{N} e_k \qquad (6)

2. Computing the standard deviation of the errors:


\sigma = \sqrt{ \frac{ \sum_{k=1}^{N} ( e_k - \bar{e} )^2 }{ N - 1 } } \qquad (7)

3. Computing the maximum of the errors:


e_{\max} = \max( e_1, e_2, \ldots, e_N ) \qquad (8)

Here, N is the number of poses collected.
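The statistics of Eqs 6–8 are equally compact; this sketch (again illustrative, with
hypothetical error values) passes ddof=1 so that NumPy applies the N - 1 denominator
of Eq 7:

import numpy as np

def error_statistics(errors):
    # Mean (Eq 6), sample standard deviation (Eq 7), and maximum (Eq 8)
    e = np.asarray(errors, dtype=float)
    return e.mean(), e.std(ddof=1), e.max()

# Hypothetical position errors (mm) from N = 5 collected poses:
print(error_statistics([0.021, 0.018, 0.025, 0.019, 0.022]))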

Experimental Spaces and Equipment


VEHICLE LAB AND VEHICLE
A relatively large, 9 m by 22 m lab at NIST is used to research the safety and per-
formance of AGVs (see Fig. 2). An AGV approximately 3 m wide by 8 m long by
2 m high and weighing approximately 1137 kg is moved along paths or segments
and is positioned at specified points for docking purposes.
Performance measurements were made of vehicle uncertainty (e.g., segment
deviation) when navigating along segments and when stopped at points (e.g.,
docking repeatability, docking accuracy).
VEHICLE LAB MOTION CAPTURE SYSTEM
A multi-camera ground truth (GT) system5 referred to as “GT1” was set up in the
AGV lab. Twelve cameras were mounted to the four lab walls at a height of 4.3 m
5 Disclaimer: Commercial equipment and materials are identified in order to adequately specify certain procedures. In no case does such identification imply recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that the materials or equipment identified are necessarily the best available for the purpose.


FIG. 2 AGV test lab (left) and screenshot of the multi-camera ground truth system
space showing cameras and the AGV rigid body (right).


above the floor and used to view the area where the experiments are performed.
Cameras have 4.1 megapixel resolution, 120 frames per second, and a 51° field of
view with focus and aperture-opening adjustments. Eighteen markers are grouped
into a rigid body, as shown in Fig. 2, and tracked by the GT1 system.
ROBOT TEST SPACE AND ROBOTS
In comparison, a relatively small 2 m by 2 m robot test space (see Fig. 3 ) at NIST is
used to research the safety and performance of collaborative robot arms and
advanced, multi-fingered robotic hands. Findings are frequently reported to the
industrial robot industry and used as a reference to propose revisions to the RIA
15.06 [17], ISO 10218-1, -2 [11] safety standards subcommittees, and to the
robotic hands research community through NIST’s robot hand performance test
portal [8].

FIG. 3 Robot test space showing the GT2 cameras mounted to a frame above the
space and the 320-mm metrology bar (centered) on a red bar holder.


ROBOT TEST SPACE MOTION CAPTURE SYSTEM


A multi-camera GT system referred to as “GT2” was set up in the robot test lab.
Fig. 3 shows the robot test space, the GT2 cameras mounted to a frame above the
space, and the 320-mm metrology bar (centered). Eight cameras were mounted to
the robot test space frame approximately 1.5 m above the robot base mount surface
and view the volume where the robotic experiments are performed. Cameras have
1 megapixel resolution, 120 frames per second, and 4 mm to 12 mm zoom with
zoom, focus, and aperture-opening adjustments.

Results
We found no published nominal uncertainties for the GT1 tracking system from
the manufacturer because they describe the system uncertainty as “sub-millimeter.”
The GT2 system manufacturer published the uncertainties as 0.5 mm or more of
translation and 0.5° of rotation in a 4 m by 4 m volume using 9-mm diameter
markers. The descriptions also do not include procedures for ensuring traceability
of measurement uncertainty.
We tested both tracking systems using the NIST developed test method described
earlier. We tested the tracking systems in their calibrated states. We provide both the
GT1 and GT2 tracking distance and angle uncertainties in the following subsections.
VEHICLE LAB MEASUREMENTS
The GT1 system was first calibrated by mainly adjusting the focus on the cameras
in the system. After calibration, the system was measured using the 620-mm
metrology bar over an approximately 10 m wide by 8 m long workspace at the lab
center where most of the AGV testing is performed. The metrology bar was placed
at the workspace center. Analysis shows the average measurement uncertainty of the
static metrology bar length was σ = 0.02 mm, and for the static angle it was
σ = 0.05°. In this experiment, N (number of poses) is greater than 30,000 points.
The metrology bar was then moved throughout the workspace. The dynamic
metrology bar position uncertainty was calculated as σ = 0.26 mm, and the dynamic
angle uncertainty was calculated as σ = 0.20°. Fig. 4a shows the dynamic bar length
uncertainty, and Fig. 4b shows the dynamic angle uncertainty. Each block in the
figure graphs uses natural-neighbor interpolation to obtain its value.
ROBOT SPACE MEASUREMENTS
In contrast to the GT1 system, the GT2 system used a 320-mm-long metrology bar.
Similar to the GT1 system, calibration consisted of adjusting the zoom, focus, and
aperture of the cameras in the system.
The metrology bar was placed at the workspace center. Analysis shows average
measurement uncertainty over three runs of the static metrology bar length was
σ = 0.004 mm and, for the static angle, σ = 0.006°. The bar was then moved
throughout the entire robot work volume. The dynamic position uncertainty was
σ = 0.60 mm, and the dynamic angle uncertainty was σ = 0.29°. Fig. 5a shows the

FIG. 4 GT1 data captured from the 620-mm metrology bar (a) length and (b) angle
within the AGV lab.



FIG. 5 GT2 data captured of the 320-mm metrology bar (a) length and (b) angle
within the robot space.


FIG. 6 Mobile manipulator being tested using the NIST RMMA.


metrology bar length dynamic uncertainty data, and Fig. 5b shows the bar angle
dynamic uncertainty data.
Interestingly, we noticed a degradation of uncertainty in the robot space on
consecutive dynamic test runs. This behavior is currently being investigated.
MOBILE MANIPULATOR MEASUREMENTS
A recent application for the calibrated GT1 system was to measure the performance
of a mobile manipulator, a robot arm installed onboard the AGV, as shown in Fig. 6.
A NIST reconfigurable mobile manipulator artifact (RMMA) was developed as a
possible concept for comparing ground truth technologies such as tracking systems,

FIG. 7 Screen captures from the GT1 system showing the AGV, manipulator, and the
RMMA rigid bodies formed from markers on each device.


FIG. 8 GT data points relative to the GT system origin (in mm) of (a) the stationary
AGV and (b) the RMMA movement over time (in minutes), shown by the
varying colors, while the manipulator moves.


laser trackers, and so on. The RMMA is further detailed in Bostelman, Hong, and
Cheok [6]. Experiment 1 included placing a mobile manipulator next to the
RMMA and then moving the manipulator to various points on the RMMA as
shown in Fig. 6. Results showed the average position uncertainty (calibration)
between the RMMA and AGV to be x = 0.07 mm and y = 0.02 mm, both being near
the static measurement range of the GT1 system (i.e., σ = 0.02 mm and σ = 0.05°).
However, further experiments were performed and suggested surprising results.
Experiment 2 measured the uncertainty of the static AGV when the manipulator
uses noncontact positioning above the RMMA points. The AGV, RMMA, and ma-
nipulator were measured using a single ground truth system, GT1, resulting in
motion tracking and relative measurements of the components. Experiment 3
measured the static RMMA movement during the noncontact Experiments 1 and 2.
A screenshot of the rigid bodies formed in the GT1 system is shown in Fig. 7, and
uncertainty results are shown in Fig. 8.
Experiments 2 and 3 proved that both the AGV and the RMMA were moving
even while the AGV was stopped and only the manipulator was moving. This occurred
despite the fact that the AGV weight was nearly 40 times that of the manipulator,
with tests conducted on the ground level with concrete flooring. Results show that
position uncertainty spans from approximately 0.15 mm in x and 0.25 mm in y for
the AGV to 0.5 mm in x and 0.6 mm in y for the RMMA. These results showed that
the ground truth optical tracking measurement system used in the mobile
manipulator experiments was accurate enough to detect motion of a static table
(RMMA) and of a relatively heavy vehicle due to onboard lightweight manipulator
motion. When these uncertainties are combined in quadrature (e.g., sqrt(0.15^2 + 0.5^2) ≈ 0.52 mm),
maximum uncertainties can be σ = 0.52 mm in x and σ = 0.65 mm in y, which could
induce enough position offset of the manipulator to affect the results of manufacturing
operations, such as a relatively high-tolerance assembly operation.

Conclusions
Multi-camera motion capture systems are now commercially available, and their
application as ground truth systems for robots and vehicles is on the horizon.
This chapter describes a test method and metrics for evaluating and validating
tracking system calibration within the operational space of a robotic work volume.
The goal of the method is to provide assurance that measurements made by the
tracking system are at least an order of magnitude better than the expected
performance of the robotic system under test. The test method used is exemplified
on two different motion capture systems each in a different size workspace. An
example application of one system was used to measure the performance of an
AGV with an onboard robot arm (mobile manipulator). Experiments on a mobile
manipulator showed that tracking systems in large spaces can even measure small
wall, floor, and equipment movements despite their static conditions. This test
method and metrics can be used to measure and analyze the performance of any

tracking system that computes the pose of an object while the object is moving.
It also helps characterize the performance of optical tracking systems, providing a
basis for their improvement.

ACKNOWLEDGMENTS
The authors would like to thank Sebti Foufou, Qatar University, Doha, Qatar, for his
guidance in this research.

References
[1] “Motion Capture Software Developers in the US: Market Research Report,” IBISWorld, 2014,
http://www.ibisworld.com/industry/motion-capture-software-developers.html (accessed
May 21, 2015).
[2] Moeslund, T. B. and Granum, E., “A Survey of Computer Vision-Based Human Motion
Capture,” Computer Vision and Image Understanding, Vol. 81, No. 3, 2001, pp. 231–268.
[3] Moeslund, T. B., Hilton, A., and Krüger, V., “A Survey of Advances in Vision-Based Human
Motion Capture and Analysis,” Computer Vision and Image Understanding, Vol. 104,
No. 2, 2006, pp. 90–126.
[4] Fischer, R., “History of Motion Capture,” 2013, http://motioncapturesociety.com
(accessed August 10, 2013).
[5] Field, M., Stirling, D., Naghdy, F., and Pan, Z., “Motion Capture in Robotics Review,”
Proceedings of the IEEE International Conference on Control and Automation (ICCA),
Christchurch, New Zealand, 2009, pp. 1697–1702.
[6] Bostelman, R. V., Hong, T.-H., and Cheok, G., “Navigation Performance Evaluation for
Automated Guided Vehicle,” Proceedings from the 7th Annual IEEE International Con-
ference on Technologies for Practical Robot Applications (TePRA), Boston, MA, 2015.
[7] Bostelman, R., Hong, T.-H., and Marvel, J., “Performance Measurement of Mobile Manipu-
lators,” Proceedings of the SPIE-DSS Commercial Sensing Conference, Baltimore, MD,
April 20–24, 2015.
[8] National Institute of Standards and Technology Engineering Laboratory, “Performance
Metrics and Benchmarks to Advance the State of Robotic Grasping,” National Institute
of Standards and Technology, Gaithersburg, MD, 2014, http://www.nist.gov/el/isd/
grasp.cfm (accessed April 1, 2015).
[9] Yang, P. F., Sanno, M., and Bruggemann, G. P., “Evaluation of the Performance of a
Motion Capture System for Small Displacement Recording and a Discussion for Its
Application Potential in Bone Deformation in Vivo Measurements,” Proceedings of the
Institution of Mechanical Engineers, Vol. 226, No. 11, 2012, pp. 838–847.
[10] Summan, R., Pierce, S. G., Macleod, C. N., Dobie, G., Gears, T., Lester, W., Pritchett, P., and
Smyth, P., “Spatial Calibration of Large Volume Photogrammetry Based Metrology Sys-
tems,” Journal of Measurement, Vol. 68, 2015, pp. 189–200.


[11] ANSI/RIA 15.05-2, “Industrial Robots and Robot Systems—Path-Related and Dynamic
Performance Characteristics,” American National Standards Institute (ANSI), Washing-
ton, DC, 1992, www.ansi.org
[12] ANSI/RIA 15.05, American National Standards Institute (ANSI), Washington, DC, 1999,
www.ansi.org
[13] ASTM, “Committee F45 on Driverless Automatic Guided Industrial Vehicles,” ASTM Inter-
national, West Conshohocken, PA, 2014, http://www.astm.org/COMMITTEE/F45.htm
(accessed April 1, 2015).
[14] Hamner, B., Koterba, S., Shi, J., Simmons, R., and Singh, S., “Mobile Robotic Dynamic
Tracking for Assembly Tasks,” Proceedings of the 2009 IEEE/RSJ International Confer-
ence on Intelligent Robots and Systems, St. Louis, MO, October 10–15, 2009.
[15] ANSI/ITSDF B56.5, Safety Standard for Driverless, Automatic Guided Industrial Vehicles
and Automated Functions of Manned Industrial Vehicles, ANSI, Washington, DC, 2014,
www.itsdf.org
[16] ASTM E2919-14, Standard Test Method for Evaluating the Performance of Systems That
Measure Static, Six Degrees of Freedom (6DOF), Pose, ASTM International, West
Conshohocken, PA, 2014, www.astm.org
[17] ANSI/RIA R15.06-2012, American National Standard for Industrial Robots and Robot
Systems—Safety Requirements, Robotic Industries Association, Ann Arbor, MI, 2013,
http://www.robotics.org


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150050

Zdenko Kovačić, 1 Michael Butler, 2 Paolo Lista, 3 Goran Vasiljević, 1 Ivica Draganjac, 1
Damjan Miklić, 1 Tamara Petrović, 1 and Frano Petric 1

Harmonization of Research and Development Activities Toward Standardization
in the Automated Warehousing Systems
Citation
Kovačić, Z., Butler, M., Lista, P., Vasiljević, G., Draganjac, I., Miklić, D., Petrović, T., and Petric, F.,
“Harmonization of Research and Development Activities Toward Standardization in the
Automated Warehousing Systems,” Autonomous Industrial Vehicles: From the Laboratory to
the Factory Floor, ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM International, West
Conshohocken, PA, 2016, pp. 106–128, doi:10.1520/STP159420150050

ABSTRACT
In this chapter, we describe some ideas of robotic system standardization based on
ongoing research and development processes in a European FP7 project named
EC-SAFEMOBIL, which is focused on estimation and control technologies for safe,
wireless, high-mobility cooperative systems. Strongly influenced by the European
Commission's demand to commercialize as many project results as possible,
EC-SAFEMOBIL researchers and developers needed standards to follow for the main
project application areas: unmanned aerial systems (UAS) and automated
warehousing systems (AWS). Although many aspects of UAS are
covered by adequate standards, this does not hold true for automated warehouses.
In the given analysis of possible standardization of automated warehousing
systems, we elaborate on ideas on how to overcome evident gaps between
academic achievements and viable industry practice. Paying particular attention to
process and development standards, as well as function-specific standards, we

Manuscript received June 14, 2015; accepted for publication August 12, 2015.
1 University of Zagreb, Electrical Engineering and Computing, Unska 3, 10000 Zagreb, Croatia
2 Selex ES Ltd., Sigma House, Christopher Martin Road, Basildon, Essex SS14 3EL, UK
3 Euroimpianti SpA, Via Lago di Vico 80, 36015 Schio, Italy
4 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor on May 26–30, 2015 in Seattle, Washington.


describe our view of reaching new standards in automated warehousing systems,


particularly with a number of deployed automated guided vehicles (AGVs). This
involves adopting or extending existing standards from other application areas
(UAS), creating new ones, and defining standard benchmark tests. We have
proposed a few benchmark scenarios for testing two system functionalities—
marker-less indoor localization and distributed control.

Keywords
warehouses, automated guided vehicles, localization, distributed control,
benchmark tests, standardization

Introduction
Logistical systems for automated warehouses have been in existence for a relatively
long time, with an apparent lack of universally recognized standards. This report
attempts to explain the development of a practical pathway toward systematic
standardization, based mainly on the experience gained through a decade-long
collaboration of academia and industry and quite recently through an international
collaboration via an FP7 project named EC-SAFEMOBIL, funded by the European
Commission within the FP7 research framework [1]. Here, EC-SAFEMOBIL stands
for a collaboration composed of partners from academia, research institutes,
original equipment manufacturers, and end users.
The EC-SAFEMOBIL project aims to develop estimation and control
technologies for safe, wireless, high-mobility cooperative systems, with specific focus
on unmanned aerial systems and autonomous warehouse robotic applications. Both
of these applications encompass complex system architectures, and the standards
that must be considered in a whole-system implementation such as this are
wide ranging.
The purpose of this document is to identify and describe existing international
standards that are of direct relevance to the development work being carried out
under the EC-SAFEMOBIL project (i.e., considering only the estimation and control
aspects of the larger, more complex encompassing applications). As such, automated
warehouse development partners within the program paid more attention to the
processes in the research and development phases, which led to a common, well-
defined, and constrained set of standards for the considered application area. These
standards can be separated into two classes: those that describe process and develop-
ment methods without prescribing design and implementation attributes, and
those that present design requirements or constraints. The latter are application/
environment specific, while the former are more generally applicable for systems that
are conceptually similar at a higher level.
When considering application-specific standards, the applications of interest
are primarily the technology test bed demonstrations for new estimation and
control technologies.

The document is organized as follows. After introductory material, a brief


description of automated warehousing systems is given with an emphasis on the
technology level present in academic research versus the technology level used in
state-of-the-art industrial solutions. The review of existing process and develop-
ment standards pays particular attention to the process guideline hierarchy for an
AGV system development cycle, where safety plays a key role. The analysis of exist-
ing standards and ongoing standardization activities finishes with function-specific
standards. Identifying the noncovered parts of autonomous distributed warehous-
ing, a discussion follows concerning the introduction of new standards in high-
precision localization and decentralized coordination of AGVs. Interlacing with
safety issues is treated through functional specifications for collision avoidance
along with communication and performance of AGVs. The final sections refer to
propositions of standard benchmark scenarios for autonomous distributed ware-
housing with a focus on standardization of marker-less localization and distributed
control technologies.

Automated Warehousing Systems


Automated warehousing systems equipped with AGVs represent the backbone of
material handling operations within manufacturing facilities and distribution termi-
nals. When developing such a complex system, one should treat the system as a
handful of smaller subsystems, as shown in Fig. 1 :
– Accurate localization and mapping of the warehouse environment
– Safe and reliable communication
– Collision avoidance
– Dynamic routing
– Deadlock prevention
– Safety measures
All subsystems presented in Fig. 1 are necessary for the safe and efficient opera-
tion of an automated warehouse. Accurate localization of the AGV needs to be main-
tained in all sections of the warehouse; mission control of AGV groups and dynamic
routing with safe trajectory planning are important issues to be resolved. As shown in
Fig. 2, a top-down hierarchy starts with a mission assignment. For most industrial
warehouse systems, missions are issued in a centralized manner, coming sequentially
from a central task dispatching unit. Missions can be assigned either by a human
operator or through acquired commands from integrated warehouse components
(e.g., loading and unloading stations). An alternative way is through decentralized
task dispatching, in which each industrial automated vehicle bids for new missions
and negotiates future mission assignments with its neighboring vehicles, as
sketched below.
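As an illustration only, the following minimal sketch collapses such a negotiation
into a single sealed-bid auction round: each vehicle computes a cost-based bid for
the announced mission, and the mission is awarded to the lowest bidder. The vehicle
names, cost model, and queue penalty are hypothetical, not part of any cited system.

from dataclasses import dataclass

@dataclass
class AGV:
    name: str
    x: float
    y: float
    queued: int  # missions already assigned to this vehicle

    def bid(self, pickup):
        # Hypothetical cost model: travel distance to the pickup point
        # plus a penalty for work already queued on this vehicle.
        distance = abs(self.x - pickup[0]) + abs(self.y - pickup[1])
        return distance + 10.0 * self.queued

def award(pickup, fleet):
    # Sealed-bid auction rule: the lowest bidder wins the mission.
    winner = min(fleet, key=lambda v: v.bid(pickup))
    winner.queued += 1
    return winner

fleet = [AGV("agv1", 0, 0, 2), AGV("agv2", 5, 5, 0), AGV("agv3", 9, 1, 1)]
print(award((6.0, 4.0), fleet).name)  # -> agv2, the closest idle vehicle

In a real deployment, the bids would be exchanged over the wireless network and
missions re-auctioned as conditions change; the sketch shows only the award rule.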
When considering the introduction of standards into processes for mission
assignment and coordination, safety measures are the highest system priority. Once
safety requirements are met, vehicle coordination, centralized or decentralized,
must ensure that the operations of all warehouse segments are deadlock and

FIG. 1 Main subsystems of an automated warehouse.

livelock free. The introduction of any new coordination algorithms raises the afore-
mentioned safety questions related to the adopted process of software development.
In this analysis, we do not focus on AGVs whose operation completely relies
on prepared floors for motion guidance. Instead, we focus on freely navigating
AGVs that are used in large-scale manufacturing and logistics applications. Freely
navigating AGVs mainly use laser scanners for navigation; these provide an

FIG. 2 The top-down hierarchy of automated warehouse subsystems.


accurate two-dimensional map of the actual environment for self-localization and


obstacle avoidance.
Low-level subsystems (as shown in Fig. 2) that involve path planning and path
following depend on the correct selection of path-planning and path-following
algorithms, resolution of on-board sensors, and most importantly the precision of
localization.

Academic Research Versus Industrial State of the Art
Annual reports of the International Federation of Robotics represent a reliable
source of information for all branches of the robotics industry. In the 2014 annual
report on service robots [2], one can find comments on AGVs in manufacturing
environments regarding their level of distribution:
Despite there being many suppliers, only few offer freely navigating AGVs (without
fixed wires or other methods of prepared floor). However, the trend toward
integrating advanced navigation capabilities on the basis of laser scanners or similar
sensors for geometric environmental recognition is a major source of innovation.
Information contained in this annual report [2] is related to cost-benefit
considerations and major restraints on the diffusion of AGVs in manufacturing
environments:
The main benefit of robotic solutions for factory logistics is the reduction of
the need for manual workers, especially for drivers of trucks, forklift trucks
and the like. The market for AGVs in the US was indicated to range from
US$40–100 million/year over the past 10 years which typically represented
30–80 systems. However it is expected that this market increases very signifi-
cantly over the next years as important preconditions for the investment into
AGVs will be increasingly met such as:
• Digitizing the factory floor. AGVs depend on digital data for their routing and
missions.
• Performance and flexibility increase of fully autonomous navigation without
installed markers or beacons.
In contrast to the impressive progress in robotics research, there is still a wide
gap between research achievements and their implementation (installation) in indus-
trial environments. As shown in Fig. 3, the gaps between academic research and
industrial applications are multilevel. Regarding the issue of precise indoor localiza-
tion of AGVs, two-dimensional pose estimation with laser scanners is effective
enough only with the use of artificial landmarks (markers, reflectors). At an academia
level, there are examples of a large number of indoor localization solutions using
Kalman filters, particle filters, and scan-matching algorithms [3–7], but the achieved

FIG. 3 Academic research (left) versus industrial state of the art (right).

precision is still an order of magnitude worse than is required for safe operation in
manufacturing environments. As a consequence of insufficiently good indoor local-
ization, path-planning algorithms are strongly affected by uneven terrain, bad trac-
tion, and slower motion. Consequently, current solutions in the industry assume that
the manufacturing facility environment is known prior to the design of facility
layouts such as docking stations, paths, turns, intersections, idle positions, and so on.
Academia can offer solutions such as different online planning strategies based on
rapidly-exploring random trees and A* algorithms for shortest path search [8], but
because of poor localization, the industry is still only using predefined paths.
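For illustration, the sketch below shows a grid-based A* shortest-path search of the
kind cited above. The occupancy grid, start, and goal are hypothetical, and an
industrial planner would additionally account for vehicle footprint, kinematics, and
traffic rules.

import heapq
from itertools import count

def astar(grid, start, goal):
    # A* on a 4-connected occupancy grid: 0 = free cell, 1 = occupied.
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance, an admissible heuristic on this grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = count()  # tiebreaker so the heap never compares cells or parents
    frontier = [(h(start), next(tie), 0, start, None)]
    parent, best_g = {}, {start: 0}
    while frontier:
        _, _, g, cell, prev = heapq.heappop(frontier)
        if cell in parent:          # already expanded via a cheaper route
            continue
        parent[cell] = prev
        if cell == goal:            # walk the parent chain back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), next(tie), ng, nxt, cell))
    return None  # no collision-free path exists

# Hypothetical 4 x 5 occupancy grid of warehouse cells:
grid = [[0, 0, 0, 1, 0],
        [1, 1, 0, 1, 0],
        [0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 4)))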
At higher control levels in relation to coordination and mission control, the
industry is applying centralized control solutions, while academia is turning more
to the decentralized solutions [9–13]. Centralized control requires reliable com-
munications with all AGVs. Practice has proven repeatedly that communication
losses happen in all systems, which directly affects the safety of multi-AGV systems.
The situation becomes worse when the number of vehicles in the system
increases because the density of data transferred through wireless communication
channels becomes a bottleneck for the whole control system and threatens the over-
all system functionality. As a result, when there are tens to hundreds of installed
AGVs, academia can introduce various optimization techniques, whereas the indus-
try still relies on simpler heuristics.

Academic Experimental Validation Versus Industrial Requirements
Qualitative discussions such as those noted here can be replaced by a quantitative
one that can give new insight into the gap between academia and industry. Refer-
ring to Table 1, based on our current knowledge, the required AGV pose accuracy

TABLE 1 Academic experimental validation versus industrial requirements.

Academic validation | Industrial requirements
Positioning accuracy ~0.1 m, 1° | Required accuracy < 0.01 m, < 0.5°
Typical experiment duration of several hours | 24/7 uninterrupted operation
Vehicle weight < 50 kg | Vehicle weight > 1000 kg
Differential or omnidirectional steering | Bicycle or car-like steering
Clean laboratory environment | Harsh industrial environments with electromagnetic noise
Different specialists available for tune-up and maintenance | Out-of-the-box solution for workers, available only for basic maintenance
Restricted access | Unrestricted access

of 1 cm and 0.5 degree achieved without markers is not yet reported. It is worth
comparing the weights of vehicles used for experiments in academia with the
weights of vehicles used by industry. One can conclude that, in warehousing, size
does matter because AGVs represent an evident threat to the environment in which
they operate (to people, to warehouse elements, stored goods, etc.).
Note: The first practical demonstration of 1 cm and 0.5 degree positioning
accuracy without markers was made by researchers from the University of Zagreb
in January 2015 during the EC-SAFEMOBIL consortium meeting in the manufac-
turing facility of Euroimpianti SpA (Schio, Italy). The novel AGV pose estimation
method is still being tested in different operating conditions, and publication of
results is expected at the end of the project (December 2015). One can find more
information about publicly performed experiments in Schio, Italy, by visiting the
EC-SAFEMOBIL Web page and relevant videos [1].

Process and Development Standards


For autonomous system development, although the specific functionality of each
application—and hence development—will be unique, certain considerations are
common; this is reflected in the relevant standards. The standards are typically
arranged in a hierarchical manner, with the system scope reducing at each level in
the hierarchy.
Fig. 4 shows the hierarchical relationship among the development standards for
system development that are broadly applicable to all autonomous systems. Some
of these will be discussed in greater depth in subsequent sections.

SAFETY
Any autonomous system must perform its task in a safe manner; consequently,
safety standards form the top level of the hierarchy from which all other process
standards originate.

FIG. 4 Process guideline hierarchy for AGV systems’ development cycle.

[Figure blocks, top to bottom: Safety Assessment Process Guidelines and Methods;
Intended AGV Function, Failure, and Safety Information; System Design; AGV and
System Development Processes; Guidelines for Integrated Modular AGV Systems;
Electronic Hardware Development Life Cycle; Software Development Life Cycle.]

SOFTWARE SAFETY
The standard IEC 61508 specifies safety-based requirements for developed software
in a traceable manner [14]. This international standard is intended to be a basic
functional safety standard that is applicable to all industries. Part 3 of this standard
deals with specific requirements software must comply with in order to ensure its
safety. In this standard, a safety life cycle is used as the basis for compliance with
the requirements. The IEC 61508 standard defines four safety integrity levels
(SILs), shown in Table 2, which determine risk-reduction levels related to a safety
function.

TABLE 2 Safety integrity levels defined in IEC 61508.

Safety integrity level | Probability of dangerous failure per hour (continuous mode of operation) | Risk reduction factor
SIL 4 | ≥ 10⁻⁹ to < 10⁻⁸ | 100,000 to 10,000
SIL 3 | ≥ 10⁻⁸ to < 10⁻⁷ | 10,000 to 1000
SIL 2 | ≥ 10⁻⁷ to < 10⁻⁶ | 1000 to 100
SIL 1 | ≥ 10⁻⁶ to < 10⁻⁵ | 100 to 10

A brief description of the main requirements regarding software is presented in the
following.
• Software Safety Requirements Specification: Specification of the functional
safety requirements, which must be clear, accurate, verifiable, testable, main-
tainable, and feasible.

• Software Safety Validation Planning: Plan to prove that the software satisfies
the safety requirements set in the specification. The plan must consider
required equipment, validation schedule, validation authority, modes of opera-
tion validation, foreseeable abnormal conditions, reference to safety require-
ments, and expected results.
• Software Design and Development: Includes the definition of major com-
ponents and subsystems of the software. The architectural design should
include the interconnection among components, techniques necessary to
satisfy requirements, software safety integrity levels of the components,
software-hardware interactions and tests performed to ensure safety integrity
of data, and so on.
• Integration and Testing: There must be tests applied to the integration
between the hardware and the software during the design and development
phases. It is necessary to define the test cases and data, the test environment,
tools and configuration, the test criteria, and the procedures for corrective
actions.
• Software Safety Validation : Checks to ensure the software design meets the
software safety requirements. This validation should be done in accordance
with the safety validation planning. The validation is done for each safety
function reporting a record of the validation activities, the version of the
validation plan, the safety function that is validated, the test environment, and
the results of the validation.
• Operation and Modification: Modifications should always be made under
authorization, taking into account the procedures described in the safety
planning phase.
• Software Verification: Tests the results of the consistency analysis of the soft-
ware safety life cycle phases.
• Software Functional Safety Assessment: Concludes the level of safety achieved,
including all phases of the safety life cycles.

SYSTEM DEVELOPMENT
System development covers the process from concept through to system implemen-
tation, including requirements capture and analysis, system design and implemen-
tation, and verification. Application of relevant standards for system development
will result in a more efficient program implementation and in increased acceptance
of the developed technologies by industrial partners.
Referring to the 2014 annual report on service robots [2], with Section 8 devoted
to standardization and safety in service robotics, one can learn from Fig. 5 about
the establishment of a complete work group scheme for standardization organized
within the technical committee ISO TC 184/SC 2 (robots and robotic devices) [15]. The
scope of the committee covers standardization of the following topics: definition/
characterization, terminology, performance testing methods, safety, mechanical inter-
faces and end effectors, programming methods, and requirements for information
exchange.


FIG. 5 Structure of ISO TC184/SC 2.

As described by Jacobs [16], there are several standards on robots and robotic
devices that have been published (e.g., standards on terms and definitions, coordi-
nate systems with extension for mobile robots, safety of industrial robots, and safety
of personal care robots), but the process for standardization in new application
areas has only just begun. A brief look at the structure shows that standards in the
domain of Work Groups 8 and 10 are of particular significance for the robots used
in automated warehousing.
Function-Specific Standards
In addition to the development process standards that are broadly applicable
across the field of autonomous systems, there are standards specific to particular
applications—typically covering absolute functional limits, such as maximum vehicle
speed, maximum stopping distance, and so on. The following section discusses
application-specific standards that were directly relevant to the EC-SAFEMOBIL
project development work on unmanned aerial systems (UAS) operations and
distributed autonomous warehouse traffic management. One can observe an obvious
difference between these two technical areas: there are many function-specific
standards related to unmanned aerial vehicle systems but only a few for their AGV
counterparts. Actually, for the autonomous warehousing development aspects of the
EC-SAFEMOBIL project, only one standard of direct relevance has been identified [17]:
• ISO 15534-3, Ergonomic design for the safety of machinery—Part 3: Anthro-
pometric data
* This standard provides anthropometric data on the speed of personnel

within industrial environments (walking speed, etc.). This, in conjunction


with information on the stopping capabilities of the autonomous vehicles,

allows the derivation of state estimation accuracies required from simulta-


neous localization and mapping methods and control algorithms developed
by system developers.
The similarities of UAS and AGV systems are in the limitations on their use,
resulting from safety assessments. Each type of system can be a serious threat to
humans and other living beings and to the environment. A general rule for creating
function-specific standards is that safety requirements that currently cannot be met
by system design must be met instead by restrictions on usage. In some instances,
these standards define performance goals to be progressed through technical devel-
opment rather than specific constraints that must be met within a research and
development project.
There is a distinct absence of standards related specifically to the safety of
autonomous vehicles relevant to development work within automated warehousing
systems. Standards exist that describe requirements for stopping devices, protective
equipment, and hazard zones, but none have been identified relating to aspects
such as the speed of autonomous vehicles, location accuracy and collision avoidance
strategies, and so on.

EXISTING STANDARDS APPLICABLE TO AUTOMATED WAREHOUSING


Automated guided vehicles, often considered as mobile robots, are covered by
general safety standards for machinery, which detail some directions for the design
of safely operating robots. Referring to Jacobs [16], these involve the following
standards:
• ISO 12100, Safety of machinery—General principles for design—Risk assess-
ment and risk reduction
* This standard provides general design principles and basic requirements

for the safe mechanical design of machines.


• IEC 60204-1, Safety of machinery—Electrical equipment of machines—Part 1:
General requirements
* This standard provides general design principles and basic requirements

for the safe electrical design of machines.


• ISO 13849-1, Safety of machinery—Safety-related parts of control systems—
Part 1: General principles for design
* This standard provides definitions of performance levels, defining a mini-

mum required control system reliability.


• IEC 62061, Safety of machinery—Functional safety of safety-related electrical,
electronic, and programmable electronic control systems
* This standard provides definitions of SILs, defining a minimum required

control system reliability.


• ISO 10218-1, Robots and robotic devices—Safety requirements for industrial
robots—Part 1: Robots
* This standard provides safety requirements for the design of industrial

manipulators.


• ISO 10218-2, Robots and robotic devices—Safety requirements for industrial


robots—Part 2: Robot systems and integration
* This standard provides safety requirements for the integration of industrial

robots into automation systems.


• ISO 13482, Robots and robotic devices—Safety requirements for personal care
robots
* This standard provides safety requirements for the design of personal care

robots such as mobile servant robots, person carrier robots, and physical
assistant robots.
By way of example, Euroimpianti's current automated guided vehicles are declared to conform to the following European directives:
• 2006/42/EC, Machinery Directive
• 2006/95/EC, Low-Voltage Directive
• 2004/108/EC, Electromagnetic Compatibility Directive
Also, Euroimpianti's AGVs are declared to conform to the following international technical standards:
• UNI EN ISO 12100-1, Safety of Machinery—Basic Concepts, General Principles for Design—Part 1, 2005
• UNI EN ISO 12100-2, Safety of Machinery—Basic Concepts, General Principles for Design—Part 2, 2005
• UNI EN 1525, Safety of Driverless Industrial Trucks, 1999
• UNI EN 1175-1, Electrical Requirements for Battery Powered Trucks, 1999
• UNI EN ISO 13849-1, Design of Safety-Related Parts of Control Systems, 2007
• CEI EN 60204-1, Safety of Machinery—Electrical Equipment of Machines—General Requirements, 2006

Discussion About New Standards for Autonomous Distributed Warehousing
In this section, we discuss possible standardization options for research and development processes related to autonomous distributed warehousing. Here, we assume that standardization encompasses the following segments of the system embodiment:
• High-precision localization
• Path planning for pallet delivery
• Decentralized coordination
COLLISION AVOIDANCE
In industrial systems with autonomous vehicles, it is crucial to avoid collisions among the vehicles or with other objects in the workspace. This implies definition of the following functional specifications:
• Collision detection sensors (safety laser sensors, sonars, red-green-blue-depth
(RGB-D) sensors, cameras, bumpers, artificial skin)
• Distribution of sensors (total space coverage, redundancy)
• Unconditional criteria for vehicle stoppage (discrete)
• Collision avoidance methods
  * A typical global technique for collision avoidance consists of exploring a uniform grid in configuration space (C-space) or configuration-time space [18,19]; a minimal sketch of this mapping is given after this list.
  * The obstacles must be mapped into the configuration space, which imposes requirements on the precision of maps. This leads to the establishment of navigation areas, which may be treated as terminals, corridors (passages), and intersections.
  * Preplanned paths (facility layouts)—selection of alternative non-colliding paths.
  * Navigation areas (free-ranging vehicles): A path through the configuration space must be found for the point representing the moving center of an autonomous vehicle. Using this method in dynamic, complex environments with several vehicles sharing the same navigation area leads to dynamic routing.
  * Requirements on the dynamic properties of autonomous vehicles (maximum speed, the size of the corresponding blocking area, deceleration/acceleration rates, minimum obstacle-sensing range).
  * Definition of the maximum capacity of each navigation resource (path or area, including places/zones for parking and recharging).
  * Requirements on the maximum allowed occupancy of navigation resources to prevent deadlock and livelock situations.
  * Definition of allowable collisions during docking, loading/unloading, recharging, and convoying.
  * Definition of travel changes according to carried loads.
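As referenced in the C-space item above, the following minimal Python sketch illustrates the grid-based technique: workspace obstacles are inflated by the vehicle footprint so that the vehicle center can be planned as a point, and a candidate path is then checked against the inflated grid. The grid resolution, the disc-shaped footprint, and all function names are illustrative assumptions, not part of any cited standard or of the EC-SAFEMOBIL implementation.

```python
import numpy as np

def inflate_obstacles(occupancy, radius_cells):
    """Map workspace obstacles into configuration space by inflating each
    occupied cell with the vehicle footprint (disc approximation)."""
    h, w = occupancy.shape
    cspace = occupancy.copy()
    for y, x in zip(*np.nonzero(occupancy)):
        y0, y1 = max(0, y - radius_cells), min(h, y + radius_cells + 1)
        x0, x1 = max(0, x - radius_cells), min(w, x + radius_cells + 1)
        yy, xx = np.mgrid[y0:y1, x0:x1]
        disc = (yy - y) ** 2 + (xx - x) ** 2 <= radius_cells ** 2
        cspace[yy[disc], xx[disc]] = True
    return cspace

def path_is_collision_free(cspace, path_cells):
    """Check a candidate path (sequence of (row, col) cells traversed by the
    vehicle center) against the inflated configuration-space grid."""
    return all(not cspace[r, c] for r, c in path_cells)

# Example: a 10-by-10 grid with one obstacle and a 1-cell vehicle radius.
occ = np.zeros((10, 10), dtype=bool)
occ[4, 4] = True
cs = inflate_obstacles(occ, radius_cells=1)
print(path_is_collision_free(cs, [(0, 0), (1, 1), (2, 2)]))  # True
print(path_is_collision_free(cs, [(3, 3), (4, 4)]))          # False
```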

New standards for industrial systems with autonomous vehicles should address the fundamental restrictions on vehicle usage in areas where encounters with humans are possible, such as a requirement for operation only within a structured workspace unless an acceptable detect-and-avoid system is used. A similar standard (CAA CAP 722) exists for unmanned air systems [20], where detect and avoid is defined in the glossary as “the capability to see, sense, or detect conflicting traffic or other hazards and take the appropriate action.”

COMMUNICATIONS
Reliable communications are the backbone of correctly functioning automated warehouses. The distinctions among communication requirements follow from the way in which the automated industrial vehicles are controlled. Accordingly, automated warehousing applications can be divided into two categories with regard to the type of control and the number of deployed autonomous industrial vehicles:
• Warehouse installations of 1 to n vehicles controlled in a centralized way:
  * Communication among the control center, each autonomous vehicle, and all other integrated equipment capable of sending/receiving information (palletization cells, recharging stations, automatic doors, elevators, etc.).
• Warehouse installations of more than n vehicles controlled in a decentralized way:
  * Communication between vehicles and a central task dispatching center (if any) and among all neighboring vehicles (vehicles positioned within the effective communication range).
This implies definition of the following functional specifications for communications:
• Communication infrastructure in the industrial warehouse facility must provide a continuous flow of information:
  * Latency in the communication channels must be low enough to allow active remote control on timescales comparable to those of the fastest autonomous vehicle deployed in the AGV system.
  * Communications must be robust enough to allow normal operation of all autonomous vehicles even in the event of a single failure in the command and control system (a minimal watchdog pattern is sketched after this list).
  * Requirements on the security of communications, depending on the particular application (encryption and other protections against accepting unauthorized commands).
  * Requirements on the characteristics of middleware—there is a need for appropriate standards for the communication links between a control station and an AGV. Note that the control station may be another AGV, a ground station, or a manned vehicle. Implementation of such a standard will allow interoperability of any developed systems with existing and future systems with respect to the communication channels. This in turn will accelerate the adoption of system developments by industrial partners.
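To make the robustness and latency bullets concrete, here is a minimal watchdog sketch in Python: the vehicle commands a stop if the command-and-control heartbeat goes silent for longer than a configured timeout. The timeout value, class name, and callback are illustrative assumptions; a real bound would be derived from vehicle speed and stopping distance, and a certified safety function would require far more than this sketch.

```python
import time

class CommWatchdog:
    """Halts the vehicle when the command channel goes silent, so that a
    single communication failure degrades to a safe stop rather than
    uncontrolled motion (illustrative only; not a certified safety function)."""

    def __init__(self, stop_vehicle, timeout_s=0.5):
        self.stop_vehicle = stop_vehicle   # callback that commands a stop
        self.timeout_s = timeout_s         # assumed bound; derive from speed
        self.last_beat = time.monotonic()

    def on_heartbeat(self):
        """Call whenever a heartbeat message arrives from the controller."""
        self.last_beat = time.monotonic()

    def poll(self):
        """Call periodically from the vehicle's control loop."""
        if time.monotonic() - self.last_beat > self.timeout_s:
            self.stop_vehicle()
```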
PERFORMANCE
The experience gained in the EC-SAFEMOBIL project revealed many similarities in
UAS and AGV systems. Among the performance indicators are the constraints on
autonomous decision making. As stated in the UAS standard CAA CAP 722 [20], Section 2, Chapter 7, Section 3.4.1:
The decisions made by an autonomous system are made on a rational basis. In
addition, to ensure consistent behavior that will encourage human trust, the
system’s decision-making should be repeatable. That is, the system should
exhibit the same behavior each time it is exposed to identical circumstances,
and it should not produce large changes in behavior for small changes in
inputs. An obvious exception to this is where the input to the system results in
a “yes/no” decision, such as a point of no return (e.g., deciding to return to the
departure airfield instead of continuing to the destination due to a very small
difference in the amount of fuel remaining).
By directly replacing just a few words (e.g., “recharging station” for “departure airfield” and “energy” for “fuel”), we are one step closer to a standard that is applicable to automated warehousing. Viewed against academic efforts to develop advanced, highly responsible, and cognitive (self-learning) decision-making algorithms, these constraints imply that autonomy resulting in unexpected or emergent behavior (due to run-time learning systems) is not consistent and repeatable and is therefore not capable of being certified.
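The repeatability requirement quoted above can be illustrated with a small sketch: a deterministic, latched point-of-no-return decision, transplanted from the aviation wording into warehousing terms. The class, thresholds, and units below are illustrative assumptions rather than content of CAA CAP 722 or of the EC-SAFEMOBIL project.

```python
class ReturnToChargerDecision:
    """Deterministic yes/no decision: identical inputs always yield the same
    output, and once the vehicle commits to returning to the recharging
    station, a small recovery in estimated energy does not reverse it. This
    is the deliberate point-of-no-return discontinuity that CAA CAP 722
    names as the allowed exception to smooth input/output behavior."""

    def __init__(self, margin_wh=50.0):
        self.margin_wh = margin_wh  # illustrative safety margin
        self.committed = False

    def decide(self, energy_remaining_wh, energy_to_finish_wh):
        if (not self.committed
                and energy_remaining_wh < energy_to_finish_wh + self.margin_wh):
            self.committed = True   # latch: no large behavior swings afterward
        return self.committed       # True => return to the recharging station
```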

Proposed Benchmark Scenarios for Autonomous Distributed Warehousing
The aim of Antonelli's [21] recent article with the provocative title “Robotic Research: Are We Applying the Scientific Method?” is to question the validity of the methods used by researchers and developers of robotic systems. The
author notes: “The grand challenge for the robotics community is to discuss, from
its foundations up, the way its research is conducted. It is a huge effort involving
complex interactions among the institutions, the ministries, the funding agencies,
and the individual researchers’ careers. Research is funded by selection of proposals,
at each call more and more imaginative which, however, most of the time end with
more or less disappointing demos.”
It is necessary to create well-established, standardized scenarios for validating ongoing research and development results. Once approved by the academic and industrial robotics community, these scenarios should turn into standard benchmarks for testing new robotic solutions.
In this section, we present our proposals for several benchmark scenarios
that were accepted by industry partners in the EC-SAFEMOBIL consortium as
relevant and sufficient for evaluation of new localization and distributed control
methods.
The demonstration environment was the indoor warehouse test bed at the Euroimpianti facilities in Schio, Italy (see Fig. 6). This test bed allows the use of real AGVs in a real-world environment, equipped with markers used by current commercial solutions for very accurate AGV localization. The test bed is suitable for demonstrating the real-time capability of the developed algorithms, incorporating real-world effects such as vehicle dynamics and pose estimation uncertainties. However, it is less suitable for testing system scalability and long-term scenario evolution due to the size of the environment and the number of available platforms. This will be the subject of further work following this study.

FIG. 6 Euroimpianti testing environment (Schio, Italy).

BENCHMARK SCENARIO FOR EVALUATION OF MARKER-LESS LOCALIZATION


Prior to the demonstration of new localization algorithms, many simulation and laboratory test bed experiments were expected to have been carried out. After being tested thoroughly and approved by an industrial partner, the implementation of a new localization algorithm was tested in the following benchmark scenario:
1. Select one AGV for the localization accuracy (repeatability) experiment.
2. Define the demonstration layout (the one shown in Fig. 7) with a demonstration area of 15 m by 20 m. Choose locations for three conveyor belts (pick-up positions) and define the position of the unloading station (Station A).
3. Provide marker-based localization infrastructure for achieving the best possible localization accuracy as a reference for assessing the new localization methods.
4. Move the AGV from the first pick-up position (e.g., Conveyor Belt 3) to the Station A unloading site n times in a row and set markings to record the positions at which the AGV stopped.
5. Repeat this sequence for different starting positions of the AGV and different pick-up positions (e.g., Conveyor Belts 1 or 2).
6. Repeat the experiments at low speed and at the regular speed of AGV motion.
This benchmark scenario was used for the use case demonstration during the Sixth Consortium Meeting in Schio, January 19–21, 2015. At both AGV docking sites, there were markers on the floor to demonstrate the accuracy of the positioning system, which reflects the accuracy of the localization. Additionally, the orientation was measured at each site [1].
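One simple way to reduce the stop positions and orientations recorded in steps 4 through 6 to repeatability figures is sketched below. The benchmark itself does not prescribe a summary metric, so the statistics chosen here (ISO 9283-style dispersion about the mean stop pose) are our illustrative assumption.

```python
import numpy as np

def stop_repeatability(stops_xy_mm, headings_deg):
    """Summarize n repeated stops at one target: dispersion of the stop
    positions about their mean, plus the spread of the recorded headings."""
    p = np.asarray(stops_xy_mm, dtype=float)
    h = np.asarray(headings_deg, dtype=float)
    mean_xy = p.mean(axis=0)
    radial = np.linalg.norm(p - mean_xy, axis=1)  # distance of each stop from the mean
    return {
        "mean_stop_xy_mm": mean_xy,
        "repeatability_mm": radial.mean() + 3.0 * radial.std(ddof=1),  # ISO 9283-style
        "max_deviation_mm": radial.max(),
        "heading_spread_deg": h.max() - h.min(),
    }

# Example: five recorded stops at the Station A unloading site.
print(stop_repeatability(
    [(2.0, 1.0), (3.5, -0.5), (1.0, 0.0), (2.5, 1.5), (2.0, -1.0)],
    [0.4, -0.2, 0.1, 0.3, 0.0]))
```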

BENCHMARK SCENARIOS FOR EVALUATING DECENTRALIZED CONTROL


FIG. 7 Localization accuracy and repeatability demonstration layout.

The scenario idea was to have demonstrations with at least two AGVs and two unloading stations (Station A and Station B) in four different benchmark situations, constructed to demonstrate the capabilities of the developed decentralized control system. In our demonstration [1], we had two types of AGVs, the smaller LGV1000 and the bigger LGV1400 (Fig. 6).

BENCHMARK SCENARIO 1: VEHICLE WAITING TO RECEIVE A PATH FROM THE PLANNER
In this situation (depicted in Fig. 8), an LGV1000 is positioned in front of the unloading Station A, while an LGV1400 is still at its “home” position (because it has not received any mission commands).
In the first part of this demonstration, vehicle LGV1000 receives the mission to
pick up a pallet from Conveyor Belt 1 and transport it to the unloading station (Sta-
tion B). During this time, LGV1400 is still idle because it has not received the mis-
sion command. Upon LGV1000 reaching Belt 1, it starts loading the pallet while
LGV1400 receives the mission to pick up the pallet from Belt 3 and transport its
load to the unloading station (Station A).
Now, two missions are active at the same time (marked with the same dashed
line of the paths shown in Fig. 8):
• LGV1400 pickup from Belt 3
• LGV1000 delivery at Station B
For the successful execution of these missions, vehicles need to negotiate and
avoid each other in a confined space. LGV1000 unloaded the pallet at Station B
while LGV1400 simulated unloading at Station A. These are also the initial posi-
tions of vehicles for the next benchmark scenario.

FIG. 8 Vehicle waiting to receive the path.

BENCHMARK SCENARIO 2: VEHICLES RESOLVING HEAD-ON CONFLICT IN A CONFINED SPACE
As can be seen in Fig. 9, at the start of this situation, LGV1000 is at (unloading) Station B while LGV1400 is at (unloading) Station A. LGV1000 will receive the mission to pick up the pallet from Conveyor Belt 1 and transport it to Station A, while LGV1400 will receive the request to pick up the pallet from Conveyor Belt 1. While LGV1000 is moving toward the loading site, LGV1400 will be stopped by a human to demonstrate that safety features are enabled and to produce enough delay in the execution of the LGV1400 mission for the conflict to arise.
When LGV1000 loads the pallet from Conveyor Belt 1, LGV1400 will be released. At this time, two missions are active (marked with the same dashing of the paths in Fig. 9):
• LGV1000 delivery at Station A
• LGV1400 pickup from Belt 1
This situation means that the vehicles will need to exchange their positions, causing them to move toward each other head-on; the control system will resolve the conflict by altering their paths so that they can bypass each other safely.
At the end of this situation, LGV1000 unloaded the pallet at Station A while LGV1400 picked up the pallet from Conveyor Belt 1. These were the initial conditions for the next benchmark scenario.

FIG. 9 Two automated guided vehicles resolving a head-on conflict.
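The head-on resolution just described requires that both vehicles independently reach the same verdict on who yields. The toy rule below illustrates only this determinism requirement; it is not the EC-SAFEMOBIL negotiation protocol, and all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Agv:
    vehicle_id: int
    loaded: bool

def right_of_way(a: Agv, b: Agv):
    """Deterministic tie-break for a head-on conflict: a loaded vehicle keeps
    its planned path; if both (or neither) carry a load, the lower vehicle ID
    keeps it. Because the rule depends only on shared state, both vehicles
    compute the same (keeper, yielder) pair without a central arbiter."""
    if a.loaded != b.loaded:
        keeper = a if a.loaded else b
    else:
        keeper = a if a.vehicle_id < b.vehicle_id else b
    yielder = b if keeper is a else a
    return keeper, yielder

# Example: the loaded LGV1000 keeps its path; the LGV1400 requests a bypass.
k, y = right_of_way(Agv(1000, loaded=True), Agv(1400, loaded=False))
print(k.vehicle_id, "keeps path;", y.vehicle_id, "yields")
```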
BENCHMARK SCENARIO 3: REMOVAL OF THE IDLE VEHICLE FROM THE PATH OF ANOTHER VEHICLE
This situation starts with LGV1000 successfully unloading a pallet at Station A and
LGV1400 picking up a pallet at Conveyor Belt 3 (see Fig. 10).
LGV1400's mission is to transport the pallet and unload it at Station A. LGV1000 receives no new mission commands and therefore goes to the “idle” state and position. This position is on the path of LGV1400.
At this point in time, LGV1400’s path planner will plan the shortest path,
and the control system will deduce that LGV1000 is in the way of LGV1400.

FIG. 10 Removal of the idle vehicle.

Therefore, LGV1400 will issue a removal request to LGV1000. LGV1000, having no mission, will comply and move away from the planned path of LGV1400.

BENCHMARK SCENARIO 4: RESOLVING THE VEHICLE FAILURE


As a continuation of the previous benchmark situation, LGV1400 is placed at Station A while LGV1000 is idle in the position to which it moved after the removal request, as depicted in Fig. 11.

FIG. 11 Resolving the vehicle failure.
LGV1400 will now receive the mission to pick up the pallet from Conveyor Belt 1. At this time, the failure of LGV1400 is simulated. LGV1000 will then take over the delivery mission from the failed vehicle. Additionally, a diagnostic part of the LGV1000 control system will notify the planner that LGV1400 is not communicating, and the planner will plan the path around the last known position of LGV1400.

Conclusion
Adherence to standards is key to the certification of any system implementation. Automated warehousing systems lack standards that would serve system developers as guidance toward quicker acceptance of new products in the global market for automated manufacturing and logistics systems.
The analysis of the existing standards shows that many standards from other application areas, such as unmanned aerial systems, can be a starting point for adoption and adaptation to automated warehousing systems. Because autonomous industrial vehicles represent an apparent threat to humans, other vehicles, and the environment itself, standards related to the safety of vehicle operation must be published first.
The lack of standards for automated guided vehicles is partially the consequence of a large gap between current academic achievements on one side and industrial practice on the other. The intention of this analysis is to help close that gap by defining as many standard benchmarks for the evaluation of these new concepts as possible.
Based on the experience gained during the research and development activities of the EC-SAFEMOBIL project, we focused on estimation and control technologies for safe, wireless, high-mobility cooperative systems and created various benchmark scenarios for assessing accurate marker-less localization and distributed control algorithms. So far, these scenarios have been used successfully for public demonstration purposes within the project; the very positive feedback obtained from industry partners and the end-user community provides encouragement for proposing them as candidates for standardized benchmark scenarios.
ACKNOWLEDGMENTS
This work was supported by the European Commission FP7-ICT-2011-7 project,
“Estimation and Control for Safe Wireless High Mobility Cooperative Industrial
Systems” (EC-SAFEMOBIL, Project No. 288082).

References
[1] Estimation and Control for Safe Wireless High Mobility Cooperative Industrial Systems
(EC-SAFEMOBIL, Project No. 288082), http://ec-safemobil-project.eu (accessed March 1,
2016).
[2] World Robotics Service Robots, IFR Statistical Department, Frankfurt, Germany, 2014.

[3] Jensfelt, P. and Christensen, H. I., “Laser Based Position Acquisition and Tracking in an
Indoor Environment,” Proceedings of the IEEE International Symposium on Robotics
and Automation , Institute of Electrical and Electronics Engineers, Leuven, Belgium,
May 16–20, 1998.
[4] Chen, L., Hu, H., and McDonald-Maier, K., “EKF Based Mobile Robot Localization,”
Proceedings of the Third International Conference on Emerging Security Technologies,
Institute of Electrical and Electronics Engineers, Lisbon, September 5–7, 2012,
pp. 149–154.
[5] Teslic, L., Skrjanc, I., and Klancar, G., “EKF-Based Localization of a Wheeled Mobile
Robot in Structured Environments,” Journal of Intelligent & Robotic Systems, Vol. 62,
No. 2, 2011, pp. 187–203.
[6] Kummerle, R., Pfaff, P., Triebel, R., and Burgard, W., “Monte Carlo Localization in Outdoor
Terrains using Multi-Level Surface Maps,” Journal of Field Robotics, Vol. 25, No. 6–7,
2008, pp. 346–359.
[7] Grisetti, G., Stachniss, C., and Burgard, W., “Improved Techniques for Grid Mapping with
Rao-Blackwellized Particle Filters,” IEEE Transactions on Robotics, Vol. 23, No. 1, 2007,
pp. 34–46.
[8] LaValle, S. M., “Planning Algorithms,” Cambridge University Press, Cambridge, UK, 2006.
[9] Weyns, D., Holvoet, T., Schelfthout, K., and Wielemans, J., “Decentralized Control of
Automatic Guided Vehicles: Applying Multi-Agent Systems in Practice,” Companion to
the 23rd ACM SIGPLAN Conference on Object-Oriented Programming Systems Lan-
guages and Applications, Association for Computing Machinery, New York, 2008,
pp. 663–674.
[10] Yamamoto, H. and Yamada, T., “Control of AGVs in Decentralized Autonomous FMS
Based on a Mind Model,” Agent and Multi-Agent Systems. Technologies and Applica-
tions, G. Jezic, M. Kusek, N.-T. Nguyen, R. Howlett, and L. Jain, Eds., Springer, Berlin
Heidelberg, 2012, pp. 186–198.
[11] Herrero-Perez, D. and Martinez-Barbera, H., “Decentralized Coordination of Automated
Guided Vehicles,” Proceedings of the 7th International Joint Conference on Autonomous
Agents and Multiagent Systems, Vol. 3, International Foundation for Autonomous
Agents and Multiagent Systems, Richland, SC, 2008, pp. 1195–1198.
[12] Ayanian, N., Rus, D., and Kumar, V., “Decentralized Multirobot Control in Partially Known
Environments with Dynamic Task Reassignment,” Proceedings of the Third IFAC
Workshop on Distributed Estimation and Control in Networked Systems, International
Federation of Automatic Control, Santa Barbara, CA, September 14–15, 2012, pp. 311–316.
[13] Digani, V., Sabattini, L., Secchi, C., and Fantuzzi, C., “Toward Decentralized Coordination
of Multi Robot Systems in Industrial Environments: A Hierarchical Traffic Control
Strategy,” Proceedings of the 2013 IEEE International Conference on Intelligent
Computer Communication and Processing, Institute of Electrical and Electronics Engi-
neers, Cluj-Napoca, Romania, September 5–7, 2013, pp. 209–215.
[14] IEC-61508, Functional Safety of Electrical/Electronic/Programmable Electronic Safety-
Related Systems—Part 3: Software Requirements, International Electrotechnical
Commission, Geneva, Switzerland, 2010, www.iec.ch
[15] TC 184/SC 2, Robots and Robotic Devices, International Organization for Standardiza-
tion, Geneva, Switzerland, www.iso.org, (accessed March 1, 2016).
[16] Jacobs, T., “Standardisation and Safety in Service Robotics,” World Robotics Service
Robots, IFR Statistical Department, Frankfurt, Germany, 2014, pp. 255–259.
[17] ISO 15534-3, Ergonomic Design for the Safety of Machinery—Part 3: Anthropometric
Data, International Organization for Standardization, Geneva, Switzerland, www.iso.org,
(accessed March 1, 2016).
[18] Barraquand, J., Langlois, B., and Latombe, J. C., “Numerical Potential Field Techniques
for Robot Path Planning,” IEEE Transactions on Systems, Man, and Cybernetics, Vol. 22,
No. 2, 1992, pp. 224–241.
[19] Hwang, Y. K. and Ahuja, N., “A Potential Field Approach to Path Planning,” IEEE Transactions on Robotics and Automation, Vol. 8, No. 1, 1992, pp. 23–32.


[20] Civil Aviation Authority, Unmanned Aircraft System Operations in UK Airspace–


Guidance, 6th ed., Civil Aviation Authority, London, 2015.
[21] Antonelli, G., “Robotic Research: Are We Applying the Scientific Method?” Frontiers in
Robotics and AI, Vol. 2, No. 13, 2015, doi:10.3389/frobt.2015.00013


STP 1594, 2016 / available online at www.astm.org / doi: 10.1520/STP159420150055

Roger Bostelman 1

Recommendations for
Autonomous Industrial Vehicle
Performance Standards
Citation
Bostelman, R., “Recommendations for Autonomous Industrial Vehicle Performance
Standards,” Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor,
ASTM STP1594, R. Bostelman and E. Messina, Eds., ASTM International, West Conshohocken,
PA, 2016, pp. 129–141, doi:10.1520/STP159420150055 2

ABSTRACT
A workshop on “Autonomous Industrial Vehicles: From the Laboratory to the
Factory Floor” was held at the 2015 Institute of Electrical and Electronics
Engineers International Conference on Robotics and Automation. Nine research
papers were presented, followed by a discussion session. All of the findings are
summarized in this chapter and are intended to be used in the standards
development process within ASTM International Committee F45 Driverless
Automatic Guided Industrial Vehicles. This paper provides feedback from the
discussion and suggests recommendations for standards that evolved from the
discussion.

Keywords
standards, mobile robots, automatic guided vehicle (AGV), recommendations

Introduction
A workshop entitled “Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor” was held as part of the Institute of Electrical and Electronics

Manuscript received June 16, 2015; accepted for publication November 3, 2015.
1 National Institute of Standards and Technology, 100 Bureau Dr., MS 8230, Gaithersburg, MD 20899-8230
2 ASTM Workshop on Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, May 26–30, 2015, Seattle, Washington.

Copyright © 2016 by ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959.


Engineers (IEEE) Robotics and Automation Society’s International Conference on


Robotics and Automation (ICRA) [1] on May 30, 2015, at the Washington State
Convention Center in Seattle, WA. The annual conference is a premier interna-
tional forum for robotics researchers to present their work. The workshop drew
more than 60 attendees and participants who presented papers from organizations
and countries around the world, including:

Organization                                      Country

Adept Technology, Inc. a USA


Amazon USA
ASTM International USA
Boeing USA
Brain Corporation USA
Clearpath Robotics Canada
Crown Equipment New Zealand
Elettric 80, Inc. USA
Johns Hopkins University USA
Kirinson—Hokuyo Automatic Co., Ltd. USA
Microsoft USA
Mujin Japan
National Institute of Standards and Technology USA
Orebro University Sweden
RWTH Aachen University Germany
ShanghaiTech University China
Sick Germany
University of Modena and Reggio Emilia Italy
University of Massachusetts Lowell USA
University of Zagreb Croatia
Vecna Technologies USA
a Certain commercial equipment, instruments, or materials are identified in this paper in order to specify the experimental procedure adequately. Such identification is not intended to imply recommendation or endorsement by the National Institute of Standards and Technology, nor is it intended to imply that the materials or equipment identified are necessarily the best available for the purpose.
The purpose of the workshop was to solicit researcher input for the develop-
ment of consensus standards within ASTM International Committee F45 on Driv-
erless Automatic Guided Industrial Vehicles [2] and to inform researchers of a
standards-based mechanism for enabling rapid technology transfer from the labora-
tory to industry. Additionally, the outputs from the workshop will help guide smart
manufacturing robotics research for projects within the National Institute of Stand-
ards and Technology (NIST) Robotic Systems for Smart Manufacturing Program.
Specifically, projects focusing on development of a performance assessment frame-
work for robotic systems and performance of collaborative robot systems expect to
utilize the workshop results.

Presenters and attendees were given application-oriented questions to consider prior to the event to help focus the workshop. These included:
• What various lighting, dust, and floor conditions are evident in industrial
manufacturing environments?
• What associated vehicle speeds, tolerances, and equipment access conditions
are required in these environments?
• What communication speeds, integration issues, and control strategies are
useful on today’s typically closed industrial automatic guided vehicle (AGV)
controllers versus tomorrow’s potentially more open controllers?
• What is the minimum knowledge of its environment that a mobile robot requires in order to adapt to its surroundings? This is in contrast to current AGVs, which have only path and navigation knowledge and limited to no adaptability.
• What onboard or interactive equipment for AGVs (or mobile robots) should
be considered, such as robots that access AGVs or are onboard AGVs as
advanced mobile manipulation systems?
Research papers presented at the workshop covered topics most closely related
to ASTM F45 background and status, obstacle detection and avoidance, navigation,
planning, ground truth measurement in support of AGV test method development,
mobile robot and AGV capabilities, and evolution of industrial vehicle technologi-
cal innovations from inception to commercial use. As such, many of the aforementioned questions were considered in presentations and postpresentation discussions.
This paper summarizes ASTM Committee F45, the F45 standards activities that previously existed, discussion points from the ICRA 2015 Workshop, and recommendations toward standards development within F45. Summarized notes from the final discussion session of the workshop are included here. The notes have been collected and formalized for use in F45 committee and subcommittee standards development.

ASTM Committee F45


ASTM Committee F45 develops performance test methods and terminology for autonomous vehicles operating in industrial environments. The committee was
formed to dovetail with current AGV safety standards, such as American National
Standards Institute/Industrial Truck Safety Development Foundation (ANSI/ITSDF)
B56.5:2012, Safety Standard for Driverless, Automatic Guided Industrial Vehicles
and Automated Functions of Manned Industrial Vehicles [3]. The F45 scope is as
follows:
The development of standardized nomenclature and definitions of terms,
recommended practices, guides, test methods, specifications, and performance
standards for driverless automatic guided industrial vehicles. The Committee
will encourage research in this field and sponsor symposia, workshops, and
publications to facilitate the development of such standards. The work of this Committee will be coordinated with other ASTM technical committees and other national and international organizations having mutual or related interests.
The thrust of this effort is toward vehicles working in industrial environments, including AGVs and mobile robots. Autonomous mobile robot development has also been applied in other fields, resulting in vehicle technology that can provide advancements to the manufacturing vehicle industry. Some robot companies (such as workshop attendees Adept Technology, Inc. and Vecna Technologies) are already operating in manufacturing and other domains. As such, standards and test method developments that may provide advancements to both types of vehicles should be considered.
The ASTM Committee F45 structure is as follows:
• Subcommittee F45.01 on Environmental Effects
• Subcommittee F45.02 on Docking and Navigation
• Subcommittee F45.03 on Object Detection and Protection
• Subcommittee F45.04 on Communication and Integration
• Subcommittee F45.91 on Terminology

Existing F45 Standards Activities Prior to the Workshop
To date, three initial work items have been submitted to ASTM Committee F45 and are being developed by task groups addressing navigation, docking, and terminology. The ASTM standards process begins with the definition of a “work
item,” which proposes a standard to be developed, describes its scope, and lists key-
words. After that, a working (draft) document is developed. The working document
is refined through a series of interactions, including by electronic means, until it is
deemed ready for balloting by the pertinent subcommittee.
Test methods already under development within subcommittee ASTM F45.02 on
Docking and Navigation are for evaluating a vehicle’s ability to traverse through a space
or along a path of varying characteristics (or both). The test method design allows for
many navigation methods, such as computer-aided design (CAD) model point-to-
point, line segment, path following, and simultaneous localization and mapping
(SLAM) navigation. CAD model-commanded navigation is more traditional for
AGVs, whereas SLAM is mainly used in mobile robots, although it recently has been
implemented in some AGVs. Fig. 1 shows an example of a defined area layout for a
navigation test method that allows for various vehicle sizes and capabilities by using
variable settings for course width, length, and so on. The blue line depicts the tradi-
tional AGV path followed while the red lines depict moveable walls to allow SLAM
navigation within reconfigurable corridors. The test method is agnostic to the naviga-
tion solution used by the vehicle; the walls can be used for localization by the vehicle (if
appropriate) or as defined space obstacles that cause the vehicle to stop (or for both).
Alternatively, open-space autonomous industrial vehicle navigation has yet
to be defined as a test method in the ASTM F45.02 navigation working document.

FIG. 1 Example navigation test method setup for defined areas as shown in the ASTM
F45.02 navigation working document.

Open-space tests could be defined as simple geometric shaped paths (e.g., square, circle,
straight line) for the vehicle to navigate. These tests could be used to evaluate the
vehicle’s accuracy in maintaining its commanded path over time. As with the defined
space navigation test method, this one will be agnostic to the manner in which the paths
are commanded to the vehicle, as long as the geometric shapes, dimensions, and so on
are consistent. Combinations of defined- and open-space navigation test methods should also be considered, where barriers may define one side of the vehicle's path and, for example, a tape line marking a pedestrian walkway may define the other side.
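One plausible scoring of such open-space tests, offered purely as a sketch since the working document does not yet fix a metric, is the cross-track error of recorded vehicle positions against a densely sampled commanded path:

```python
import numpy as np

def cross_track_errors(recorded_xy, commanded_xy):
    """Distance from each recorded vehicle position to the nearest point on
    the commanded geometric path (given as a dense sequence of points)."""
    a = np.asarray(recorded_xy, dtype=float)[:, None, :]   # shape (n, 1, 2)
    p = np.asarray(commanded_xy, dtype=float)[None, :, :]  # shape (1, m, 2)
    return np.linalg.norm(a - p, axis=2).min(axis=1)       # shape (n,)

# Example: a commanded 1-m-radius circle sampled at 1-degree steps.
theta = np.radians(np.arange(360))
circle = np.column_stack([np.cos(theta), np.sin(theta)])
recorded = [(1.02, 0.0), (0.0, 0.97), (-1.0, 0.01)]
err = cross_track_errors(recorded, circle)
print(err.mean(), err.max())   # e.g., report mean and worst-case deviation
```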
The ASTM F45.02 subcommittee has also started a working document on
docking for industrial vehicles. Challenges for docking are the positioning uncer-
tainty and repeatability to which the vehicle can dock to a location and the speed at
which the docking can occur; again, as with navigation, various sizes and types of
vehicles are taken into account within the document, as shown in Fig. 2.
Fig. 2a and 2c show unit load vehicles of different sizes, and Fig. 2b shows a tug-
ger vehicle. (Not shown is a forklift vehicle.) Fig. 2d shows examples of vehicle size
variations, and Fig. 2e shows an AGV procured and used by NIST with an added
onboard robot arm (mobile manipulator) being used for performance test method
development for assembly tasks. All of the vehicles require docking with varying
levels of precision; for example, the NIST mobile manipulator requires much less
docking uncertainty than a typical unit load or tugger vehicle because the vehicle

FIG. 2 Top view of example AGV size variability: (a) low-profile, (b) industrial tug [4],
and (c) container AGV. (d) Vehicle size variability examples. (e) NIST mobile
manipulator being used for performance test method development for
assembly tasks.

position can be compensated by the onboard manipulator. One concept for generic docking, shown in Fig. 3, is to command the vehicle to access a point (a) followed by a second point (b), or to contact both point (a) and point (b) simultaneously, as in the Fig. 3 (right) photo showing two forklift tines docking to an apparatus at the same time. The taped points on the tines are to align with the apparatus repeatedly, with the uncertainty measured from each tape point to its target center recorded. Various tine heights could also be measured.
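For the two-point docking concept in Fig. 3, each trial could be reduced to per-point offsets and an implied yaw error, as in the sketch below; the coordinate convention and function name are illustrative assumptions, not part of the working document.

```python
import numpy as np

def docking_error(tine_pts_mm, target_pts_mm):
    """Per-trial docking error for a two-point dock: the offset from each
    taped tine point to its target center, plus the yaw error implied by
    the line through the two points."""
    t = np.asarray(tine_pts_mm, dtype=float)    # measured [[xa, ya], [xb, yb]]
    g = np.asarray(target_pts_mm, dtype=float)  # target centers (a) and (b)
    offsets = t - g                             # per-point (dx, dy) in mm
    yaw_deg = np.degrees(
        np.arctan2(t[1, 1] - t[0, 1], t[1, 0] - t[0, 0])
        - np.arctan2(g[1, 1] - g[0, 1], g[1, 0] - g[0, 0]))
    return offsets, yaw_deg

# Example: tines stopped 3 mm to one side of targets spaced 500 mm apart.
off, yaw = docking_error([[3.0, -1.0], [503.0, -1.0]],
                         [[0.0, 0.0], [500.0, 0.0]])
print(off, yaw)   # repeat over trials to estimate docking uncertainty
```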
The ASTM F45.02 subcommittee has also received further recommendations
toward standards developments for docking and navigation. Specifically, three
questions were documented and distributed to the committee to foster discussion
toward supporting current or developing new working documents:
1. With what accuracy does the AGV need to stop at dock/assembly mating locations?
• Pallets (low or high pick/place)—least accuracy needed
• Tray stations, International Organization for Standardization (ISO)
lock insertion—more accuracy needed
• Peg/part insertion (sheet goods, long rods, etc.) into assemblies—high
accuracy needed
• Pick up/place delicate equipment—high accuracy needed

FIG. 3 Example docking test method showing (left) block vehicle and apparatus dock
points (a) and (b) for docking tests using various AGVs. Points (a) and (b) are
fixed points in a facility or on an apparatus as shown in the photo (right).
Approach vectors and sensor point spacing and locations are variable.

2. How accurately does the AGV need to navigate?


• Straight or curved paths
• Ackerman, all-wheel, or crab/quad steering
• What is the tightest turning radius at various speeds?
• When programmed to make the tight turn, does it actually accom-
plish it or navigate a different curve?
• Between error-correcting fiducials or markers (e.g., inertial, magnets,
radio frequency identification [RFID], etc.)
• Between obstacles, racks, other infrastructure
3. What does the vehicle do when it senses a human versus another obstacle?
The ASTM F45.91 subcommittee task group has started working on a terminol-
ogy document to define typically used terms within the AGV and mobile robot
industries. Initial document development began with terms defined by three organi-
zations: ANSI/ITSDF, Material Handling Industry of America [4], and ISO/FDIS
8373:2011 [5]. The terminology from this document will include much of the
language used in other subcommittee documents.
Additional questions and parameters were also distributed for input and com-
ment for ASTM F45.01, ASTM F45.03, and ASTM F45.04 subcommittees to foster
standard test method developments, including:
F45.01 Environmental Effects
1. How fast and capable is the vehicle to navigate in the following environments?
• Indoor, under conditions such as
• Temperature (e.g., freezer)
• Lighting (e.g., none, sunlight)
• Humidity (e.g., none, wet)
• Dust/dirt
• Outdoor, under conditions such as
• Temperature (e.g., extreme heat, cold)
• Lighting (e.g., day, night)
• Humidity/precipitation (e.g., dry, rain, snow)
• Fog, smoke
• Dust/dirt
• Surfaces
• Smooth/rough terrain
• Floor gaps
• Dusty/dirty
• Wet
• Surface Slope
• Level
• Slope angle > 0?
• Areas
• Defined
* Walls
* Obstacles (safety guards, rails, columns, etc.)
* Other agents

• Open
• Entrance and exit to/from areas
* Softwall curtain partitions
* Automated doors
* Open doorway spaces

• Interaction with other agents
• Humans working on the line
• Humans operating other machinery
• Humans working side-by-side with collaborative robots
• Humans operating/programming the vehicle
• Other vehicles performing similar tasks
2. What procedure is required to implement the vehicle within an environment
with the characteristics outlined above?
• Map of the space
• Manually program/provide map to the vehicle
• Drive vehicle around the space to build its own map
• Augment space with path following/boundary edge markers
• Modifications to task space (activities aside from navigation)
F45.03 Object Detection and Protection
1. How well does the vehicle react to situations?
• Obstacles appearing in the path
• Potential obstacles headed toward the path
• Unstructured areas not on the original planned path or that rapidly
change

2. What conditions cause the vehicle to violate its commanded path?
• Offset-pitched/rolled vehicle can’t see reflectors, magnets, wire, etc.
• Detection tape is worn or broken
• Terrain causes “bouncing” or unintended moving of navigation
sensors
3. Human detection
• Represented by test pieces, mannequins, humans
• Coverings (e.g., clothes worn)
4. Interaction with manual operations (e.g., forklifts, machines)
5. Intelligence
• Autonomy level (e.g., based on the Autonomy Levels for Unmanned Systems [ALFUS] framework [6])
• Situation awareness (e.g., LASSO [7])
• Location (where it is within a global map; orientation with respect
to landmarks?)
• Activities (what activities it is performing or should be; progress
toward completing its task/mission?)
• Surroundings (what obstacles are nearby, what type of terrain it is
on, local information?)
• Status (battery health, damaged sensor, askew camera, current
mode)
• Overall mission (total progress toward completing a task/mission
involving all agents; i.e., for these purposes the entire manufac-
turing floor or a particular group working on the same overall
task)
F45.04 Communication and Integration
1. What is the communications integrity?
• How well does the vehicle function with intermittent or complete
communications loss?
2. Systems integration
• Addition of equipment, sensors, algorithms
• Autonomous/manually reconfigurable
• Assistance using sensors (e.g., RFID) or specific factory clothes
worn by workers
• Given the type of power system, such as AC or DC, low-voltage or
high-voltage, lead-acid/sealed-lead acid batteries/hydrogen fuel cell,
etc. and given payload, daily use, system longevity, environmental
effects:
• Mean time between failures/maintenance?
• Mean time between battery charge?
• Synchronization among vehicles
• Wait to pick up load
• Not cause congestion
• Reliability—fewer faults
• Reduced dependence on operators
• Maintenance

• Diagnostics
• Changes
• Repairs

Recommendations for Development of Standards within ASTM Committee F45
A discussion was held during the closing session of the ICRA “Autonomous Indus-
trial Vehicles: From the Laboratory to the Factory Floor” workshop. The discussion
session was an open forum for workshop presenters and attendees to discuss their
views on developing standards within ASTM Committee F45 for AGVs and mobile
robots. The response was impressive with much participant interaction. The work-
shop hosts captured responses in bulleted form and displayed the written responses
on screen during the discussion for audience viewing. The responses were as
follows:
• Representative facility components within test methods:
• How closely related do the components need to be to real-world objects?
• Use test piece coatings that represent worst-case scenarios for sensing
• Physical relationships between facility components should be relevant to
the application, task, system, etc.
• Performance test methods vs. safety test methods
• If safety standards don’t include test methods, perhaps performance test
methods should be standardized for the “safety” situations
• Performance standards should “dovetail” with safety standards
• Obstacle detection and avoidance:
• Exists in two forms: “stop” or “drive around obstacles”
• Multiple vehicles (e.g., 2 vs. 10 vs. 100); how to test multiple vehicles when
manufacturers and users don’t have so many vehicles?
* Develop standards for virtual modeling of vehicles so that the test does not require many vehicles
• Dynamic routing provides variable options for avoiding obstacles
• Vehicle-to-vehicle (local) communication vs. vehicle-to-central controller
(broad) communication
• Understand the environment—environments that remain the same
(floor, dust, light, infrastructure, communication, etc.) vs. environments that
change
• Autonomy and learning—how to measure?
• Over long and short periods of time
• Need standard basis for the performance of perception systems
• Testing mapping accuracy and the repeatability of created maps
• Mean time to failure for communication between vehicles and central
controllers
• Communication interference measurement
• “Heartbeat” communication from central controller/monitor vs. no need
for continuous communication (e.g., intelligent vehicles)

• Test methods should not be designed specifically for a particular vehicle sys-
tem and should instead allow any vehicle design the developers choose to be a
viable option
• Networking should be standardized for connections to vehicles so that all systems
installed in a facility can communicate regardless of the network or manufacturer
• Building integration/interface standards—should there be a working group in
this area?
* Similar to elevators and fire doors
* Standards that allow vehicles to adapt to the facility, including communi-

cating with any of the facility components


• Central vs. decentralized vehicle control performance comparison
• Measure communication—level of autonomy dependent
• For example, a remote switch for safety—how reliable is it?
• Navigation with or without physical markers or fiducials
• Don’t over-specify how vehicles navigate—should it be absolute or relative
accuracy?
• Is it based on accuracy, speed, etc.?
• Standards for communications interfaces to robots, vehicles, and facility sensors
• Standard interfaces and data sets of facilities—warehouses, hospitals, etc., used
to allow manufacturers to develop and test their vehicle systems prior to inte-
gration into facilities
• Standard benchmarks and standard testbeds to support this industry
• Integration of multiple-vendor components and vehicles
* There is not only one vehicle system; therefore, integration of components and vehicles from multiple manufacturers needs to be demonstrated
* Eliminate friction in the adoption of autonomous vehicles by providing open source
• Develop generalized test methods to test the relevant part or activity of the sys-
tem so that the component, system, etc., performance can be measured as
compared to the task
* Can’t test every possible combination of the system as compared to a task,

therefore generalize the test method to capture the most important aspects
• ASTM E54.08.01 [8] and other standards can be used as a good model for ve-
hicle performance standards development
The workshop presentations and closing discussion identified several areas that had not previously been considered for standards development. The enthusiasm of the workshop presenters and attendees demonstrated an obvious need for developing new industrial vehicle performance standards, as well as the components (e.g., communication/network, virtual test data sets, testbed facilities, etc.) that support these systems.

Summary and Conclusions


NIST and ASTM organized a workshop called “Autonomous Industrial Vehicles:
From the Laboratory to the Factory Floor” to bring together representatives from
the research, industrial, and standards communities. The workshop was designed to

promote autonomous vehicle developments by highlighting existing and emerging autonomous vehicle implementations as examples to inspire attendees to consider standards that could benefit the community. Three working documents for potential ASTM F45 standards are currently being developed. A postpresentation discussion session allowed workshop attendees to provide input for new ASTM F45 and other standards document developments. This enthusiastic session provided a continuous flow of brainstorming ideas that, in summary, can be organized under the following key topics, which directly match the ASTM F45 subcommittee thrusts (terminology excepted):
• Environmental Effects
• Docking and Navigation
• Object Detection and Protection
• Communication and Integration
Other key areas identified included standardized building infrastructure protocols, networking, testbeds, and other important standards development areas.
Future efforts will utilize this workshop summary to develop new standard per-
formance test methods for autonomous industrial vehicles.

ACKNOWLEDGMENTS
The author would like to thank the IEEE International Conference on Robotics and
Automation “Autonomous Industrial Vehicles: From the Laboratory to the Factory
Floor” workshop attendees and participants. Their feedback and support for the work-
shop provided necessary standard development focus. As well, the author would like
to thank Sebti Foufou, University of Qatar, for his editorial guidance.

References
[1] Institute of Electrical and Electronics Engineers (IEEE) International Conference on Robotics and Automation, Seattle, WA, May 26–30, 2015, http://icra2015.org
(accessed April 3, 2016).
[2] ASTM Committee F45 for Driverless Automatic Guided Industrial Vehicles, ASTM Inter-
national, West Conshohocken, PA, 2015, www.astm.org
[3] ANSI/ITSDF B56.5:2012, Safety Standard for Driverless, Automatic Guided Industrial
Vehicles and Automated Functions of Manned Industrial Vehicles, Industrial Truck Stand-
ards Development Foundation, Washington, DC, 2012, www.itsdf.org
[4] Material Handling Industry of America, “Glossary, Automatic Guided Vehicle Systems,”
MHI, Charlotte, NC, 2014, www.mhi.org/glossary (accessed April 3, 2016).
[5] ISO/FDIS 8373:2011(E/F), Robots and Robotic Devices—Vocabulary, International Orga-
nization for Standardization, Geneva, Switzerland, 2014.
[6] Huang, H.-M., Messina, E., Wade, R., English, R., Novak, B., and Albus, J., “Autonomy
Measures for Robots,” ASME 2004 International Mechanical Engineering Congress
and Exposition, American Society of Mechanical Engineers, New York, NY, 2004,
pp. 1241–1247; Proceedings of AUVSI's Unmanned Systems North America 2005.
[7] ISO/FDIS 18646-1, Robots and Robotic Devices—Performance Criteria and Related
Test Methods for Service Robots—Part 1: Locomotion for Wheeled Robots, International
Organization for Standardization, Geneva, Switzerland, 2016 (in review), http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=63127
[8] ASTM E54.08.01, Robots for Urban Search and Rescue, Performance Metrics and Stand-
ards, ASTM International, West Conshohocken, PA, 2015, www.astm.org
