
by

Dr. Shantipal S. Ohol


Associate Professor,
Department of Mechanical Engineering,
College of Engineering Pune (COEP),
MS, INDIA

Content
1. Autonomous Mobile Robots
   1.1 Need and applications
   1.2 Sensing
   1.3 Localisation
   1.4 Mapping
   1.5 Navigation and control
2. The Basics of Autonomy
3. Robot Navigation
4. Embedded electronics

Autonomous Mobile Robots – need and applications, sensing,
localisation, mapping, navigation and control.
The Basics of Autonomy (Motion, Vision and PID), Programming
Complex Behaviors (reactive, deliberative, FSM), Robot Navigation
(path planning), Robot Navigation (localization), Robot Navigation
(mapping), Embedded electronics, kinematics, sensing, perception,
and cognition

Autonomous Mobile Robots

Autonomy is the ability to make one's own decisions.

• An autonomous robot (also called an autorobot or autobot) is a robot that performs tasks with a high degree of autonomy, i.e. without human intervention.

• Autonomous robots are expected to carry out specific tasks in dynamic and unstructured environments by adapting to the environment and learning beyond what the designer or the user programmed.

• Autonomous robotics technology is based on artificial intelligence.

• Autonomous robots can recognize and learn from their surroundings and make decisions independently.

Need for Autonomous Mobile Robots:

1. The global autonomous mobile robots market was worth USD 1.67 billion in 2020 and is projected to grow from USD 1.97 billion in 2021 to USD 8.70 billion in 2028, at a CAGR of 23.7% over the forecast period.

2. Robotics can help automate tasks that are repetitive, dangerous, or vulnerable to human error. However, automation without intelligence creates a system that cannot respond to variables, new environments, or dynamic requirements.

Areas of development in Autonomous Robots:
✓ Artificial intelligence
✓ Navigation
✓ Efficient sensors
✓ Cost

Advantages of Autonomous Robots:
✓ Increased efficiency / productivity
✓ Increased quality / reduced rework
✓ Improved safety / lowered risk
✓ Performing low-value operations, freeing humans for meaningful work

Types of Autonomous Robots
1. Programmable
2. Non-programmable
3. Adaptive
4. Intelligent

A programmable robot is a first-generation robot with an actuator facility on each joint.

A non-programmable robot is sometimes not even considered a robot, as it lacks a reprogrammable controlling device. The mechanical arms used in industries, generally attached to programmable devices, are examples of this type of robot.

Adaptive robots are also industrial robots, but ones that can adapt independently to various ranges in the process. They are more sophisticated than programmable robots: they can adapt up to a certain extent and, after evaluation, perform the action required in that adapted area. These robots are mostly equipped with sensors and control systems.

Intelligent robots have situation-based analysis and task-performing abilities. They can sense stimuli such as pain, smell and taste, are also capable of vision and hearing and, accordingly, perform actions and expressions such as emotions, thinking and learning.

Applications of Autonomous Mobile Robots

Applications of fully autonomous robots include:

• Lawn-mower robots
• Swimming pool cleaning robots
• Floor cleaning robots
• Entertainment robots
• Companion robots

Semi-autonomous mobile robots also have a large potential market, from rescue robots to robots for the maintenance of nuclear plants, and they have several military applications, such as reconnaissance flying drones.

Mobile Robot Navigation
• Sensing
• Localisation and Mapping
• Cognition / Navigation
• Motion Control

[Figure] Courtesy: https://smprobotics.com/technology_autonomous_mobile_robot/

Sensors used for mobility

Typical sensors used in ground mobile robots and drones include:
1. Inertial measurement units (IMUs)
2. GPS
3. Laser sensors
4. Encoders
5. Vision sensors
6. Ultrasonic sensors
7. Pulsed and millimeter wave radars

Inertial measurement units (IMUs) typically combine multiple accelerometers and gyroscopes; they can also include magnetometers and barometers. The instantaneous pose (position and orientation) of the robot, its velocity (linear, angular), acceleration (linear, angular) and other parameters are obtained through the IMU in 3D space. Advances in MEMS sensor technology have benefitted IMUs significantly, but IMUs suffer from drifts, biases and other errors.

The Global Positioning System (GPS) provides latitude, longitude and altitude information. Over the years, GPS accuracy has increased significantly, and highly accurate modes, such as RTK, also exist. GPS-denied areas, such as indoor areas and tunnels, together with slow update rates, remain GPS's top limitations. Nevertheless, GPS receivers are important sensors for outdoor mobile robots and provide an accurate periodic reference.

Depending on whether they are used on indoor or outdoor robots and on the speed at which the robot moves, laser sensors can vary significantly in price, performance, robustness, range and weight. Most are based on time-of-flight principles; signal processing is performed to output points with range and angle increments. Both 2D and 3D lasers are useful. Laser sensors produce a lot of data, one point for each individual laser return, so taking full advantage of lasers requires substantial compute power. Lidars are also very popular in mapping.
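To make the time-of-flight principle concrete, here is a minimal illustrative sketch (not part of the original slides): the range follows from half the round-trip time of the pulse, and a 2D scan of (range, angle) pairs can be converted to Cartesian points for mapping. The constants and function names are illustrative assumptions.

import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range(round_trip_time_s: float) -> float:
    """Range from time of flight: the pulse travels out and back,
    so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def scan_to_points(ranges_m, angle_min_rad, angle_increment_rad):
    """Convert a 2D laser scan (ranges at fixed angular increments)
    into (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges_m):
        theta = angle_min_rad + i * angle_increment_rad
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Example: a pulse returning after ~66.7 ns corresponds to ~10 m.
print(round(tof_range(66.7e-9), 2))

The same half-round-trip idea applies to ultrasonic sensors, with the speed of sound in place of the speed of light.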

Encoders count the precise number of rotations of the robot wheels, thereby estimating how far the robot has travelled. The terms odometry or dead-reckoning are used for distance calculation with wheel encoders. They suffer from long-term drifts and hence need to be combined with other sensors (see the sketch after this slide).

Vision sensors, such as 2D and 3D cameras as well as depth cameras, play a very critical role in AMRs. Computer vision and deep learning on the sensor data can aid object detection and avoidance, obstacle recognition and obstacle tracking. Visual odometry and visual SLAM (simultaneous localization and mapping) are becoming more relevant for autonomous robots operating in both indoor and outdoor environments where lighting conditions are reasonable and can be maintained. 3D, depth and stereo vision cameras provide the pose, i.e., position and orientation, of an object in 3D space. In industrial environments, well-established machine vision techniques combined with pose can help solve a number of problems, from grasping to placement to visual servoing. Thermal and infrared cameras are used when working in difficult lighting conditions, such as the dark or fog.

If there is an object in the range of an ultrasonic sensor pulse, part or all of the pulse will be reflected back to the transmitter as an echo and can be detected through the receiver. By measuring the difference in time between the transmitted pulse and the received echo, it is possible to determine the object's range. Sonars are impacted by multipath reflections.

Pulsed and millimeter wave radars detect objects at long range and provide velocity, angle and bearing parameters, typically measured to the centroid of the object. They work in all weather conditions, while most other sensors fail in complex environments such as rain, fog and lighting variations. But their resolution is limited compared to lidar or laser.

Robust and accurate localization schemes combine data received from IMUs, wheel encoders, GPS, laser, radar, ultrasonic and vision sensors in software algorithms to implement SLAM techniques. Depending on the application and the specification of navigation and object avoidance, the fusion can be limited to a few sensors or use all sensors.
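As a concrete illustration of odometry / dead-reckoning with wheel encoders, below is a minimal sketch for a differential-drive robot. The wheel radius, wheel base and ticks-per-revolution values are made-up assumptions, and this is a first-order integration rather than a production implementation.

import math

TICKS_PER_REV = 1024   # encoder ticks per wheel revolution (assumed)
WHEEL_RADIUS = 0.05    # wheel radius in metres (assumed)
WHEEL_BASE = 0.30      # distance between the two wheels in metres (assumed)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one encoder interval into the planar pose (x, y, theta).
    Drift accumulates over time, which is why odometry is normally
    fused with other sensors."""
    # Distance travelled by each wheel over the interval
    d_left = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    d_center = (d_left + d_right) / 2.0        # forward motion of the robot
    d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading
    # First-order (midpoint) update of the pose
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Example: both wheels advance 512 ticks, so the robot drives straight ahead.
print(update_pose(0.0, 0.0, 0.0, 512, 512))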

Localization of Mobile Robots
The robot should localize itself with respect to a map of its environment, which is better than merely tracking its motion using odometry data. For localization, the robot utilizes its exteroceptive sensors, such as laser, vision and ultrasonic sensors, to make observations about its environment.

Ref: Autonomous Mobile Robots, Siegwart et al., 2011

The Challenge of Localization: Noise and Aliasing
The existing GPS network provides accuracy to within several meters, which is unacceptable for localizing
human-scale mobile robots as well as miniature mobile robots such as desk robots and the body-navigating
nanorobots of the future. Furthermore, GPS technologies cannot function indoors or in obstructed areas and are
thus limited in their workspace.

If the robot intends to reach a particular location, then localization alone may not be enough. The robot may need to acquire or build an environmental model, a map, that aids it in planning a path to the goal.

Thus, localization means more than simply determining an absolute pose in space; it means building a map, then identifying the robot's position relative to that map.

1. Sensor noise
Sensor noise induces a limitation on the consistency of sensor readings in the same environmental state and,
therefore, on the number of useful bits available from each sensor reading.

Sensor noise reduces the useful information content of sensor readings. Clearly, the solution is to take multiple
readings into account, employing temporal fusion or multisensor fusion to increase the overall information
content of the robot’s inputs.
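As a small illustration of temporal fusion (an illustrative sketch, not the slides' prescribed method), repeated noisy readings of the same static quantity can be combined with a running average, which reduces the variance of the estimate roughly in proportion to the number of independent readings. The true range and noise level below are made up.

import random

def fuse_readings(readings):
    """Incremental (running) mean of a series of readings."""
    estimate = 0.0
    for n, z in enumerate(readings, start=1):
        estimate += (z - estimate) / n   # incremental mean update
    return estimate

random.seed(0)
true_range = 2.00                        # metres (illustrative)
noisy = [true_range + random.gauss(0.0, 0.05) for _ in range(50)]
print(f"single reading: {noisy[0]:.3f} m, fused estimate: {fuse_readings(noisy):.3f} m")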

2. Sensor aliasing

The characteristics of mobile robot sensors often cause them to yield little information content, further exacerbating the problem of perception and, thus, localization. This problem is known as sensor aliasing: distinct positions in the environment can produce identical sensor readings.

The problem posed to navigation by sensor aliasing is that, even with noise-free sensors, the amount of information in a single percept is generally insufficient to identify the robot's position. Thus, the robot programmer must employ techniques that base the robot's localization on a series of readings, accumulating sufficient information to recover the robot's position over time, as in the toy example below.
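The following toy 1D histogram (Markov) localization sketch illustrates the idea; the corridor map, sensor model probabilities and motion model are illustrative assumptions. A single 'door' observation is ambiguous between two cells, but a series of observations taken while moving concentrates the belief on one position.

corridor = ['door', 'wall', 'door', 'wall', 'wall']   # identical-looking doors cause aliasing

def measurement_update(belief, observation, hit=0.8, miss=0.2):
    """Weight each cell by how well the observation matches the map, then normalize."""
    weighted = [b * (hit if cell == observation else miss)
                for b, cell in zip(belief, corridor)]
    total = sum(weighted)
    return [w / total for w in weighted]

def motion_update(belief, step=1):
    """Shift the belief as the robot moves 'step' cells (wrap-around world)."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

belief = [1.0 / len(corridor)] * len(corridor)    # start fully uncertain
for obs in ['door', 'wall', 'door']:              # what the robot senses while moving
    belief = measurement_update(belief, obs)
    belief = motion_update(belief)
print([round(b, 3) for b in belief])              # belief now peaks at a single cell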

3. Effector noise

Robot effectors are also noisy; for example, a single action taken by a mobile robot may have several different possible results.

Localisation and Mapping
• In order for the robot to navigate successfully, it must determine its position in the workspace, so localization, together with perception and motion control, is a key issue in robot navigation.
• Localization is closely related to representation. If an accurate GPS system could be installed on a robot, the localization problem would be solved: the robot would always know where it was. But at the moment, such a system is not available or is not accurate enough to work with. In any case, localization implies not only knowing the robot's absolute position on Earth but also its relative position with respect to a target.
• The control system also plays a role. If the robot intends to reach a specific location, it needs an environment model, or map, so that it can plan a path to reach the target. This means that localization is a broad issue that includes not only determining the absolute position of the robot in space but also building a map and determining the robot's position relative to that map. Therefore, sensors (the perception system) are fundamental to the task of localization. Any inaccuracy or incompleteness of the sensors will affect the robot's localization; sensor noise and aliasing also reduce the useful information picked up. Uncertainty and error must be minimized so that the sensors can help in mapping the robot and its environment.

Ref: "A review of mobile robots: Concepts, methods, theoretical framework, and applications" by Francisco Rubio, Francisco Valero and C. Llopis-Albert, International Journal of Advanced Robotic Systems.

• Finally, it is by means of localizing on a map that the robot tries to recover its position and detect when it has reached the target location. A key issue for map-based localization is representation.
• The techniques need to identify and distinguish between permanent, static
obstacles and moving obstacles. The real world is dynamic, so real obstacles
can move. Estimating the motion vector of transient objects is still a research
problem.
• Another challenge is that of wide open spaces such as parking lots, fields, and indoor atriums such as those found in conference centers. They pose a difficulty due to their relative sparseness. A classic example involves occlusion by human crowds. The issue of sensor fusion must also be addressed, as it strengthens the resulting percepts far beyond any individual sensor's readings.
• The ideal scenario would be that the robot’s sensors together with the mapping
strategy should immediately identify its particular location, uniquely and
repeatedly.
