
AUTOMATION AND LOCALIZATION OF A ROBOTIC CAR

Michelle M. Santos 1, Elias José R. Freitas, Matheus N. W. Vinti, Paulo Iscold, Leonardo A. B. Torres,
Guilherme A. S. Pereira

Grupo de Pesquisa e Desenvolvimento de Veículos Autônomos – PDVA,


Escola de Engenharia, Universidade Federal de Minas Gerais,
Belo Horizonte, MG, 31270-010 Brazil.
1 michelle@cpdee.ufmg.br

Abstract: The development of a vehicular robotic platform equipped with an in-house-built drive-by-wire system is presented. Different sensor fusion algorithms and sensor configurations, used to estimate the vehicle trajectory, were tested. It is shown that the best configuration was achieved with acceleration, steering wheel angle, and GPS measurements.

Keywords: Localization, Autonomous Vehicle, Robotics.

1. INTRODUCTION

In the mobile robots field, several research groups have been focused on the development of autonomous vehicles. Following these efforts, the Research and Development of Autonomous Vehicles Group (PDVA), from UFMG, works on a project which aims to develop a robotic platform based on a standard automobile. Besides achieving the final goal of having an autonomous vehicle, several sub-products, such as driver assistance systems and safety equipment, may be generated during the project development.
This article presents the current development of the platform by describing all the developed hardware, including the driving, braking, and acceleration mechatronic systems, and shows a combination of Kalman Filter-like techniques and different models for robot pose estimation.
The development of a car with automated functions may contribute to many applications, such as safety, for example through the design of systems able to detect dangerous situations and warn the driver; comfort, by facilitating driving or by informing the driver of the vehicle's location on a map; or accessibility, by helping people with special needs. A car endowed with instrumentation and control systems may also be used as an educational environment for teaching robotics and control.
Research and development of autonomous vehicles is rising in many universities throughout the world. Some initiatives, like the DARPA Grand Challenge, aim to accelerate the advances in this area. This competition was created in 2003 with the objective of showing that computer-controlled vehicles can cover a significant distance without human intervention. The three DARPA Grand Challenges organized so far have provided several advances in robotics science.
One of the cars that participated in the 2007 DARPA Grand Challenge, which was an urban race, was the Stanford University vehicle, named Junior [1]. Junior was developed from a 2006 Passat Wagon and was equipped with a Global Positioning System (GPS), an Inertial Navigation System (INS), and steering encoders. It also had a group of laser range finders capable of identifying landmarks on the racecourse, increasing the precision of the localization. The sensor fusion algorithm used for vehicle localization was the UKF (Unscented Kalman Filter), which provides information to the path planning system and to the control system. The results of the challenges show the success of the vehicle: Junior took second place in the 2007 competition, and its "older brother", Stanley, won first place in the 2005 DARPA Grand Challenge, which was an off-road competition.
The localization task is one of the most important steps in autonomous vehicle development. It consists of obtaining precise information about the position and orientation of the vehicle from noisy information provided by different sensors, which can be redundant or complementary. To combine the information provided by the sensors, several techniques may be used. The most common ones are recursive estimators, such as the Kalman Filter, the Information Filter, or the Particle Filter and their variations [2].
A very common set of sensors used for localization in external environments is composed of a GPS and an Inertial Measurement Unit (IMU). Along with [1], the works developed by Caron et al. and Zhou et al. present examples of systems with such a set of sensors [3], [4].
Another way to increase the precision of the localization is the use of computer vision to find and identify landmarks in the environment. By using a camera attached to the vehicle it is possible to know, for example, the relative pose between the vehicle and the landmark and, thus, assuming a known landmark, the vehicle location. This technique was used, for example, in [5].
The work of Roumeliotis and Bekey [6] describes an implementation that combines the information of relative sensors, like potentiometers that measure the wheel angles, gyrometers and encoders, with the information of a global sensor which provides the vehicle orientation based on the Sun position. The sensor fusion was made through the Extended Kalman Filter algorithm (EKF) in a sequential way, supplying the localization system with information based on the encoders when the robot does not see the Sun.


The localization system presented in this work is based on the combination of GPS and IMU, as was done in [1], [3], and [4]. We are currently using the Extended Kalman Filter, as in [4] and [6]; the latter inspired us to use the sequential version of the filter to deal with the infrequent data from the GPS sensor.
The paper is organized as follows: Section 2 presents a hardware description of the system formed by the vehicle, sensors and actuators; Section 3 presents the localization task based on sensor fusion with different models; and Section 4 presents the conclusions and future work.

2. SYSTEM DESCRIPTION

The car used in this work is a 2003/2004 Chevrolet Astra, endowed with hydraulic steering, automatic transmission, electronic accelerator, and ABS brakes controlled by electronic control units (ECU) interconnected by the CAN bus. These characteristics facilitate the automation of the vehicle, as will be shown next.

2.1. Actuators

The first stage of the project consists of installing the devices that allow a computer to act, in real time, on the vehicle's operational conditions.
The development of an in-house-built system was the choice in this project since it significantly reduces the total cost compared with OEM parts designed for this purpose. Another advantage is the development of an assembly that fits the specific needs of the project and that is adapted to the car's peculiarities. In order to improve safety and to facilitate testing, the system was implemented in a way that the driver may assume control whenever needed. Also, each subsystem is independent from the others so that they can be tested independently. Therefore, besides the main switch that disables the whole automatic system and allows the user to drive manually, individual switches for each actuator are available.
The driver's inputs in a car are received by the steering wheel, the pedals (clutch, brake and accelerator) and the gear stick. In this work, the vehicle's transmission is automatic, so there is no clutch pedal. The other commands were treated individually and different solutions were developed for each one, as shown next.

2.1.1. Electronic Circuits for Actuator Control

A basic kit for the actuator control circuits was developed, allowing faster implementations and making circuit modifications easy. It consists of two printed circuit boards (PCB). The first one, shown in Figure 1, contains a Microchip PIC18F2455 microcontroller [7], the basic circuit for the PIC's operation and a connector which makes the main ports of the PIC available to an expansion board. In addition, it enables the communication between the computer and the device through a USB port. The second board is a double-sided universal PCB, used for expansion circuits. Figure 2 shows it attached to the microcontroller board. The circuit on this board is specific to each actuator.

Figure 1: PCB with the PIC18F2550 microcontroller.

Figure 2: Expansion PCB attached to the PIC for accelerator control.

2.1.2. Accelerator

The accelerator hardware was designed using the PCB kit described in the previous section. It simply replaces the electric signal from the accelerator pedal by a filtered PWM signal, sending it to the Electronic Control Unit (ECU) of the vehicle.
The microcontroller receives the acceleration command via USB communication and generates the PWM signal. A switch was implemented to select between two accelerator modes: (1) manual mode, in which the pedal signal is sent to the ECU, and (2) automatic mode, in which the acceleration command is sent by the software.
For safety purposes, a watchdog timer was implemented in the PIC. If any failure in the data transmission occurs (e.g., a shutdown of the host computer), the automatic accelerator system is disabled and the PIC waits for a new user connection. The manual mode (through the pedal) is also re-enabled.
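To make the mode selection and watchdog behaviour concrete, the sketch below shows, in plain C++, one way the decision logic just described could look. It is only an illustration: the timeout value, the duty-cycle range and all identifiers are assumptions, not taken from the firmware that actually runs on the PIC.

```cpp
#include <cstdint>

// Illustrative constants; the real firmware values are not given in the paper.
constexpr uint32_t kWatchdogTimeoutMs = 500;  // assumed timeout for host commands

struct AcceleratorState {
    bool     automaticMode = false;   // selected by the physical switch
    uint8_t  lastUsbCommand = 0;      // 0..255 duty cycle received over USB
    uint32_t msSinceLastUsbPacket = 0;
};

// Decides which duty cycle is forwarded to the vehicle ECU on each firmware tick.
uint8_t accelerator_output(AcceleratorState& s, uint8_t pedalDuty, uint32_t elapsedMs)
{
    s.msSinceLastUsbPacket += elapsedMs;

    // Manual mode: the (filtered) pedal signal is simply passed through.
    if (!s.automaticMode) {
        return pedalDuty;
    }
    // Watchdog: if the host stops sending commands, drop back to the pedal
    // and wait for a new user connection.
    if (s.msSinceLastUsbPacket > kWatchdogTimeoutMs) {
        s.automaticMode = false;
        return pedalDuty;
    }
    // Automatic mode: forward the last command received over USB.
    return s.lastUsbCommand;
}
```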
2.1.3. Gear Stick

A Warner Electric linear DC actuator equipped with a potentiometer was attached to the gear stick to provide the displacement necessary to change the driving modes (neutral, park, forward and reverse). The actuator was previously calibrated using its potentiometer, linking each driving mode with its respective position.
The driver circuit for this actuator was designed using the basic kit of Section 2.1.1. An LMD18245 full-bridge power amplifier [8] was used to incorporate into the system all the circuit blocks required to drive and control the current in a brushed DC motor.
The PIC sends the direction and a PWM signal to the LMD18245 to drive the motor. This information also comes to the microcontroller through USB communication, but here the system also has push-buttons for manual control.
To ensure safety, an emergency button was added to the system, which has priority over the other actions in this driver. Each time it is pressed, the motor is immediately locked in its current position.
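As an illustration of how the calibrated potentiometer positions might be used, the following C++ sketch drives the linear actuator toward the position associated with a selected mode. The target readings, the dead band and the duty value are invented for the example; only the overall scheme (calibrated set points plus a simple position loop through the LMD18245) follows the description above.

```cpp
#include <cstdlib>

enum class GearMode { Park, Reverse, Neutral, Forward };

// Potentiometer readings associated with each mode by the calibration
// procedure described above. The numbers below are placeholders only.
int target_reading(GearMode m)
{
    switch (m) {
        case GearMode::Park:    return 120;
        case GearMode::Reverse: return 310;
        case GearMode::Neutral: return 500;
        case GearMode::Forward: return 690;
    }
    return 500;
}

struct DriveCommand {
    int direction;  // +1 extend, -1 retract, 0 hold
    int pwmDuty;    // 0..255 sent to the LMD18245 driver
};

// Simple bang-bang positioning with a dead band around the target reading.
DriveCommand gear_command(GearMode selected, int potReading)
{
    const int target   = target_reading(selected);
    const int error    = target - potReading;
    const int deadband = 5;   // assumed tolerance in ADC counts

    if (std::abs(error) <= deadband) return {0, 0};   // in position: hold
    return {error > 0 ? +1 : -1, 200};                // fixed duty toward the target
}
```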
2.1.4. Steering

The steering actuation is also made by an electric DC motor. Some experiments demonstrated that at least 30 Nm of torque is necessary to turn the steering system. An RE40 motor, from MAXON Motors, was selected to execute the task. The motor has a reduction gearbox and an encoder, which provides the control system with position information. The driver used is an EPOS 24/5, also from MAXON, which was considered the best choice for this application. The motor was attached behind the steering wheel, and the rotation is transmitted by gears and a chain, with a ratio of 3:1. The EPOS allows position, velocity, and current control of the motor. The set points for these variables may come from a computer through RS-232 serial communication.
2.1.5. Brake

The solution developed for the brake system was to act directly on the brake pedal. A 12 V motor and a gearbox were used, and the rotational movement was transmitted to the pedal by a lever. The result was full braking capability, since the lever reaches the pedal's maximum travel. The assembly was designed to accommodate the lever and the driver's foot on the same pedal, allowing the driver to brake the car in emergency situations, which guarantees safety during the test stage.
The motor command was implemented through the PIC and an MD03 power motor driver from Robot-Electronics [9]. The motor's power is controlled by using a PWM signal at a frequency of 15 kHz.
This command allows the motor to turn in both directions and the force applied to the brake pedal to be controlled through the current sense output of the MD03. The motor direction, the PWM signal and the user-motor interface are handled by the PIC, which is also responsible for ensuring safety in emergency and failure situations.

2.1.6. Testing Software

All the microcontroller boards and the steering controller were connected to a single laptop computer running Windows Vista. Software was developed for testing purposes, which enables the user to send commands to the actuators using a joystick. The software was built using C++. The buttons on the joystick select the actuators and the movement of the stick determines, for example, the position of the steering wheel or the acceleration of the car.
If one of the actuators is not ready to receive data or if one of the emergency buttons is pressed, the software aborts the data transmission and notifies the user.
Figure 3 shows a picture of the whole system as described in Section 2.1.

Figure 3: The whole actuation system.

2.2. Sensors

The vehicle's perception of the environment is made possible by several sensors. At this stage of development, the system uses one absolute sensor, the GPS (Global Positioning System), and some relative sensors, as described next.
The GPS receiver is a device that collects information from satellites with known positions. The observer position (x, y, z) can be obtained from at least three satellites. The quality of the measurement is directly related to the number of satellites the receiver can see, which can vary between 0 and 12. So, in environments where the sky view is blocked by obstacles, the measurement of position by a GPS receiver may be impossible. The receiver used here is the Garmin GPS18. This instrument outputs information at a 1 Hz sample rate and its precision is 15 m with a 95% confidence interval. The GPS information includes latitude and longitude, which are converted from degrees to meters using the UTM coordinate system.
The vehicle used in this work is also equipped with an IMU (Inertial Measurement Unit). This instrument outputs linear acceleration, angular rate, gyro angle and magnetic field by using accelerometers, gyroscopes and magnetometers as sensors. The model used here is a Crossbow AHRS400CD.
The measurement of the driving angle is made by an encoder attached to the actuation system, as mentioned before. From this measurement it is possible to obtain the steering wheel angle along the path. It is important to notice that this information is given by the actuation system itself, so no additional sensor is needed.
In order to relate the encoder reading and the steering angle, a calibration procedure was performed. During this procedure, the steering angle was measured on both front wheels. To obtain an equation which describes this relation, the mean of the two wheel angles was used, according to the Ackerman model [9], which will be detailed in Section 3.1. Figure 4 shows the result of the calibration.
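The calibration fit itself is not detailed in the paper; the sketch below shows one plausible way to obtain and apply such a relation in C++, assuming a straight-line fit (the curve in Figure 4 looks approximately linear, but the actual fitted form is not stated).

```cpp
#include <vector>

// One calibration sample: raw encoder count and the corresponding steering
// angle (mean of the two front wheel angles, in radians).
struct CalibrationSample { double encoder; double angleRad; };

// Least-squares fit of angle = a * encoder + b over the calibration samples.
// Assumes at least two samples with distinct encoder values.
void fit_calibration(const std::vector<CalibrationSample>& samples, double& a, double& b)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    const double n = static_cast<double>(samples.size());
    for (const CalibrationSample& s : samples) {
        sx += s.encoder;             sy += s.angleRad;
        sxx += s.encoder * s.encoder; sxy += s.encoder * s.angleRad;
    }
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    b = (sy - a * sx) / n;
}

// Converts a raw encoder reading into the virtual front wheel angle used by
// the Ackerman model of Section 3.1.
double encoder_to_angle(double encoder, double a, double b)
{
    return a * encoder + b;
}
```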

Fig. 4. Calibration Curve – Encoder Reading × Steering Angle (rad).

3. LOCALIZATION

The task of autonomous locomotion consists, basically, in answering two important questions: What is the vehicle's real position (localization problem)? Where does it have to go (planning problem)? From the answers to these questions, the control system can send command signals to the actuation system, which behaves in an appropriate way to achieve the goal.
Therefore, the estimation of the vehicle position over time, along with other states that are fundamental to guarantee the control system performance, is one of the most important steps for autonomous locomotion.
The purpose of this section is to discuss the use of different sensor fusion algorithms and different mathematical models in order to fulfill this estimation task with enough precision and an adequate sampling frequency to supply the control system with updated information.

3.1. Models

This subsection presents some dynamic models for the movement of a car. These models can be used in a sensor fusion algorithm for estimating the states of interest. The models presented here are based on the coordinate system shown in Figure 5.

Fig. 5. Coordinate System on the Car.

All the models used in this work are discrete kinematic models of the form

$$x(k) = A\,x(k-1) + B\,u(k-1) + w(k-1), \qquad (1)$$

$$y(k) = C\,x(k) + v(k), \qquad (2)$$

where $x(k)$ represents the state vector, $u(k)$ represents the inputs of the model, $y(k)$ is the measurement vector, $w(k)$ represents the process noise, and $v(k)$ is the measurement noise.
The state transition equation may be, for example, given by

$$x(k) = \begin{bmatrix} P_x(k) \\ P_y(k) \\ V(k) \\ \theta(k) \end{bmatrix} = \begin{bmatrix} P_x(k-1) + V(k-1)\cos(\theta(k-1))\,\Delta t \\ P_y(k-1) + V(k-1)\sin(\theta(k-1))\,\Delta t \\ V(k-1) + A_x(k-1)\,\Delta t \\ \theta(k-1) + \omega\,\Delta t \end{bmatrix}, \qquad (3)$$

where $P_x$ and $P_y$ are the vehicle position relative to a fixed reference frame, $\theta$ is the vehicle orientation, and $V$ is the robot velocity.
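As a concrete reading of Equation (3), the following C++ fragment propagates the state one step forward; it is only an illustration of the model, not code from the project.

```cpp
#include <cmath>

// State of the discrete kinematic model of Equation (3).
struct KinematicState {
    double px;     // position x [m]
    double py;     // position y [m]
    double v;      // forward velocity [m/s]
    double theta;  // orientation [rad]
};

// One prediction step of Equation (3): the IMU longitudinal acceleration ax
// and the angular rate omega are the inputs, dt is the sampling period [s].
KinematicState predict_kinematic(const KinematicState& x, double ax, double omega, double dt)
{
    KinematicState n;
    n.px    = x.px + x.v * std::cos(x.theta) * dt;
    n.py    = x.py + x.v * std::sin(x.theta) * dt;
    n.v     = x.v + ax * dt;
    n.theta = x.theta + omega * dt;
    return n;
}
```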
This model is very useful for representing non-holonomic robots and is mainly suitable for differentially driven robots, which can turn in place. For a car-like robot, an additional constraint on θ may be added to indicate that this behaviour is impossible. A second model, which includes the driving angle, is the Ackerman model [9]. In this model the vehicle is represented by a rectangular body over four wheels. The back wheels' orientation is fixed with relation to the body and the front wheels are actuated by the driving system (Figure 6).

Fig. 6. Ackerman Model.

The Ackerman model represents the steering angle by a virtual wheel located midway between the two front wheels. The virtual wheel angle φ is given by the mean of the two front wheel angles. The wheelbase (L) should be considered in the estimation of the vehicle orientation. The state transition equation is shown in Eq. (4):

$$x(k) = \begin{bmatrix} P_x(k-1) + V(k-1)\cos(\phi(k-1))\cos(\theta(k-1))\,\Delta t \\ P_y(k-1) + V(k-1)\cos(\phi(k-1))\sin(\theta(k-1))\,\Delta t \\ V(k-1) + A_x(k-1)\,\Delta t \\ \theta(k-1) + \dfrac{V}{L}\tan(\phi)\,\Delta t \end{bmatrix}. \qquad (4)$$
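A minimal C++ sketch of one prediction step of Equation (4), again for illustration only:

```cpp
#include <cmath>

// State used by the Ackerman model of Equation (4).
struct CarState {
    double px, py;   // position [m]
    double v;        // forward velocity [m/s]
    double theta;    // orientation [rad]
};

// One prediction step of Equation (4). phi is the virtual front wheel angle
// obtained from the calibration of Section 2.2, L is the wheelbase [m],
// ax the longitudinal acceleration [m/s^2] and dt the sampling period [s].
CarState predict_ackerman(const CarState& x, double ax, double phi, double L, double dt)
{
    CarState n;
    n.px    = x.px + x.v * std::cos(phi) * std::cos(x.theta) * dt;
    n.py    = x.py + x.v * std::cos(phi) * std::sin(x.theta) * dt;
    n.v     = x.v + ax * dt;
    n.theta = x.theta + (x.v / L) * std::tan(phi) * dt;
    return n;
}
```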
The models in Equations (3) and (4) were compared experimentally in sensor fusion algorithms that used three different combinations of models and sensors. These combinations are described in Section 3.3.

3.2. Sensor Fusion Algorithms

One of the main challenges of the mobile robot localization task is to combine the information provided by different sensors, with different properties and different sampling rates. The information fusion can be done by using various techniques. One of them is the Kalman Filter. The Kalman Filter (KF) is an optimal recursive algorithm to estimate unknown states of a dynamic linear system from noisy measurements (zero-mean white noise) and a discrete linear model of the system [10]. This model must be in the form of Equations (1) and (2), where x is the state vector, u is the input vector, and A, B and C are the model matrices. Variables w and v are the process and measurement noise, respectively, which are random, Gaussian, independent and zero-mean variables whose covariance matrices are Q and R.
From the model, the algorithm aims to minimize the root-mean-square estimation error. The KF algorithm is, basically, formed by two steps: prediction and update. In the prediction step, the states are calculated from the discrete model, which uses the estimate obtained at the previous time step (k-1) to estimate the states at the current time step (k). The estimate obtained in the prediction step is called the a priori estimate. The update step combines the a priori estimate and the measurement vector to obtain a new estimate, the a posteriori state estimate.
Since in this work we are dealing with a non-linear model, it is necessary to use an approximation of the KF. As a first approach we chose to use the Extended Kalman Filter (EKF), which approximates the model by a linearization around the current state estimate.
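Since the paper does not list the filter equations, the sketch below illustrates, in C++, an EKF prediction step for the model of Equation (3) together with a scalar ("sequential") measurement update of the kind referred to above; the fixed four-state layout and every identifier are choices made for this example only.

```cpp
#include <array>
#include <cmath>

constexpr int N = 4;                       // state: [px, py, v, theta]
using Vec = std::array<double, N>;
using Mat = std::array<std::array<double, N>, N>;

// EKF prediction: propagates the state with Equation (3) and the covariance
// with the Jacobian of that model.
void ekf_predict(Vec& x, Mat& P, double ax, double omega, double dt, const Mat& Q)
{
    const double c = std::cos(x[3]), s = std::sin(x[3]), v = x[2];

    // Nonlinear state propagation, Equation (3).
    Vec xp = {x[0] + v * c * dt, x[1] + v * s * dt, v + ax * dt, x[3] + omega * dt};

    // Jacobian F = df/dx evaluated at the current estimate.
    Mat F = {};
    for (int i = 0; i < N; ++i) F[i][i] = 1.0;
    F[0][2] = c * dt;  F[0][3] = -v * s * dt;
    F[1][2] = s * dt;  F[1][3] =  v * c * dt;

    // P <- F P F^T + Q
    Mat FP = {}, Pn = {};
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            for (int k = 0; k < N; ++k) FP[i][j] += F[i][k] * P[k][j];
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j) {
            for (int k = 0; k < N; ++k) Pn[i][j] += FP[i][k] * F[j][k];
            Pn[i][j] += Q[i][j];
        }
    x = xp;  P = Pn;
}

// Sequential (scalar-by-scalar) measurement update, in the spirit of [6]:
// each available measurement y = h.x + v, with variance r, is processed on
// its own, so measurements can simply be skipped when they are unavailable.
void ekf_update_scalar(Vec& x, Mat& P, const Vec& h, double y, double r)
{
    Vec Ph = {};
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j) Ph[i] += P[i][j] * h[j];

    double S = r;                          // innovation variance h P h^T + r
    for (int i = 0; i < N; ++i) S += h[i] * Ph[i];

    Vec K = {};
    for (int i = 0; i < N; ++i) K[i] = Ph[i] / S;   // Kalman gain

    double innovation = y;
    for (int i = 0; i < N; ++i) innovation -= h[i] * x[i];
    for (int i = 0; i < N; ++i) x[i] += K[i] * innovation;

    Mat Pn = {};                           // P <- (I - K h) P
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j) Pn[i][j] = P[i][j] - K[i] * Ph[j];
    P = Pn;
}
```

With this structure, a GPS fix would be applied as two scalar updates, one for Px with h = (1, 0, 0, 0) and one for Py with h = (0, 1, 0, 0), and simply skipped while no fix is available.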
3.3. Experiments

The two models in Section 3.1 were tested with some combinations of sensors and models. The first combination uses the kinematic model and the measurements from the GPS and the IMU. The acceleration in the x direction and the angular rate around the z direction provided by the IMU are used as inputs of the model. The positions in the x and y directions provided by the GPS receiver are used in the update step. The input and measurement vectors are then

$$u(k) = \begin{bmatrix} A_x(k) \\ \omega(k) \end{bmatrix}, \qquad (5)$$

$$y(k) = \begin{bmatrix} P_{xGPS}(k) \\ P_{yGPS}(k) \end{bmatrix}. \qquad (6)$$

The second configuration tested with the EKF also uses the velocity information provided by the GPS receiver. The model in Equation (3) was used, but the measurement vector now contains one more entry. The measurement vector is presented in Equation (7) and the input vector is the same as in Equation (5):

$$y(k) = \begin{bmatrix} P_{xGPS}(k) \\ P_{yGPS}(k) \\ V_{GPS}(k) \end{bmatrix}. \qquad (7)$$

The third configuration tested involves the driving angle measurement and the Ackerman model. The state transition equation was shown in Equation (4). The measurement vector contains the GPS position measurements, as in Equation (6). The input vector brings data from the IMU acceleration in the x direction and the driving angle, which is converted to the virtual wheel angle by the calibration function.
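For reference, the three configurations can be summarized in code as different measurement row sets over the state [Px, Py, V, θ]; the structure below is an illustration of Equations (5)-(7), with names invented for the example (all three configurations also use the IMU acceleration Ax as an input).

```cpp
#include <array>
#include <vector>

// Measurement rows (h vectors over the state [px, py, v, theta]) and the
// additional inputs used by each configuration of Section 3.3.
struct Configuration {
    std::vector<std::array<double, 4>> measurementRows;  // one row per measured quantity
    bool usesAngularRateInput;    // omega from the IMU (configurations 1 and 2)
    bool usesSteeringAngleInput;  // virtual wheel angle phi (configuration 3)
};

Configuration make_configuration(int id)
{
    const std::array<double, 4> hPx = {1, 0, 0, 0};   // GPS position x
    const std::array<double, 4> hPy = {0, 1, 0, 0};   // GPS position y
    const std::array<double, 4> hV  = {0, 0, 1, 0};   // GPS velocity (configuration 2 only)

    switch (id) {
        case 1:  return {{hPx, hPy},     true,  false};  // Eq. (5) inputs, Eq. (6) measurements
        case 2:  return {{hPx, hPy, hV}, true,  false};  // Eq. (5) inputs, Eq. (7) measurements
        default: return {{hPx, hPy},     false, true };  // Ackerman model, Eq. (6) measurements
    }
}
```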
3.4. Localization Results

The three configurations were tested offline by reconstructing the vehicle trajectory, which was performed as a closed loop on the UFMG campus. The position (x, y) was computed for all the configurations mentioned and their efficiency was compared. To avoid GPS offset errors, the first GPS measurement was considered to be (0, 0).
The tests aim to answer two questions: Can the algorithms provide good information to the system between two successive GPS measurements? For how long can the algorithms estimate the vehicle trajectory with an uncertainty smaller than that associated with the GPS measurements, in the event of GPS loss?
To evaluate the performance of the algorithms, the trace of the covariance matrix associated with the estimated position was computed. This was compared to the trace of the GPS measurement noise covariance matrix. The covariance matrix of the estimated position error, when compared to that of the GPS, indicates the degree of confidence in the estimated position.
For all the configurations, ellipses were plotted which represent the covariance of the GPS position measurement error with a confidence level of 95%. An estimated trajectory is plotted in Figure 7 to illustrate the general procedure. If the computed trajectory is inside the ellipses, it means that this estimation cannot be discarded as false. The '+' points are the GPS measurements and the full line is the estimated trajectory.

Fig. 7. Estimation, Ellipse and Measurement of GPS (Configuration 1).


The first configuration produced a covariance matrix trace smaller than the trace of the associated GPS measurement error covariance matrix while the GPS signal was considered to be available. The GPS measurements were used with an update rate of 1 Hz. The result of the first configuration showed that the IMU measurements suffer from significant bias. Therefore, bias estimation became necessary to improve the quality of the estimates. To solve this problem, two more pseudo-states, representing the bias terms, were included in the state vector, as shown in Equation (8):

$$x(k) = \begin{bmatrix} P_x(k) \\ P_y(k) \\ V(k) \\ \theta(k) \\ b_{A_x}(k) \\ b_{\omega}(k) \end{bmatrix} = \begin{bmatrix} P_x(k-1) + V(k-1)\cos(\theta(k-1))\,\Delta t \\ P_y(k-1) + V(k-1)\sin(\theta(k-1))\,\Delta t \\ V(k-1) + A_x(k-1)\,\Delta t \\ \theta(k-1) + \omega\,\Delta t \\ b_{A_x}(k-1) \\ b_{\omega}(k-1) \end{bmatrix}. \qquad (8)$$
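Equation (8) only adds the two pseudo-states; the paper does not spell out how they enter the kinematics. In the illustrative C++ sketch below the estimated biases are subtracted from the raw IMU readings before they are used, which is the usual way such pseudo-states are exploited; treat that detail, and all identifiers, as assumptions.

```cpp
#include <cmath>

// State of Equation (8): kinematic states plus the two bias pseudo-states.
struct BiasAugmentedState {
    double px, py, v, theta;
    double biasAx;      // accelerometer bias estimate
    double biasOmega;   // angular rate (gyro) bias estimate
};

// Prediction step for the bias-augmented model. The raw IMU readings are
// corrected with the current bias estimates before entering the kinematics
// (an assumption, see the text above); the biases themselves are propagated
// as constants, as in Equation (8).
BiasAugmentedState predict_with_bias(const BiasAugmentedState& x,
                                     double axRaw, double omegaRaw, double dt)
{
    const double ax    = axRaw    - x.biasAx;
    const double omega = omegaRaw - x.biasOmega;

    BiasAugmentedState n = x;              // bias states copied unchanged
    n.px    = x.px + x.v * std::cos(x.theta) * dt;
    n.py    = x.py + x.v * std::sin(x.theta) * dt;
    n.v     = x.v + ax * dt;
    n.theta = x.theta + omega * dt;
    return n;
}
```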
A typical result obtained for the trace of the estimation covariance matrix with the first configuration and bias estimation, considering full availability of the GPS signal, is shown in Figure 8.

Fig. 8. Trace of the covariance matrix associated with the position estimation using the first configuration.

The second configuration used the same dynamic model, but augmented the measurement model with another equation corresponding to the GPS velocity measurement. It was verified that the addition of this data decreased the covariance matrix trace, as shown in Figure 9.
The Ackerman model, used as a third possible configuration, produced the smallest covariance matrix trace among the configurations analyzed in this work, as shown in Figure 10.
The second aspect to be evaluated in this work is how long the algorithm can maintain a small uncertainty, when compared to the GPS uncertainty, in the event of GPS loss. To investigate this issue, the GPS measurement data was suppressed for an arbitrary amount of time, which was increased until the trace of the estimation error covariance matrix became greater than the trace of the GPS error covariance matrix.
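A compact sketch of how such an outage test could be scored is given below; the data structure and names are hypothetical, and only the comparison criterion (trace of the estimated position covariance against the trace of the GPS covariance) follows the text above.

```cpp
#include <cstddef>
#include <vector>

// One recorded filter step of an outage run (hypothetical logging structure).
struct FilterStep {
    double tracePositionCov;   // trace of the 2x2 position block of P after this step
};

// Counts for how many steps the estimated position uncertainty stays below
// the GPS uncertainty once the GPS measurements are suppressed at step
// `outageStart`. `traceGpsCov` is the trace of the GPS measurement error
// covariance matrix R used as the reference level.
std::size_t steps_below_gps_uncertainty(const std::vector<FilterStep>& runWithoutGps,
                                        std::size_t outageStart,
                                        double traceGpsCov)
{
    std::size_t count = 0;
    for (std::size_t k = outageStart; k < runWithoutGps.size(); ++k) {
        if (runWithoutGps[k].tracePositionCov >= traceGpsCov) break;
        ++count;
    }
    return count;
}
```

Multiplying the returned count by the prediction period gives an outage tolerance in seconds, which is how the per-configuration figures listed next can be read.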
Fig. 9. Trace of the covariance matrix associated with the position estimation using the second configuration.

Fig. 10. Trace of the covariance matrix associated with the position estimation using the third configuration.

The obtained results have shown that:

• configuration 1 was able to endure a GPS failure for less than one second;

• configuration 2 was able to withstand a GPS failure for up to 2 seconds;

• and configuration 3 was able to remain with small uncertainty for up to 13 seconds.

4. DISCUSSION AND CONCLUSIONS

This paper presented initial results of a research project which aims to develop an autonomous car. The hardware described here is the first step in designing the car's actuation and control systems. In order to close the control system loop in the near future, a localization system has been developed and some sensor fusion algorithms, based on different sets of sensors, were tested and evaluated.
The tests have shown that the best configuration in terms of performance in estimating the position of the vehicle was the third one. For this configuration, in which acceleration, vehicle steering angle, and GPS position were used, the algorithm was able to keep producing reliable estimates, after GPS loss, for a longer period.
Future work includes the analysis of the dynamic model of the vehicle and the addition of other sensors, mainly velocity sensors, to make the estimation more independent of the GPS measurements and more robust to GPS loss.

ACKNOWLEDGMENTS
This work is supported by FAPEMIG (Fundação de Amparo à Pesquisa do Estado de Minas Gerais). The authors also thank CNPq and CAPES for their scholarships.

REFERENCES
[1] Michael Montemerlo, Jan Becker, Suhrid Bhat, Hendrik
Dahlkamp, Dmitri Dolgov, Scott Ettinger, Dirk
Haehnel, Tim Hilden, Gabe Hoffmann, Burkhard
Huhnke, Doug Johnston, Stefan Klumpp, Dirk Langer,
Anthony Levandowski, Jesse Levinson, Julien Marcil,
David Orenstein, Johannes Paefgen, Isaac Penny, Anna
Petrovskaya, Mike Pflueger, Ganymed Stanek, David
Stavens, Antone Vogt, Sebastian Thrun, “Junior: The
Stanford entry in the Urban Challenge”, Journal of
Field Robotics Volume 25 Issue 9, pp. 569-597,
September 2008.

[2] S. Thrun, W. Burgard, D. Fox, “Probabilistic Robotics”, The MIT Press, 2005.

[3] F. Caron, E. Duflos, D. Pomorski, and P. Vanheeghe, “GPS/IMU data fusion using multisensor Kalman filtering: introduction of contextual aspects”, Information Fusion, Volume 7, pp. 221–230, 2006.

[4] J. Zhou, H. Bolandhemmat, “Integrated INS/GPS system for an autonomous mobile vehicle”, International Conference on Mechatronics and Automation (ICMA 2007), pp. 694–699, 2007.

[5] M. A. G. Moreira, H. N. Machado, C. F. de Castro Mendonça, G. A. Pereira, “Mobile robot outdoor localization using planar beacons and visual improved odometry”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’07), pp. 2468–2473, 2007.

[6] S. Roumeliotis, G. Bekey, “An extended Kalman filter for frequent local and infrequent global sensor data fusion”, International Symposium on Intelligent Systems (SPIE), 1997.

[7] PIC18F2455/2550/4455/4550 Data Sheet. http://ww1.microchip.com/downloads/en/DeviceDoc/39632b.pdf. Accessed at 2008-10-22.

[8] LMD18245 Data Sheet. http://www.datasheetcatalog.org/datasheet/nationalsemiconductor/DS011878.PDF. Accessed at 2008-10-22.

[9] Howie Choset, Kevin M. Lynch, Seth Hutchinson, George Kantor, Wolfram Burgard, Lydia E. Kavraki, Sebastian Thrun. “Principles of Robot Motion.” The MIT Press, 2005.

[10] Luis A. Aguirre. “Introdução à Identificação de Sistemas.” 3rd ed., Ed. Campus, Belo Horizonte, 2007.
