
US 20060223637 A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2006/0223637 A1
     Rosenberg                                (43) Pub. Date: Oct. 5, 2006

(54) VIDEO GAME SYSTEM COMBINING GAMING SIMULATION WITH REMOTE ROBOT CONTROL AND REMOTE ROBOT FEEDBACK

(75) Inventor: Louis B. Rosenberg, Pismo Beach, CA (US)

    Correspondence Address:
    SINSHEIMER JUHNKE LEBENS & MCIVOR, LLP
    1010 PEACH STREET
    P.O. BOX 31
    SAN LUIS OBISPO, CA 93406 (US)

(73) Assignee: Outland Research, LLC, Pismo Beach, CA (US)

(21) Appl. No.: 11/278,120

(22) Filed: Mar. 30, 2006

Related U.S. Application Data

(60) Provisional application No. 60/666,805, filed on Mar. 31, 2005.

Publication Classification

(51) Int. Cl. G06F 9/00 (2006.01)
(52) U.S. Cl. 463/47

(57) ABSTRACT

An interactive apparatus is described comprising a portable gaming system and a mobile toy vehicle connected by a wireless communications link. The mobile toy vehicle has a drive system, a video camera, a communications link, a computer system, and vehicle control software. The gaming system comprises a visual display, a user interface, a communications link, a computer system, and gaming software. The gaming system can display the real-time, real-world images captured by the video camera mounted on the mobile toy vehicle, overlaid with simulated gaming objects and events. In this way a combined on-screen off-screen gaming experience is provided for the user that merges real-world events with simulated gaming actions. The apparatus allows for single-player and multiplayer configurations.

[Drawing sheets 1-18 (FIGS. 1-13): image content not reproduced; the recoverable labels are summarized below.]

[FIG. 1 (Sheets 1-2): block diagram of the preferred embodiment 100 — mobile toy vehicle 110 (microphone 111, collision detect 116, light detector 117, ranging 115, drive 114, position 121, light source 123, simulated weapons 125, vehicle computer 118, communications interface 127) linked by wireless link 180 and control signals 150 to the portable gaming system 130 (processor 132, gaming software 134, display 140, user controls 155), operated by user 160.]
[FIG. 2 (Sheet 3): example mobile robotic toy vehicle in wireless communication with the portable gaming system.]
[FIGS. 3a and 3b (Sheets 4-5): multi-user configurations — mobile vehicle #1 (110'), mobile vehicle #2 (110"), gaming system #1 (130'), and additional gaming systems.]
[FIG. 3C: flowchart 900 — Select Simulated Weapon 910; Aim Weapon at Vehicle 920; Weapon Fires 930; Weapon Hits 940; Simulated Explosion / Affect Functionality.]
[FIG. 3D: flowchart 1100 — Select Glue Gun 1110; Shoot Gun at Opponent 1120; Display Simulated Glue Stream 1130; 1140; 1150; Affect User Drive System.]
[FIG. 3E: flowchart 1200 — Select Blinding Light Gun 1210; Shoot Beam of Light 1220; Hits Opponent 1230; Overlay Graphical Light on User Display 1240.]
[FIG. 4: block diagram 500 — vehicle control system of vehicle 110, gaming software 134, simulated objects 510, display 140, controls 150.]
[FIG. 5: simulated functions 610 — simulated opponents, simulated objectives, simulated strategy elements 630.]
[FIGS. 6A and 6B: unaltered and altered (darkened) screen images.]
[FIG. 6C: flowchart — Raw Video Input 710; Determine Area of Modification 720; Modify Area of Video Input 730; Processed Video Input 740.]
[FIGS. 7-12: screen displays of the gaming system showing overlaid cracks, crosshairs, laser fire, weapon damage, and placed simulated images.]
[FIG. 13: screen display showing SCORE: 0, DISTANCE: 0, DAMAGE: 0.]
VIDEO GAME SYSTEM COMBINING GAMING SIMULATION WITH REMOTE ROBOT CONTROL AND REMOTE ROBOT FEEDBACK

[0001] This application claims benefit under 35 U.S.C. § 119(e) to U.S. Provisional Application No. 60/666,805, filed Mar. 31, 2005.

BACKGROUND OF THE INVENTION

[0002] 1. Field of the Invention

[0003] The invention is in the field of personal gaming systems 130 in general, and of personal gaming systems 130 that interact with mobile robotic toy devices in particular.

[0004] 2. Discussion of the Related Art

[0005] Gaming systems are a popular way for people to entertain themselves and interact with other users. An example of a gaming system is the Sony PSP (PlayStation Portable), which is handheld, weighs approximately 1 lb., has a small screen to view images, has user control buttons, and has a wireless interface. This device also communicates with other gaming systems to allow for interactive playing between two or more individuals.

[0006] Mobile toys are also well known and a popular means of entertainment. Most mobile toys consist of a remote controller to operate the toy (e.g. move the toy forward, turn it right and left, etc.). The remote controller is typically connected with a wireless connection so that the operator may stand in one place and move the toy using a control panel.

[0007] Whether implemented on a personal computer, television-based gaming console, or handheld gaming system 130, traditional video games allow users to manipulate on-screen characters and thereby engage in on-screen challenges or competitions. While such on-screen challenges or competitions are fun and engaging for users, they often pull users away from the real physical world and cause them to sit mesmerized in a single location for hours at a time, fixated upon a glowing screen. This is very different from traditional toys that allow users to engage the world around them, incorporating their physical surroundings into their creative and physically active play activities. For example, a child playing with toy blocks or toy cars or toy planes will focus upon the toys but will also incorporate their physical surroundings into their play behavior, turning their room or their house or their yard into the field of play. This offers children a more diverse and creative experience than sitting in front of a screen and engaging with a simulated world. At the same time, computer generated challenges and competitions can be rich with stimulating content that is more dynamic and inspiring than an unchanging toy car or truck or plane. What is therefore needed is a novel means of combining the dynamically engaging benefits of computer generated content with the physically engaging benefits of traditional toys.

SUMMARY

[0008] The preferred embodiment is an apparatus for user entertainment, said apparatus comprising: a plurality of mobile toy vehicles; a plurality of gaming systems; and a plurality of communication links between the mobile toy vehicles and the gaming systems.

[0009] The mobile toy vehicle further comprises: a drive system; a weapons system; a vehicle location system; a video camera; a vehicle communications link interface; a power supply; and a software configurable vehicle computer control system; wherein said software configurable vehicle computer control system operatively controls the drive system, the weapons system, the vehicle location system, the video camera, and the vehicle communications link interface; and wherein the gaming system further comprises: a screen; a user interface; and a software configurable gaming computer processor; wherein said software configurable gaming computer processor operatively controls the screen and user interface; and wherein the mobile toy communications link interface sends data to the gaming system using the communications link interface.

[0010] Also provided is a method for controlling an apparatus that entertains, said method comprising: obtaining an image from a mobile toy vehicle; transferring the image to a user game console; overlaying the image with a virtual object; and displaying the overlaid image with the virtual object on the screen.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] Preferred embodiments of the invention will be described in conjunction with the following drawings, in which:

[0012] FIG. 1 is a block diagram of the preferred embodiment of the gaming system; and

[0013] FIG. 2 is an example of the physical implementation of the gaming system as depicted in FIG. 1; and

[0014] FIG. 3a is a block diagram of a two-player system where each player has a gaming system and a mobile toy vehicle; and

[0015] FIG. 3b is a block diagram of a multiplayer system where each player has a gaming system and there is a single mobile toy vehicle; and

[0016] FIG. 4 is a block diagram of the gaming system with a simulated input module; and

[0017] FIG. 5 is a block diagram of the simulated input module; and

[0018] FIG. 6a is a picture of the screen of the gaming system where the display is unaltered; and

[0019] FIG. 6b is a picture of the screen of the gaming system where the display has been altered, in this case darkened, by the simulated inputs module; and

[0020] FIG. 6c is a flowchart showing the software process of altering the display by the simulated inputs module; and

[0021] FIG. 7 is a picture of a gaming system showing computer generated cracks added by the simulated inputs module; and

[0022] FIG. 8 is the screen display of the gaming system where the aiming system consisting of crosshairs is shown; and

[0023] FIG. 9 is the screen display of the gaming system where a simulated laser weapon has been fired at a beanbag chair in the real world; and
[0024] FIG. 10 is the screen display of the gaming system showing the virtual effects on the bean bag chair in the real world of the simulated laser beam; and

[0025] FIG. 11 is the screen display of the gaming system showing the placement of simulated images, in this instance a pyramid; and

[0026] FIG. 12 is the screen display of the gaming system showing the placement of simulated images, in this instance a barrier; and

[0027] FIG. 13 is the screen display of the gaming system showing a fuel meter and ammunition meter for the mobile toy vehicle being operated.

DETAILED DESCRIPTION

[0028] While describing the invention and its embodiments, various terms will be used for the sake of clarity. These terms are intended to not only include the recited embodiments, but also all equivalents that perform substantially the same function, in substantially the same manner, to achieve the same result.

Mobile Toy and Gaming System

[0029] Now turning to FIG. 1, a block diagram of the preferred embodiment 100 is shown and described. The apparatus of the preferred embodiment includes a mobile toy vehicle 110 equipped with a wireless communications interface 180 connected to a portable gaming system 130. A user 160 interacts with the portable gaming system 130.

a. Mobile Toy Vehicle

[0030] The mobile toy vehicle 110 is equipped with some or all of the following: a microphone 111, a video camera 112, a drive system 114, a ranging system 115, a collision detection system 116, one or more light detectors 117, a vehicle computer 118, a vibration detection system 119, a position location system 121, one or more light sources 123, simulated weapons 125, an orientation sensor 218, and a vehicle communications interface 127. Vehicle computer software 120 is loaded into internal Read Only Memory (ROM) and Random Access Memory (RAM) (both not shown).

[0031] The controlling device of the mobile toy vehicle 110 is the vehicle computer 118. The vehicle computer 118 is connected to the microphone 111 via an analog to digital converter (not shown). The vehicle computer 118 is connected to the video camera 112 either by an analog to digital converter (not shown) or by a digital interface. The drive system 114 is connected to the vehicle computer 118 using a digital to analog interface and drive circuitry. The ranging subsystem 115 is connected to the vehicle computer 118 using a digital or analog interface. The collision detection subsystem 116 is connected to the vehicle computer 118 using either an analog to digital or digital interface. The light sensor subsystem 117 is connected to the vehicle computer 118 using either a digital or analog interface. The vibration detection subsystem 119 is connected to the vehicle computer 118 using a digital or analog interface. The position location subsystem 121 is connected to the vehicle computer 118 using a digital or analog interface. The light source 123 is connected to the vehicle computer 118 using a digital or analog interface. The simulated weapons 125 are connected to the vehicle computer 118 using a digital or analog interface. A vehicle communications interface 127 supports the wireless interface 150, which is connected to the portable gaming system 130. All of these interfaces are controlled and coordinated by the vehicle software 120.

[0032] The vehicle software 120 may be implemented using any number of popular computer languages, such as C, Java, Perl, PHP, and assembly language. Executable code is loaded on the vehicle computer 118. The code may be modified during operation based on inputs and outputs from the aforementioned interfaces.

[0033] Those skilled in the art will appreciate that the individual subsystems of the mobile toy vehicle may be placed in different configurations without an appreciable change in functionality. Example embodiments of these configurations are further disclosed in this application.

[0034] The video camera 112 is affixed to the chassis such that the video camera 112 moves along with the mobile toy vehicle 110 and can capture video images in the forward direction of travel of the mobile toy vehicle 110. Alternately, the video camera may be mounted on a rotating platform to view in additional directions. Video data (not shown) from the video camera 112 affixed to the mobile toy vehicle 110 is transmitted by electronics aboard the mobile toy vehicle 110 across the wireless communication connection to the portable gaming system 130. The portable gaming system 130 receives the video data from the video camera 112 and incorporates the video data into the visual display 140.

b. Gaming System

[0035] The portable gaming system 130 is a handheld computer controlled apparatus that includes one or more computer processors 132 running gaming software 134, a visual display 140, a communications interface 145, and user-interface controls 155. The portable gaming system generally also includes an audio display system including speakers and/or headphones. The portable gaming system may also include one or more locative sensors such as a GPS position sensor and/or a magnetometer orientation sensor for determining the position and/or orientation of the gaming system with respect to the physical world.

[0036] The portable gaming system 130 may be a commercially available device, such as a PlayStation Portable by Sony, a Gameboy Advance from Nintendo, a Nintendo DS gaming system from Nintendo, or an N-Gage gaming system from Nokia. An example of a typical portable gaming system 130, a Sony PlayStation Portable, is shown in FIGS. 6-13. Alternately, the portable gaming system 130 may be a device that is dedicated for this particular application.

[0037] The gaming processor 132 provides the central control of the subsystems on the gaming console. The visual display 140 is connected to the gaming processor 132. The user-interface controls 155 are connected to the gaming processor 132. The communications interface 145 is connected to the gaming processor 132 and the communications link 180.

[0038] The gaming software 134 may be implemented using any number of popular computer languages, such as C, Java, Perl, PHP, and assembly language. The code may also be generated from user libraries specially provided by the manufacturer of the gaming device. Executable code is loaded on the gaming processor 132. The code may be modified during operation based on inputs and outputs from the aforementioned interfaces.
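By way of illustration only, the wiring described in paragraphs [0030]-[0032] suggests a simple control loop running on the vehicle computer 118: read the attached subsystems, send telemetry up the wireless link 180, and apply any control signal 150 that has arrived from the portable gaming system 130. The following is a minimal sketch, not the disclosed implementation; the transport, the JSON packet format, and helper names such as read_sensors and apply_drive are assumptions introduced here.

```python
# Minimal sketch of a vehicle-side loop for the vehicle computer 118.
# All hardware accessors are stand-ins; a real vehicle would read its
# analog/digital interfaces here (ranging 115, collision 116, light 117, ...).
import json
import time

def read_sensors():
    # Placeholder values standing in for the subsystem reads.
    return {
        "range_cm": 120.0,        # ranging subsystem 115
        "collision": False,       # collision detection 116 (binary form)
        "light_level": 0.02,      # light detector 117
        "vibration": 0.1,         # vibration sensor 119
        "position": (0.0, 0.0),   # position location 121
    }

def apply_drive(speed, turn):
    # Stand-in for the digital-to-analog drive interface of drive system 114.
    print(f"drive: speed={speed:+.2f} turn={turn:+.2f}")

def vehicle_step(link_rx, link_tx):
    """One iteration: send telemetry toward the gaming system 130 and apply
    any control signal 150 that has arrived over the wireless link 180."""
    link_tx(json.dumps(read_sensors()))
    packet = link_rx()
    if packet:
        cmd = json.loads(packet)
        apply_drive(cmd.get("speed", 0.0), cmd.get("turn", 0.0))

if __name__ == "__main__":
    inbound = ['{"speed": 0.5, "turn": -0.2}', None]
    vehicle_step(lambda: inbound.pop(0) if inbound else None, print)
    time.sleep(0.05)  # a real loop would run at a fixed rate
```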
c. Interaction of Gaming System and Mobile Toy Vehicle

[0039] The portable gaming system 130 receives and processes video data received from the video camera 112 located on the mobile toy vehicle 110 and updates the gaming software 134.

[0040] The portable gaming system 130 sends control signals 150 to the mobile toy vehicle 110, the control signals 150 being used by the mobile toy vehicle 110 to control the motion of the vehicle about the user 160 physical space.

[0041] The control signals 150 are based in whole or in part upon the user 160 interaction with the manual user-interface controls 155 present upon the portable gaming system 130. For example, in an embodiment the portable gaming system 130 sends control signals 150 to the mobile toy vehicle 110, the control signals 150 based in part upon how the user 160 manipulates the manual user-interface controls 155 that are incorporated into the portable gaming system 130, the control signals 150 controlling the direction and speed by which the mobile toy vehicle 110 moves within the local physical environment of the user. As the mobile toy vehicle 110 moves under the control of the control signals 150, updated video images from the camera upon the mobile toy vehicle 110 are sent back to the portable gaming system 130 and displayed to the user 160 along with other gaming content. In this way the game player can see first-person images sent back from the mobile toy vehicle 110, similar to the images one would see if he or she were scaled to the size of the mobile toy vehicle 110 and riding upon it. The images are a real-time changing perspective view of the local physical space of the user 160 that is incorporated into the displayed gaming action upon the portable gaming system 130. The local view is merged with computer generated gaming content, allowing the user 160 to play not just on a screen, but within his or her view of the physical local space.

[0042] A real-time camera image is one that seems to the user to be substantially reflecting the present conditions of the remote mobile toy vehicle. There will generally be a small time delay due to image capture and image communication processes, but this delay is small compared to the time frames required by the human perceptual system.

[0043] The mobile toy vehicle 110 is connected to the portable gaming system 130 using the wireless communications interface 180. The gaming software 134 controls the computer processors 132 that are connected to the visual display 140.

[0044] The portable gaming system 130 communicates with the mobile toy vehicle 110 over the wireless communications interface 180.

[0045] In addition to controlling the speed and direction of the mobile toy vehicle 110, the control signals from the portable gaming system 130 can optionally control the orientation of the camera relative to the chassis of the mobile toy vehicle 110, the control signals being sent to the mobile toy vehicle 110 from the portable gaming system 130 in response to user 160 manipulations of the manual user-interface controls upon the portable gaming system 130. Control of the relative orientation of the camera with respect to the chassis of the mobile toy vehicle 110 can be achieved in some embodiments by mounting the camera to the chassis of the vehicle through a motor controlled gimbal or turret. In addition to controlling the relative orientation of the camera with respect to the chassis of the mobile toy vehicle 110, the control signals from the portable gaming system 130 can optionally control the zoom and focus of the camera, the control signals being sent to the mobile toy vehicle 110 from the portable gaming system 130 in response to user 160 manipulations of the manual user-interface controls upon the portable gaming system 130.

[0046] Other sensors can be optionally mounted upon the mobile toy vehicle 110. Data from these sensors are sent back to the portable gaming system 130 over the wireless communication interface 180, the data from the sensors being used by the game processor 132 within the portable gaming system 130 to update or modify the gaming software 134. For example, collision sensors 116 can be mounted upon the mobile toy vehicle 110, the collision sensors 116 detecting if the vehicle collides with a physical object within its local space. The collision sensors 116 can be binary, indicating yes/no if a collision has occurred. The collision sensors 116 can be analog, indicating not just if a collision has occurred but also a magnitude or direction for the collision.

[0047] A ranging sensor 115 such as an ultrasound transducer can be mounted upon the mobile toy vehicle 110, the ranging sensor 115 detecting the distance of objects from the mobile toy vehicle 110, the vehicle computer 118 within the mobile toy vehicle 110 sending data representative of the distance back to the portable gaming system 130, the distance information being used by the processor 132 of the portable gaming system 130 to update the gaming software 134.

[0048] A light detector 117 (visible, UV, or infrared) can be mounted upon the mobile toy vehicle 110, the light detector 117 detecting if a light of a particular frequency or modulation is shining upon the mobile toy vehicle 110, the vehicle computer 118 located in the mobile toy vehicle 110 sending data representative of the output of the light sensor back to the portable gaming system 130, the sensor information being used by the processor of the portable gaming system 130 to update the gaming software 134.

[0049] A vibration sensor 119 (such as an accelerometer) can be mounted upon the mobile toy vehicle 110, the vibration sensor 119 detecting a level of vibration experienced by the mobile toy vehicle 110 as it moves over a particular terrain, the vehicle computer 118 within the mobile toy vehicle 110 sending data representative of the output of the vibration sensor back to the portable gaming system 130, the sensor information being used by the processor of the portable gaming system 130 to update the gaming software 134.

[0050] Also a microphone 111 can be mounted upon the mobile toy vehicle 110, the microphone detecting sound signals local to the mobile toy vehicle 110 as it moves about a particular room or environment, the electronics within the mobile toy vehicle 110 sending data representative of the sound signals back to the portable gaming system 130, the sound information being displayed to the user 160 through the portable gaming system 130 along with other processor generated sounds relating to the gaming software 134.
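Paragraphs [0046]-[0050] describe sensor data flowing back over the wireless link 180 and being used by the game processor 132 to update the gaming software 134. The sketch below illustrates, under assumed field names and a hypothetical GameState structure that are not part of the disclosure, how such telemetry might be folded into a game state, including the binary versus analog collision sensor distinction.

```python
# Sketch: folding vehicle telemetry into game state on the gaming-system side.
# Message fields and the GameState layout are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class GameState:
    rumble: float = 0.0          # derived from vibration sensor 119
    obstacle_near: bool = False  # derived from ranging sensor 115
    last_collision: float = 0.0  # magnitude, if the collision sensor 116 is analog

def update_from_telemetry(state: GameState, msg: dict) -> GameState:
    # Binary collision sensors report yes/no; analog ones also give a magnitude.
    collision = msg.get("collision", False)
    if isinstance(collision, bool):
        state.last_collision = 1.0 if collision else 0.0
    else:
        state.last_collision = float(collision)
    state.obstacle_near = msg.get("range_cm", 1e9) < 30.0   # assumed threshold
    state.rumble = min(1.0, msg.get("vibration", 0.0))
    return state

print(update_from_telemetry(GameState(),
                            {"collision": 0.7, "range_cm": 25.0, "vibration": 0.3}))
```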
[0051] Also position or motion sensors 121 can be mounted upon the mobile toy vehicle 110, the position or motion sensors 121 detecting the relative or absolute distance traveled by the vehicle in a particular direction within the real physical space of the user, the electronics within the mobile toy vehicle 110 sending data representative of the distance or motion back to the portable gaming system 130, the processor 132 upon the portable gaming system 130 updating the gaming action based in part upon the distance or motion data. The position or motion sensors 121 in some embodiments can be relative motion sensors that track the direction and spin of the wheels of the vehicle, thereby tracking the relative motion of the vehicle over time. The position or motion sensors 121 can in other embodiments be absolute position sensors, such as GPS sensors, that track the absolute position of the vehicle within the space of the user 160 during operation of the gaming software 134.

[0052] Also one or more light sources 123 can be mounted upon the mobile toy vehicle 110, the light source sending out a light beam as the vehicle moves about a particular room or environment. The light sources may be, for example, visible light sources, UV light sources, or IR light sources, and may optionally be modulated with a carrier frequency. The gaming software 134 enables the light source 123 within the mobile toy vehicle 110.

Example Embodiment of a Mobile Robotic Toy Vehicle

[0053] Now referring to FIG. 2, which shows an example of a simple mobile toy vehicle 110 with the top cover removed, the mobile toy vehicle 110 being in wireless communication with a portable gaming system 130. As shown, the mobile toy vehicle 110 is comprised of many components including but not limited to a vehicle chassis with wheels and a suspension, a drive motor, control electronics, communication electronics, an antenna for bi-directional wireless communication with the portable gaming system 130, wheels that can be steered under electronic control (actuator to steer wheels not shown), bumpers with bumper sensors (bumper sensors not shown), power electronics, a battery pack, and a video camera 112. Although the example shown in FIG. 2 shows the camera rigidly attached to the frame of the vehicle, other embodiments include additional actuators that allow the camera to change its orientation under electronic control with respect to the frame of the vehicle.

[0054] Although the example shown in FIG. 2 shows a single drive motor, other embodiments may include multiple drive motors, each of the drive motors being selectively activated or deactivated by on-board electronics in response to control signals 150 received from the portable gaming system 130 and in coordination with the game software 134.

[0055] Although the example shown in FIG. 2 shows a single camera, multiple cameras are used in other embodiments. Not shown in FIG. 2 are other sensors and actuators that may be included in various embodiments of the mobile toy vehicle 110 such as, but not limited to, light sensors 117, microphones 111, speakers, robotic grippers, robotic arm effectors, electromagnets, accelerometers, tilt sensors, pressure sensors, force sensors, optical encoders to track wheel motion, sensors to track steering angle, GPS sensors to track vehicle location, ultrasound transducers to do spatial ranging of objects in the environment, stereo camera systems to provide 3D visual images or ranging data, reflective sensors to identify the surface characteristics of the floor or ground, reflecting sensors for tracking lines drawn or tape laid upon the floor, IR detectors, UV detectors, or vibration sensors.

[0056] Also not shown, but optionally included in the mobile toy vehicle 110, is an electronically controllable weapon turret. In some embodiments the electronically controllable weapon turret includes a video camera affixed such that the orientation of the weapon turret is the same as the orientation of the camera aim, giving the user who is viewing the camera image upon his portable gaming system 130 a first person view of what the weapon turret is aimed at. In addition, a light emitter can be included upon the weapon turret such that a light (constant or modulated) is shined in the direction that the turret is pointed when a simulated weapon is fired, the light falling upon a light sensor of an opponent vehicle when the turret is appropriately aimed at the opponent mobile robotic vehicle. In this way a weapon's-fire hit can be determined (as described elsewhere in this document) from one vehicle to another and reported to one or more portable gaming systems 130 over the bi-directional communication links. Also not included in FIG. 2, but optionally included in some embodiments of the mobile toy vehicle 110, is a light source 123 for illuminating dark spaces, the headlights being activated or deactivated by on-board electronics in response to control signals 150 received from the portable gaming system 130.

[0057] In addition to the portable gaming system 130 running gaming software 134 and the mobile toy vehicle 110 as described throughout this document, other supplemental hardware can be used within the real space to support gaming action. For example, physical targets, beacons, or barriers can be placed about a real physical space to enhance game play. For example, a physical target can be an object of a particular shape or color that is placed within the physical playing space and is detected by sensors upon the mobile toy vehicle 110. Detection can be performed using video image data processed by image processing routines running upon the portable gaming system 130. Detection can also be performed using emitter/detector pairs such that an electromagnetic emitter is affixed to the physical target and is detected by appropriate sensors upon the mobile toy vehicle 110. In one embodiment the emitter is an infrared light source such as an LED that is modulated to vary its intensity at a particular frequency such as 200 Hz. The detector is an infrared light sensor affixed to the mobile toy vehicle 110 such that it detects infrared light that is directionally in front of the vehicle. In this way the vehicle can move about, varying its position and orientation under the control of the user as moderated by the intervening game software upon the portable gaming system 130, thereby searching for an infrared light signal that matches the characteristic 200 Hz modulation frequency. A variety of different frequencies can be used upon multiple different objects within the physical space such that the sensor can distinguish between the multiple different objects. In addition, targets, beacons, and barriers can be used to guide a user, or limit a user, within a particular playing space.
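Paragraph [0057] identifies a target by an emitter modulated at a characteristic frequency such as 200 Hz. One conventional way to test a stream of light-sensor samples for a specific modulation frequency is the Goertzel algorithm; the sketch below is an illustration under an assumed 2 kHz sample rate and a synthetic sensor trace, not the detector disclosed in the patent.

```python
# Sketch: testing a light-sensor trace for a 200 Hz modulation (Goertzel algorithm).
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Return the relative power of target_hz within the sample window."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Example: a trace sampled at 2 kHz containing a 200 Hz flicker plus a DC offset.
rate = 2000
trace = [0.5 + 0.4 * math.sin(2 * math.pi * 200 * t / rate) for t in range(400)]
detected = goertzel_power(trace, rate, 200) > 10 * goertzel_power(trace, rate, 315)
print("200 Hz emitter detected:", detected)
```

Using several such tests at different frequencies would let the vehicle distinguish multiple tagged objects, in the spirit of the multi-frequency scheme the paragraph describes.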
[0058] In addition to targets, beacons, and barriers, other vehicles can be detected using the emitter/detector pair method disclosed herein. For example, if a plurality of mobile toy vehicles 110 were used in the same physical space as part of the same game action, each could be affixed with a light source 123 emitter (ideally on top, such that it is visible from all directions) and a light sensor 117 (ideally in front, such that it can detect emitters that are located in front of it). Using the sensor, each mobile toy vehicle 110 can thereby sense the presence of others within the space. By using a different emission modulation frequency for each of the plurality of mobile toy vehicles 110, each can be distinguished. In this way each player's vehicle can sense the presence of others, even, for example, when playing in a dark or dim playing space, or even, depending upon the form of emission, when there are physical obstructions that block optical line of sight between users. In addition, based upon the strength of the signal received by the sensor from the emitter, the software running upon the portable gaming system 130 of a particular user can infer the distance to various targets. Such distance information can be displayed graphically upon the screen of the portable gaming system 130, overlaid upon the real video feedback from the mobile toy vehicle 110.

Other Embodiments of the Toy Vehicle

[0059] It should be noted that the toy vehicle need not be in the literal form factor of a car or truck, including for example other mobile robot form factors. In addition, the toy vehicle need not be ground-based, including for example a toy plane, a toy submarine, or a toy boat.

Multiple User Play

[0060] Now referring to FIG. 3a and FIG. 3b, which depict various embodiments of multi-user systems.

[0061] In FIG. 3a, a system diagram 300 is shown of a two player system where users 160', 160" each have mobile toy vehicles 110', 110" connected to their own portable gaming systems 130', 130". In this example two users, each controlling their own mobile toy vehicle 110', 110" through their own portable gaming system 130', 130", can be present in the same local space and can play games that are responsive to sensor data from both mobile toy vehicles 110', 110". In the preferred embodiment the portable gaming systems 130 of the two users are coordinated through an inter-game communication link 190. This allows the game software (not shown) to be coordinated between both portable gaming systems 130', 130" and between the two users 160', 160". The two users of the two portable gaming systems 130', 130" can thereby engage in a shared gaming experience, the shared gaming experience dependent not just upon the processing of each of their portable gaming systems 130', 130" but also dependent upon the motions and sensing of each of their mobile toy vehicles 110. This becomes particularly interesting because the first player can see the second player's mobile toy vehicle 110" as captured by the video camera (not shown) mounted upon the first player's mobile toy vehicle 110' and displayed by the first player's portable gaming system 130'. Similarly the second player can see the first player's mobile toy vehicle 110' as captured by the camera mounted upon the second player's mobile toy vehicle 110" and displayed by the second player's portable gaming system 130". In this way the two users can control their mobile toy vehicles 110', 110" to track, follow, compete, fight, or otherwise interact as moderated by the displayed gaming action upon their portable gaming systems 130', 130".

[0062] FIG. 3b depicts an alternate embodiment of the multiplayer configuration, a system 400, where three users 160', 160", 160"' each operate a corresponding gaming system 130', 130", 130"' that is connected over the corresponding wireless links 180', 180", 180"' to a single mobile toy vehicle 110'. In this scenario the three users 160', 160", and 160"', via game software (not shown) in each gaming system 130', 130", and 130"', engage in shared control of mobile vehicle #1. The shared control may be performed sequentially, each user taking turns controlling the vehicle. The shared control may be performed simultaneously, each user controlling a different feature or function of the mobile vehicle. The shared control may also be collaborative, the plurality of users jointly controlling the mobile robot through a merging of their respective control signals. This may be performed, for example, by averaging the control signals received from the plurality of users when controlling mobile vehicle actions through their gaming systems.

[0063] In another embodiment, the system can be designed to support a larger number of users, each with their own gaming system 130 and their own mobile toy vehicle 110. In addition, the mobile toy vehicles 110 need not be identical in form or function.

User to User Interaction

a. Simulated Weapons

[0064] Referring now to FIG. 3c, a flowchart 900 depicts the process of selecting and firing simulated weapons 125.

[0065] As shown, a simulated weapon is selected 910 for use by the mobile toy vehicle 110. The weapon can be aimed 920 at a vehicle in preparation of "firing upon" 930 the other user. A simulated weapon 125 may be, for example, a light beam 123 that selectively shines from one vehicle in a particular direction based upon the position and orientation of the vehicle and control signals 150 from the users 160', 160" and their respective gaming systems 130', 130", the control signals being generated in part based upon the users' 160', 160" manipulation of the manual user-interface controls 155', 155" upon the portable gaming systems 130', 130".
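Paragraph [0062] mentions collaborative shared control in which several users' control signals are merged, for example by averaging. The following is a small sketch of that merging step only; the command fields and the rule that any user may fire are assumptions added for illustration, not part of the disclosure.

```python
# Sketch: merging control signals 150 from several gaming systems 130', 130", 130"'
# into one command for the shared mobile toy vehicle 110'. Fields are assumed.
def merge_controls(commands):
    """Average speed/turn across users; let any user trigger the fire action."""
    if not commands:
        return {"speed": 0.0, "turn": 0.0, "fire": False}
    n = len(commands)
    return {
        "speed": sum(c.get("speed", 0.0) for c in commands) / n,
        "turn": sum(c.get("turn", 0.0) for c in commands) / n,
        "fire": any(c.get("fire", False) for c in commands),
    }

print(merge_controls([
    {"speed": 1.0, "turn": -0.5},                # user 160'
    {"speed": 0.5, "turn": 0.5, "fire": True},   # user 160"
    {"speed": 0.0, "turn": 0.0},                 # user 160"'
]))
```

Sequential or per-function shared control, also described in [0062], would replace this averaging step with a turn-taking or field-partitioning rule.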
[0066] Whether or not the simulated weapon 125 hits 940 the other of the two mobile toy vehicles 110', 110" is determined by light detectors 117 upon one or both of the mobile toy vehicles 110', 110". For example, in one embodiment the light detector 117 upon a mobile toy vehicle 110 is used to determine if that vehicle has been hit by a simulated weapon represented by a beam of light shot by another mobile toy vehicle 110. If a hit was determined (as a result of the light detector 117 triggering, for example, above a certain threshold or with a certain modulation), data is sent to the gaming systems 130', 130" of one or both users and the game software 134', 134" is updated based upon the data received from the mobile toy vehicles 110', 110". The updating of the game software 134', 134" can include, for example, the portable gaming system 130', 130" of one or both users displaying a simulated explosion image overlaid upon the camera image that is being displayed upon the screen of the gaming system 130', 130" (or systems). The updating of the game software 134', 134" can also include, for example, the portable gaming system 130', 130" of one or both users 160', 160" playing a simulated explosion sound 950 upon the portable gaming system 130', 130". The updating of the game software 134 can also include, for example, user scores 960 being updated upon the portable gaming systems 130', 130". The updating of the game software 134 can also include the computation or display of simulated damage upon the portable gaming systems 130', 130", the simulated damage creating a condition of hampered functionality 970 of the mobile toy vehicle.

[0067] For example, if a player's vehicle has suffered simulated damage (as determined by the software running upon one or more portable gaming systems 130), that vehicle can be imposed with hampered functionality 970. The hampered functionality 970 could limit the user's ability to control his or her mobile toy vehicle 110 through the control signals 150 being sent from his or her portable gaming system 130 in response to the user's manipulation of the manual user-interface controls upon his or her portable gaming system 130. In this way the game software can impact the real-world control of the physical toy that is present in the user's physical space, merging the on-screen and off-screen play action.

[0068] If a user's vehicle has suffered hampered functionality 970 as determined by the gaming software 134 running upon that user's portable gaming system 130, the control signals sent to that user's mobile toy vehicle 110 can be limited or modified such that the vehicle has reduced turning capability, reduced speed capability, or other reduced control capability.

[0069] In addition, if a user's vehicle has suffered hampered functionality 970 as determined by the gaming software 134 running upon that user's portable gaming system 130, the display of sensor data received from that user's mobile toy vehicle 110 can be limited or modified such that the vehicle has reduced sensor feedback capability for a period of time as displayed to the user 160 through his or her portable gaming system 130. The reduced sensor feedback capability can include, for example, reduced video display 140 feedback fidelity, reduced microphone 111 feedback display fidelity, eliminated camera 112 feedback display, eliminated microphone 111 feedback display, reduced or eliminated distance sensor 115 capability, reduced or eliminated collision sensor 116 capability, or reduced or eliminated vibration sensor 119 capability.

[0070] If a user's vehicle has suffered hampered functionality 970 as determined by the gaming software running upon that user's portable gaming system 130, the gaming software 134 can reduce or eliminate the simulated weapon 125 capabilities of that player's vehicle for a period of time. This can be achieved by reducing in the gaming software 134 the simulated range of the vehicle's simulated weapons, reducing in software the simulated aim of the vehicle's simulated weapons 125, or eliminating the weapon capability of the vehicle altogether for a period of time.

b. Glue Gun

[0071] Referring now to FIG. 3d, a flowchart 1100 depicts the process of selecting 1110 and firing a simulated weapon 125 known as the "Glue Gun".

[0072] For example, a user 160 can select a weapon from a pool of simulated weapons 125 by using the user-interface controls 155 upon his or her portable gaming system 130. The weapon he or she chooses might be a "glue gun" 1110, which can shoot a simulated stream of glue 1120 at an opponent. This may cause a graphical display of a glue stream being overlaid upon the real video captured from that user's mobile toy vehicle 110. Depending upon sensor data from the mobile toy vehicle 110, it may be determined in software if the glue stream hit the opponent. If the opponent was hit 1140, the simulated glue weapon causes the vehicle of the opponent to function as if it were stuck in glue, using the methods described above.

[0073] For example, the user 160 who is controlling the vehicle that was hit by the simulated glue weapon may only be able to move his or her mobile toy vehicle 110 at reduced speed 1150 and in reduced directions until that vehicle has moved a sufficient distance as to pull free of the simulated glue (as monitored by the gaming software running upon one or more portable gaming systems 130). In this way simulated computer generated effects can be merged with physical toy action to create a rich on-screen off-screen gaming experience.

[0074] In an alternate embodiment, the mobile toy vehicle that fires the simulated weapon includes a light sensor or other emission detector that is aimed in the direction of the mock weapon (i.e. in the direction of a mock gun turret upon the toy vehicle). The opposing vehicle includes a light emitter (or other emitter compatible with the emission detector) upon one or more outer surfaces of the vehicle. In such a configuration the system can determine if the mock weapon is aimed at the opposing vehicle when the light sensor (or other emission detector) detects the presence of the light emitter (or other compatible emitter) in its line of sight.

c. Blinding Light Gun

[0075] Referring now to FIG. 3e, a flowchart 1200 depicts the process of selecting 1210 and firing a simulated weapon 125 known as the "Blinding Light Gun".

[0076] With respect to the example above, the user 160 might choose other weapons through the user 160 interface upon the portable gaming system 130. He or she might choose a "blinding light gun" that shoots 1220 a simulated beam of bright light at an opponent. This may cause a graphical display of a bright beam of light being overlaid upon the real video captured from that user's mobile toy vehicle 110. Depending upon sensor data from the mobile toy vehicle 110, it may be determined in software if the blinding light beam hit the opponent who was aimed at. If the opponent was hit 1230, the simulated blinding light weapon causes the visual feedback displayed to the player who is controlling that vehicle to be significantly reduced or eliminated altogether.

[0077] For example, the player's video feedback 1240 from the camera on his or her vehicle could turn bright white for a period of time, effectively blinding the user 160 of his or her visual camera feedback for that period of time. If the light beam was not a direct hit, only a portion of the user's visual display of camera feedback might turn bright white. Alternatively, instead of that user's camera feedback display being obscured by the computer generated image of bright white, the camera feedback might be displayed with reduced fidelity, being washed out with brightness but still partially visible (as controlled by the gaming software 134 running upon one or more portable gaming systems 130). In this way simulated computer generated effects can be merged with physical toy action to create a rich on-screen off-screen gaming experience.
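Paragraphs [0067]-[0077] describe hampered functionality 970: scaling back drive commands after simulated damage or a glue-gun hit, and washing out the camera feedback after a blinding-light hit. A compact sketch of both effects follows; the state fields, scale factors, and 8-bit grayscale frame representation are assumptions introduced for illustration.

```python
# Sketch: imposing hampered functionality 970 on outgoing control signals and
# on the displayed camera feedback. All fields and factors are illustrative.
def limit_drive(cmd, state):
    """Scale speed/turn when the vehicle is 'damaged' or stuck in simulated glue."""
    scale = 1.0
    if state.get("damage", 0.0) > 0.5:
        scale *= 0.5                      # reduced speed/turning capability
    if state.get("glued", False):
        scale *= 0.2                      # barely able to pull free of the glue
    return {"speed": cmd["speed"] * scale, "turn": cmd["turn"] * scale}

def blind_frame(frame, whiteout=0.8):
    """Wash an 8-bit grayscale frame toward white after a blinding-light hit."""
    return [[int(p + (255 - p) * whiteout) for p in row] for row in frame]

cmd = limit_drive({"speed": 1.0, "turn": 0.4}, {"damage": 0.7, "glued": True})
frame = blind_frame([[10, 120], [200, 40]])
print(cmd, frame)
```

A partial hit, as described in [0077], could apply blind_frame only to a sub-region of the frame rather than to every pixel.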
d. Weapons Cache

[0078] With respect to the simulated weaponry described above, again the simulated scenario created by the gaming software 134 can moderate the functionality of the mobile toy vehicle 110. For example, the gaming software 134 can provide limited ammunition levels for each of various weapons, and when such ammunition levels are expended the user 160 is no longer able to fire simulated weapons by commanding the mobile toy vehicle 110 through the portable gaming system 130. In this way simulated game action moderates the physical play action of the toy, again merging computer generated gaming scenarios with physical toy action to create a rich on-screen off-screen gaming experience.

e. Fuel Supply

[0079] In addition to weaponry affecting the gaming action and moderating under software control a user's ability to control his or her mobile toy vehicle 110 through the portable gaming system 130, or moderating under software control a user's feedback display from sensors aboard his or her mobile toy vehicle 110, other simulated gaming factors can influence both the control of and the displayed feedback from the mobile toy vehicle 110. For example, the gaming software running upon one or more portable gaming systems 130 can track simulated fuel usage (or simulated power usage) by the mobile toy vehicle 110 and can cause the mobile toy vehicle 110 to run out of gas (or power) when the simulated fuel or power is expended. This can be achieved by the gaming software moderating the control signals 150 from the portable gaming system 130 to the mobile toy vehicle 110 such that it ceases the ability of the vehicle to move (or reduces the ability of the vehicle to move) when the mobile toy vehicle 110 has run out of simulated fuel or simulated power. The ability to move can also be restored under software control based upon the gaming action, such as the simulated powering of solar cells or the simulated discovery of a fuel or power source. In this way simulated computer gaming action can be merged with physical toy action to create a rich on-screen off-screen gaming experience. Similarly, various functions performed by the mobile toy vehicle 110, whether real or simulated motion functions, real or simulated sensing functions, or real or simulated weapon functions, can be made to expend simulated fuel or energy at different rates. In this way the game player who is controlling the real and simulated functions of the vehicle must manage his or her usage of real and simulated functions such that fuel is not expended at a rate faster than it is found or generated within the simulated gaming scenario.

Vehicle Interaction with Simulated Objects

[0080] As described in the paragraphs above, the mobile toy vehicle 110 that is controlled by the user to engage the gaming experience has both real and simulated functionality that is depicted through the merged on-screen off-screen gaming methods. The real functions are enacted by the real-world motion and real-world sensing of the mobile toy vehicle 110 as described throughout this document. The simulated functions are imposed or overlaid upon the real world experience by the gaming software 134 running upon the portable gaming system 130. The simulated functions can moderate the real-world functions, limiting or modifying the real-world motion of the mobile toy vehicle 110 or limiting or modifying the feedback from real-world sensors upon the mobile toy vehicle 110.

[0081] Now referring to FIG. 4, a simplified block diagram of the vehicle control system of the mobile toy vehicle 110, the game software 134, the simulated inputs 510, the user display 140, and the user controls 150 is shown. The simulated inputs 510 refer to a software module that stores and maintains a list of simulated functions 610.

[0082] The game software 134 is connected to the mobile toy vehicle 110 and the simulated inputs 510. The game software 134 is also connected to the user display 140 and the user controls 150. During operation, the mobile toy vehicle 110 sends vehicle information 550 to the gaming software 134. The mobile toy vehicle 110 receives control information 540. The game software 134 sends state information 520 to, and receives simulated inputs 530 from, the simulated objects 510 module. The user interacts with the game software 134 using the user display 140 and the user controls 150. The game software also receives a camera feed from the vehicle 110 and displays it to the user upon the user display 140. The game software is generally operative to overlay graphics upon the display of said camera feed, as described elsewhere in this document, to provide a mixed on-screen off-screen gaming experience.

[0083] Now referring to FIG. 5, the simulated functions 610 also expand upon the gaming scenario, creating simulated objectives 620 and simulated strategy elements 630 such as simulated power consumption, simulated ammunition levels, simulated damage levels, simulated spatial obstacles and/or barriers, and simulated destinations that must be reached to acquire points or power or ammunition or damage repair. In addition, the simulated functions 610 can include simulated opponents 640 that are displayed as overlaid graphical elements upon or within or alongside the video feedback from the real-world cameras. In this way a user can interact with real opponents or real teammates in a computer generated gaming experience that also includes simulated opponents or simulated teammates.

[0084] Below is additional description of how simulated gaming scenarios and real-world mobile toy vehicle 110 control are merged into a combined on-screen off-screen gaming experience by the novel methods and apparatus disclosed throughout this document.

[0085] In the descriptions below the phrase "simulated vehicle" is meant to refer to the combined real-world functions and features of the mobile toy vehicle 110 together with the simulated features and functions overlaid upon the display or otherwise introduced into the control interface between the user and the mobile robot toy vehicle by the gaming software. In this way the "simulated vehicle" is what the user experiences, and it is a merger of the features and functions of both the real world robotic toy and the simulated computer gaming content.
US 2006/0223637 A1 Oct. 5, 2006

illuminated by simulated lights aboard the simulated footage. In this way simulated game action moderates the
vehicle. Similarly, simulated inclement weather conditions physical play action of the toy, again merging computer
can be represented by degrading the image quality of the generated gaming scenarios with physical toy action to
displayed camera images. This can be used, for example, to create a rich on-screen off-screen gaming experience.
represent fog, Smoke, rain, Snow, etc in the environment of Simulated Weapons
the vehicle.
0087 FIG. 6a shows raw camera footage displayed upon 0093. A method enabled within certain embodiments of
a portable gaming device as received from a camera aboard the present invention merges simulated gaming action with
a mobile robot toy vehicle over a communication link. real-world mobile robot control and feedback by overlaying
computer generated graphical images of weapon targeting,
0088 FIG. 6b shows the camera footage as modified by weapon fire, or resulting weapon damage upon the real
gaming Software such that it is darkened to represent a world visual feedback data received from the remote camera
simulated nighttime experience. aboard the mobile toy vehicle 110 to achieve a composite
image representing the computer generated gaming sce
0089. Now referring to FIG. 6ca flow chart demonstrates nario. For example, the computer generated gaming scenario
how the modification of the raw video input. The raw video might enable the simulated vehicle with weapon capabili
input 710 is sent to spatial limiting module 720. The spatial ties.
limiting module 720 determines the area of raw video input
710 that will be modified. For example, the video input 710 0094. Now referring to FIG. 8, to enable targeting of the
could be modified by gaming software Such that it is weapon within the real-world Scene a graphical image of a
darkened and limited to a small illuminated area directly in targeting crosshair is generated by the gaming Software on
front of the vehicle to represent a nighttime scene that is the portable gaming system 130 and displayed as an overlay
illuminated by simulated lights upon the remote vehicle. The upon the real world camera footage received from the
modify pixel intensity module 730 change the pixels sent mobile toy vehicle 110. As the user moves the mobile toy
from the area modification module 720 are then sent to the vehicle 110 by manipulating the buttons upon the gaming
gaming Software 134. system (for example by pressing forward, back, left, or right)
0090 There are various methods by which an image can the video image pans across the real world Scene. As the
be processed and thereby darkened, lightened, or tinted to correspond with simulated lighting conditions within the computer generated gaming scenario. As another example, the image displayed upon the portable gaming system 130 is tinted red to simulate a gaming scenario that takes place upon the surface of Mars. As another example, the image displayed upon the portable gaming system 130 is tinted blue to simulate an underwater gaming experience. In these ways the simulated game action moderates the physical play action of the toy, again merging computer generated gaming scenarios with physical toy action to create a rich on-screen gaming experience.

Simulated Terrain and Backgrounds

[0091] Another method enabled within some embodiments of the present invention merges simulated gaming action with real-world mobile robot control and feedback by merging computer generated graphical images with the real-world visual feedback data received from the remote camera aboard the mobile robot toy vehicle to achieve a composite image representing the computer generated gaming scenario. For example, the computer generated gaming scenario might be a simulated world that has been devastated by an earthquake. To achieve a composite image representing such a computer generated scenario, the display of visual feedback data from the remote camera is augmented with graphically drawn earthquake cracks in surfaces such as the ground, walls, and ceiling. FIG. 6a shows raw camera footage displayed upon a portable gaming device as received from a camera aboard a mobile robot toy vehicle over a communication link.

[0092] FIG. 7 shows the camera footage as augmented by the gaming software: graphically drawn cracks in the floor are added to represent an earthquake-ravaged gaming experience. Other simulated terrain images, background images, foreground objects, targets, opponents, or barriers can be drawn upon or otherwise merged with the real-world video imagery.

A targeting crosshair is overlaid upon the video image; as the video image moves, the cross hairs target different locations within the real world space shown in FIG. 8.

[0095] As shown in FIG. 8, the vehicle is pointed in a direction such that the targeting crosshair is aimed upon the bean bag in the far corner of the room. The user may choose to fire upon the bean bag by pressing an appropriate button upon the portable gaming system. A first button press selects an appropriate weapon from a pool of available weapons. A second button press fires the weapon at the location that was targeted by the cross hairs. Upon firing, the gaming software running upon the portable gaming system 130 generates and displays a graphical image of a laser beam overlaid upon the real-world image captured by the camera upon the mobile toy vehicle 110.

[0096] The overlaid image of the laser weapon might appear as shown in FIG. 9. This overlaid computer generated laser fire is followed by a graphical image and sound of an explosion as the weapon has its effect. When the explosion subsides, a graphical image of weapon damage is overlaid upon the real-world video image captured from the remote camera.

[0097] An example of an overlaid weapons damage image is shown in FIG. 10. In this way simulated game action moderates the physical play action of the toy, again merging computer generated gaming scenarios with physical toy action to create a rich on-screen off-screen gaming experience. For example, the firing of weapons is moderated by both the real-world position and orientation of the remote mobile toy vehicle 110 and the simulation software running upon the portable gaming system 130.

[0098] A further method by which the simulated gaming action running as software upon the portable gaming system 130 can moderate the combined on-screen off-screen experience of the user is through the maintenance and update of simulated ammunition levels. To enable such embodiments, the gaming software running upon the portable gaming system 130 stores and updates variables in memory representing one or more simulated ammunition levels, the ammunition levels indicating the quantity of, and optionally the type of, weapon ammunition stored within or otherwise currently accessible to the simulated vehicle. Based upon the state and status of the ammunition level variables, the gaming software running upon the portable gaming system 130 determines whether or not the simulated vehicle can fire a particular weapon at a particular time. If, for example, the simulated vehicle is out of ammunition for a particular weapon, the weapon will not fire when commanded to do so by the user through the user interface. In this way the firing of weapons is moderated by both the real-world position and orientation of the remote mobile toy vehicle 110 and the simulation software running upon the portable gaming system 130.
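To make the weapon-firing flow of paragraphs [0095] through [0098] concrete, the following is a minimal sketch, in Python, of how gaming software of the kind described might gate a fire command on a stored ammunition variable and then queue the laser, explosion, and damage overlays for the video feed. The class and method names (SimulatedWeaponSystem, queue contents, and so on) are illustrative assumptions and are not part of the patent disclosure.

# Hypothetical sketch: ammunition-gated weapon firing with graphical overlays.
# Names and structure are illustrative only; the patent does not specify an API.

class SimulatedWeaponSystem:
    def __init__(self):
        # One simulated ammunition level per weapon type, stored as variables
        # in the gaming system's memory (paragraph [0098]).
        self.ammo = {"laser": 12, "tomato": 30}
        self.selected = "laser"
        self.overlay_queue = []   # graphical overlays to draw over the video feed

    def select_weapon(self, name):
        # First button press: choose a weapon from the pool of available weapons.
        if name in self.ammo:
            self.selected = name

    def fire(self, crosshair_xy):
        # Second button press: fire at the location targeted by the crosshair.
        weapon = self.selected
        if self.ammo[weapon] <= 0:
            # Out of ammunition: the weapon simply does not fire (paragraph [0098]).
            return False
        self.ammo[weapon] -= 1
        # Queue the overlays that merge simulated fire with the real camera image.
        self.overlay_queue.append(("laser_beam", crosshair_xy))
        self.overlay_queue.append(("explosion", crosshair_xy))
        self.overlay_queue.append(("damage_decal", crosshair_xy))
        return True


if __name__ == "__main__":
    weapons = SimulatedWeaponSystem()
    weapons.select_weapon("laser")
    fired = weapons.fire(crosshair_xy=(160, 120))   # screen coordinates of the crosshair
    print("fired:", fired, "ammo left:", weapons.ammo["laser"])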
[0099] The weapons described above need not simulate traditional violent-style weapons. For example, weapons as envisioned by the current invention can use non-violent projectiles including but not limited to the simulated firing of tomatoes, the simulated firing of spit balls, or the simulated firing of snow balls. In addition, the methods described above for the firing of weapons can be used for other non-weapon related activities that involve targeting or firing, such as the control of simulated water spray by a simulated fire-fighting vehicle or the simulated projection of a light-beam by a spot-light wielding vehicle.

Simulated Fuel, Power, and Damage Levels

[0100] Another method enabled within certain embodiments of the present invention merges simulated gaming action with real-world mobile robot control and mobile robot feedback by moderating a user's ability to control the mobile robot toy vehicle based upon simulated fuel levels, power levels, or damage levels.

[0101] To enable such embodiments, the gaming software 134 running upon the portable gaming system 130 stores and updates variables in memory representing one or more simulated fuel levels, power levels, or damage levels associated with the simulated vehicle being controlled by the user. Based upon the state or status of the variables, the gaming software 134 running upon the portable gaming system 130 modifies how the user's 160 input (as imparted upon the manual user interface on the portable gaming system 130) is translated into control of the remote vehicle.

[0102] In some embodiments the gaming software 134 running upon the portable gaming system 130 achieves the modification of how a user's input gestures are translated into the control of the vehicle by adjusting the mapping between a particular input gesture and a resulting command signal sent from the portable gaming system 130 to the mobile toy vehicle 110. For example, when a variable stored within the portable gaming system 130 indicates that there is sufficient fuel or sufficient power stored within the simulated vehicle to power the simulated vehicle, a particular mapping is enabled between the user's input gesture (as imparted upon the manual user interface on the portable gaming system) and the motion of the vehicle. The mapping may be such that when the user presses a forward button upon the portable gaming system a control signal is sent to the mobile toy vehicle 110 causing it to move forward. The mapping may also be such that when a user presses a backward button upon the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to move backward. The mapping may also be such that when a user presses a left button on the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to turn left or veer left. The mapping may also be such that when a user presses a right button on the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to turn right or veer right. This mapping may be modified, however, using the methods disclosed herein, based upon the simulated fuel level, power level, or damage level stored as one or more variables within the portable gaming system 130. For example, if the power level or fuel level falls below some threshold value, the software running on the portable gaming system 130 may be configured to modify the mappings between button presses and the motion of the mobile toy vehicle 110 as achieved through the sending of control signals 150 from the portable gaming system 130 to the mobile toy vehicle 110. In a common embodiment, when the power level or fuel level falls below some threshold value, the mapping is modified such that reduced motion or no motion of the mobile toy vehicle 110 is produced when the user presses one or more of the buttons described above. This may be achieved in some embodiments by sending reduced motion values or zero motion values within the control signals 150 when the simulated fuel level or simulated power level falls below some threshold value (to achieve reduced motion or no motion of the real robotic toy vehicle respectively). Similarly, if the simulated damage level (as stored in one or more variables within the portable gaming system 130) rises above some threshold value, the software running on the portable gaming system 130 may be configured to modify the mappings between button presses and the motion of the mobile toy vehicle 110 as achieved through the sending of control signals 150 from the portable gaming system 130 to the mobile toy vehicle 110. In a common embodiment, when the damage level rises above some threshold value, the mapping is modified such that reduced motion or erratic motion or no motion of the mobile toy vehicle 110 is produced when the user presses one or more of the buttons described above. This may be achieved in some embodiments by sending reduced motion values or distorted motion values or zero motion values within the control signals 150 when the simulated damage level rises above some threshold value (to achieve reduced motion, erratic motion, or no motion of the real robotic toy vehicle respectively).

[0103] The example given in the paragraph above uses button presses as the means by which the user inputs manual commands for controlling the mobile toy vehicle 110 as moderated by the intervening gaming software 134. It should be noted that instead of buttons, a joystick, a trackball, a touch pad, dials, levers, triggers, sliders, and other analog or binary controls upon the portable gaming system 130 or interfaced with the portable gaming system 130 can be used. For example, a joystick could be used by the user to command a direction and speed of the mobile toy vehicle 110, a particular position of the joystick mapping to a particular direction and speed of the vehicle. However, as described above, such a mapping can be modified by the gaming software based upon simulated fuel levels, power levels, or damage levels associated with the simulated vehicle.
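As an illustration of the moderated mapping described in paragraphs [0102] and [0103], the following is a minimal Python sketch. The threshold values, scaling factors, and function names are assumptions chosen for illustration; the patent describes the behavior but not a particular implementation.

# Hypothetical sketch of paragraphs [0102]-[0103]: the mapping from an input
# gesture to the motion values placed in a control signal 150 is moderated by
# simulated fuel, power, and damage variables.  All names are illustrative.

import random

FUEL_THRESHOLD = 10.0      # assumed threshold values, not specified in the text
DAMAGE_THRESHOLD = 75.0

BUTTON_TO_MOTION = {
    "forward":  (+1.0, 0.0),   # (drive, turn)
    "backward": (-1.0, 0.0),
    "left":     (0.0, -1.0),
    "right":    (0.0, +1.0),
}

def control_signal_for(button, fuel_level, damage_level):
    """Translate a button press into the motion values sent to the vehicle."""
    drive, turn = BUTTON_TO_MOTION.get(button, (0.0, 0.0))
    if fuel_level <= 0.0:
        return (0.0, 0.0)                      # no fuel: no motion at all
    if fuel_level < FUEL_THRESHOLD:
        drive, turn = drive * 0.4, turn * 0.4  # low fuel: reduced motion values
    if damage_level > DAMAGE_THRESHOLD:
        jitter = random.uniform(0.5, 1.0)      # heavy damage: erratic, distorted motion
        drive, turn = drive * jitter, turn * random.choice([-1.0, 1.0]) * jitter
    return (drive, turn)

if __name__ == "__main__":
    print(control_signal_for("forward", fuel_level=80.0, damage_level=0.0))
    print(control_signal_for("forward", fuel_level=5.0,  damage_level=0.0))
    print(control_signal_for("left",    fuel_level=50.0, damage_level=90.0))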
[0104] FIG. 11 depicts a portable gaming system displaying live real-time video received over a communication link from a camera mounted upon a mobile robotic toy vehicle, the motion of the vehicle being controlled by the user through the manipulation of the buttons shown on the portable gaming system. Simulated objects can be placed within the gaming space as simulated graphical overlays upon the real-time video image. As shown in the figure, a pyramid is drawn as a graphical target the user has been seeking as he drove the vehicle around his house; upon finding the target in this room it is drawn as shown. Also shown is graphical gaming status information displayed as overlaid upon the real-time video from the camera on the mobile robotic toy vehicle. In this example the graphical gaming status information includes current fuel level and current score information.

[0105] Simulated damage may be incurred as a result of collisions with simulated objects such as the overlaid graphical object shown in the figure. This object is drawn as a pyramid, although one will appreciate that a wide variety of simulated graphical elements may be overlaid upon the real-world imagery supplied by the camera feed. Such graphical elements may be three dimensional as shown in FIG. 11.

[0106] As for the specific technique by which three dimensional graphical imagery may be overlaid upon a video feed, commercial software exists for the seamless merging of real-time video with 3D graphics. For example, D'Fusion software from Total Immersion allows real-time video to be merged with 3D imagery with strong spatial correlation. As another example, the paper "Video See-Through AR on Consumer Cell-Phones" by Mathias Mohring, Christian Lessig, and Oliver Bimber of Bauhaus University, which is hereby incorporated by reference, presents a method of using low cost cameras (such as those in cell phones) and low cost processing electronics (such as those in cell phones) to create composite images that overlay 3D graphics upon 2D video images captured in real time.

Simulated Shields

[0107] Another method enabled within certain embodiments of the present invention that merges simulated gaming action with real-world mobile robot control is the generation and use of simulated shields to protect the combined real/simulated vehicle from weapons fire or other potentially damaging simulated objects. To enable such embodiments, the gaming software running upon the portable gaming system 130 stores and updates variables in memory representing one or more simulated shield levels (i.e. shield strengths) associated with the simulated vehicle being controlled by the user.

[0108] Based upon the state and status of the shield variables, the gaming software running upon the portable gaming system 130 modifies how simulated damage is computed for the vehicle when the vehicle is hit by weapons fire and when the vehicle encounters or collides with a simulated object that causes damage. In this way the imparting of damage (which as described previously can moderate or modify how the robotic mobile toy vehicle responds when controlled by the user through the portable gaming system 130) is further moderated by simulated gaming action. Furthermore, the presence or state of the simulated shields can affect how the player views the real camera feedback or real sensor feedback from the mobile toy vehicle 110. For example, in some embodiments when the shields are turned on by a player, the camera feedback displayed to that user is degraded as displayed upon the portable gaming system 130. This computer generated degradation of the displayed camera feedback represents the simulated effect of the camera needing to see through a shielding force field that surrounds the vehicle. Such degrading can be achieved by distorting the camera image, introducing static to the camera image, blurring the camera image, reducing the size of the camera image, adding a shimmering halo to the camera image, reducing the brightness of the camera image, or otherwise degrading the fidelity of the camera image when the simulated shield is turned on. This creates additional gaming strategy because when the shield is on the vehicle is safe from opponent fire or other potentially damaging real or simulated objects, but this advantage is countered by the disadvantage of having reduced visual feedback from the cameras as displayed upon the portable gaming system 130.

Simulated Terrain Features

[0109] Another method enabled within certain embodiments of the present invention merges simulated gaming action with real-world mobile robot control and mobile robot feedback by moderating a user's ability to control the mobile robot toy vehicle based upon simulated terrain features, simulated barriers, simulated force fields, or other simulated obstacles or obstructions.

[0110] To enable such embodiments, the gaming software running upon the portable gaming system 130 stores and updates variables in memory representing one or more simulated terrain features, simulated barriers, simulated force fields, or other simulated obstacles or obstructions. The variables can describe the simulated location, simulated size, simulated strength, simulated depth, simulated stiffness, simulated viscosity, or simulated penetrability of the terrain features, barriers, force fields, or other obstacles or obstructions. Based upon the state or status of the variables and the simulated location of the simulated vehicle with respect to the terrain features, barriers, force fields, obstacles, or obstructions, the gaming software running upon the portable gaming system 130 modifies how a user's input gestures (as imparted upon the manual user interface on the portable gaming system 130) are translated into control of the remote vehicle.

[0111] In some embodiments the gaming software running upon the portable gaming system 130 achieves the modification of how a user's input gestures are translated into the control of the vehicle by adjusting the mapping between a particular input gesture and a resulting command signal sent from the portable gaming system 130 to the mobile toy vehicle 110. For example, when the variables stored within the portable gaming system 130 indicate that the vehicle is on smooth terrain and that there are no simulated barriers or obstructions within the path of the simulated vehicle, a particular mapping is enabled between the user's input gesture (as imparted upon the manual user interface on the portable gaming system 130) and the motion of the vehicle. The mapping may be such that when the user presses a forward button upon the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to move forward. The mapping may also be such that when a user presses a backward button upon the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to move backward. The mapping may also be such that when a user presses a left button on the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to turn left or veer left. The mapping may also be such that when a user presses a right button on the portable gaming system 130 a control signal is sent to the mobile toy vehicle 110 causing it to turn right or veer right. This mapping may be modified, however, using the methods disclosed herein, based upon the presence of simulated non-smooth terrain features, barriers, obstacles, or obstructions as indicated by one or more simulation variables within the portable gaming system 130. For example, when the variables stored within the portable gaming system 130 indicate that there are simulated barriers or obstructions within the path of the simulated vehicle, the software running on the portable gaming system 130 may be configured to modify the mappings between button presses and the motion of the mobile toy vehicle 110 as achieved through the sending of control signals 150 from the portable gaming system 130 to the mobile toy vehicle 110. In a common embodiment, when there is a simulated barrier or obstruction within the path of the simulated vehicle, the mapping is modified such that reduced motion or no motion of the mobile toy vehicle 110 is produced when the user presses one or more of the buttons that would command the vehicle to move into or through the barrier or obstruction. This may be achieved in some embodiments by sending reduced motion values or zero motion values within the control signals 150 (to achieve reduced motion or no motion of the real robotic toy vehicle respectively).
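The following is a minimal Python sketch of the obstacle variables and motion gating described in paragraphs [0110] and [0111]. The field names, the circular obstacle geometry, and the look-ahead rule are illustrative assumptions; the patent states only that stored variables describing location, size, penetrability, and the like moderate how input gestures become control signals.

# Hypothetical sketch of paragraphs [0110]-[0111]: simulated obstacles are stored
# as variables describing their location, size, and penetrability, and a commanded
# motion is suppressed when it would drive the vehicle into an impenetrable barrier.

from dataclasses import dataclass
import math

@dataclass
class SimulatedObstacle:
    x: float              # simulated location (same coordinate frame as the vehicle)
    y: float
    radius: float         # simulated size
    penetrability: float  # 0.0 = solid barrier, 1.0 = no resistance

def moderate_drive(vehicle_xy, heading_rad, drive_value, obstacles, lookahead=0.3):
    """Scale the drive value in the control signal by the penetrability of any
    obstacle lying within the short look-ahead distance along the heading."""
    probe_x = vehicle_xy[0] + lookahead * math.cos(heading_rad)
    probe_y = vehicle_xy[1] + lookahead * math.sin(heading_rad)
    for obs in obstacles:
        if math.hypot(probe_x - obs.x, probe_y - obs.y) < obs.radius:
            return drive_value * obs.penetrability   # reduced or zero motion value
    return drive_value                               # clear path: mapping unchanged

if __name__ == "__main__":
    wall = SimulatedObstacle(x=1.0, y=0.0, radius=0.5, penetrability=0.0)
    mud = SimulatedObstacle(x=0.0, y=1.0, radius=0.5, penetrability=0.3)
    print(moderate_drive((0.8, 0.0), 0.0, 1.0, [wall, mud]))           # blocked: 0.0
    print(moderate_drive((0.0, 0.8), math.pi / 2, 1.0, [wall, mud]))   # muddy: 0.3
    print(moderate_drive((0.0, 0.0), math.pi, 1.0, [wall, mud]))       # clear: 1.0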
[0112] When the variables stored within the portable gaming system 130 indicate that there is simulated bumpy terrain, muddy terrain, sandy terrain, or other difficult-to-traverse terrain under the simulated vehicle at a particular time, the software running on the portable gaming system 130 may be configured to modify the mappings between button presses and the motion of the mobile toy vehicle 110 as achieved through the sending of control signals 150 from the portable gaming system 130 to the mobile toy vehicle 110. In a common embodiment, when the simulated terrain is determined to be difficult to move across by the software running on the portable gaming system 130, the mapping is modified such that reduced motion or erratic motion or no motion of the mobile toy vehicle 110 is produced when the user presses one or more of the buttons described above. This may be achieved in some embodiments by sending reduced motion values or distorted motion values or zero motion values within the control signals 150 (to achieve reduced motion, erratic motion, or no motion of the real robotic toy vehicle respectively).

[0113] The user interface by which the user inputs manual commands for controlling the mobile toy vehicle 110, as moderated by the intervening gaming software, is not limited to button presses. Alternate user interfaces include a joystick, a trackball, a touch pad, dials, levers, triggers, sliders, and other analog or binary controls upon the portable gaming system 130 or interfaced with the portable gaming system 130. For example, a joystick could be used by the user to command a direction and speed of the mobile toy vehicle 110, a particular position of the joystick mapping to a particular direction and speed of the vehicle. However, as described above, the mapping can be modified by the gaming software based upon simulated terrain features, barriers, force fields, obstacles, or obstructions present within the simulated environment of the simulated vehicle.

[0114] Simulated terrain features, simulated barriers, simulated force fields, or other simulated obstacles or obstructions can be drawn by the software running on the portable gaming system 130 and overlaid upon the real video imagery sent back from the mobile toy vehicle 110. Such a barrier is shown in FIG. 12 as a graphical overlay displayed upon the real video feedback from the mobile toy vehicle 110.

[0115] While the mobile toy vehicles described herein are rolling vehicles that work by selectively powering wheels, other forms of mobility are usable within the context of this invention. For example, the mobile toy vehicle 110 can use treads and other rolling mechanisms. The mobile toy vehicle 110 can also employ movable legs as its means of mobility. Furthermore, the mobile toy vehicle 110 need not be a ground-based vehicle but can be a flying vehicle or a floating vehicle, such as a toy plane or a toy boat respectively. Also, although a single camera image is used in the examples described above, stereo camera images can be employed upon the mobile toy vehicle 110, the stereo camera images providing 3D visual images to users and optionally providing 3D spatial data to the portable gaming system 130 for use by the simulation software for coordinating real-world spatial locations with the simulated location of simulated objects.

Sound Generation in the Remote Toy Vehicle Space

[0116] The mobile toy vehicle 110 as described throughout this document can include additional means for interacting with the real environment around it, such as having onboard speakers through which the mobile toy vehicle 110 can broadcast sound into its local environment. The sound signals that are emitted through the speakers on board the mobile toy vehicle 110 can include data transmitted to the vehicle from the portable gaming system 130 over the communication interface. The sound signals can include game-related sound effects such as engine sounds, explosion sounds, weapon sounds, damage sounds, alarm sounds, radar sounds, or creature sounds. The sounds can be transmitted as digital data from the portable gaming system 130 to the mobile toy vehicle 110 at appropriate times as determined by the simulation software running upon the portable gaming system 130. The sound signals are often transmitted by the portable gaming system 130 in coordination with gaming action simulated upon the portable gaming system 130. The sounds can also be stored as digital data upon the mobile toy vehicle 110 and accessed at appropriate times in accordance with control signals 150 sent from the portable gaming system 130 and in coordination with gaming action upon the portable gaming system 130. In addition, the sound signals that are emitted through the speakers on board the mobile toy vehicle 110 can include data transmitted to the vehicle from the portable gaming system 130 over the communication interface as a result of user interaction with the manual user interface upon the portable gaming system 130. In addition, the sound signals that are emitted through the speakers on board the mobile toy vehicle 110 can include voice data from the user, the voice data captured by a microphone contained within or interfaced with the portable gaming system 130. In this way a user can project his or her voice from the portable gaming system 130 to the remote environment in which the mobile toy vehicle 110 is operating.
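A minimal sketch of the sound-triggering described in paragraph [0116] follows. The message format shown (a small dictionary serialized to JSON) and the effect names are assumptions for illustration only; the patent does not define how sound commands or streamed sound data are encoded.

# Hypothetical sketch of paragraph [0116]: the gaming system either asks the
# vehicle to play a sound effect it already stores, or streams sound data to it
# in coordination with the simulated gaming action.

import json

STORED_EFFECTS = {"engine", "explosion", "weapon", "damage", "alarm", "radar", "creature"}

def make_sound_message(effect=None, sound_bytes=None, vehicle_id=1):
    """Build one sound-related message for the vehicle's onboard speakers."""
    if effect is not None and effect in STORED_EFFECTS:
        # Trigger a sound effect already stored as digital data on the vehicle.
        payload = {"type": "play_stored_sound", "vehicle": vehicle_id, "effect": effect}
    elif sound_bytes is not None:
        # Stream sound data (e.g. captured voice) from the gaming system to the vehicle.
        payload = {"type": "play_sound_data", "vehicle": vehicle_id,
                   "samples": list(sound_bytes)}
    else:
        raise ValueError("need either a stored effect name or raw sound data")
    return json.dumps(payload).encode("utf-8")

if __name__ == "__main__":
    print(make_sound_message(effect="explosion"))
    print(make_sound_message(sound_bytes=b"\x01\x02\x03")[:60])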
Light Generation in the Remote Toy Vehicle Space

[0117] The mobile toy vehicle 110 as described throughout this document can include additional means for interacting with the real environment around it, such as having onboard lights that the mobile toy vehicle 110 can shine into its local environment under the control of the user as moderated by the intervening gaming software. The lights can include headlights, search lights, or colorful lights for simulating weapons fire, weapon hits, or incurred damage. The activation of the lights upon the mobile toy vehicle 110 is controlled in response to signals received from the portable gaming system 130, the signals sent at appropriate times in coordination with the gaming action upon the portable gaming system 130.

Robotic Effectors

[0118] The mobile toy vehicle 110 as described throughout this document can include additional means for interacting with the real environment around it, such as having mobile effectors such as robotic arms or grippers or electromagnets that can be manipulated under electronic control and in accordance with control signals 150 received from the portable gaming system 130.

[0119] The activation of the effectors upon the mobile toy vehicle 110 is controlled in response to signals received from the portable gaming system 130, the signals sent at appropriate times in coordination with the gaming action upon the portable gaming system 130. In this way a user can pick up, push, or otherwise manipulate real objects within the real local space of the mobile toy vehicle 110, the picking up, pushing, or manipulation being selectively performed in coordination with other simulated gaming actions upon the portable gaming system 130.

Collisions with Real-World Objects and Simulation Interaction

[0120] As disclosed previously, some embodiments of the current invention include collision sensors aboard the mobile toy vehicle 110, such as contact sensors, pressure sensors, or force sensors within the bumpers of the vehicle, or acceleration sensors within the body of the mobile toy vehicle 110.

[0121] Using any one or multiple of the sensors, collisions between the mobile toy vehicle 110 and real physical objects can be detected and information relating to the collisions transmitted back to the portable gaming system 130 over the communication interface. The information about the collisions is then used by the gaming software running upon the portable gaming system 130 to update simulated gaming action. For example, sound effects can be generated by the portable gaming system 130 in response to detected real-world collisions. The sound effects can be displayed through speakers upon or local to the portable gaming system 130. The sound effects can also be displayed through speakers upon the mobile toy vehicle 110 (as described in the paragraphs above). The sound effects can be dependent upon the direction or magnitude of the collision as detected through the sensors. The sound effects can also be dependent upon the speed or direction of motion of the mobile toy vehicle 110 at the time the collision is detected. The sound effects can also be dependent upon the then current gaming action displayed upon the portable gaming system 130 at the time the collision is detected.

[0122] In addition to simulated sound effects, simulated damage levels can be adjusted within the simulation software running upon the portable gaming system 130 in response to real-world collisions detected upon the mobile toy vehicle 110, the magnitude of the change in the simulated damage levels being optionally dependent upon the magnitude or direction of the collision as detected by sensors aboard the mobile toy vehicle 110. The magnitude of the change in the simulated damage level may be optionally dependent upon the speed or direction of motion of the mobile toy vehicle 110 at the time the collision is detected. Also, the magnitude of the change in the simulated damage level may be optionally dependent upon the then current gaming action displayed upon the portable gaming system 130 at the time the collision is detected. In addition to, or instead of, simulated damage levels, game scores can be adjusted within the gaming software running upon the portable gaming system 130 in response to real-world collisions detected upon the mobile toy vehicle 110, the magnitude of the change in score being optionally dependent upon the magnitude or direction of the collision as detected by sensors aboard the mobile toy vehicle 110. Also, the magnitude of the change in score may be optionally dependent upon the speed or direction of motion of the mobile toy vehicle 110 at the time the collision is detected. Also, the magnitude of the change in score may be optionally dependent upon the then current gaming action displayed upon the portable gaming system 130 at the time the collision is detected. In addition to, or instead of, game score changes, simulated game action can be modified within the gaming software running upon the portable gaming system 130 in response to real-world collisions detected upon the mobile toy vehicle 110, the type of the modified game action being optionally dependent upon the magnitude or direction of the collision as detected by sensors aboard the mobile toy vehicle 110. Also, the type of the modified game action may be optionally dependent upon the speed or direction of motion of the mobile toy vehicle 110 at the time the collision is detected.

[0123] Also, the type of the modified game action may be optionally dependent upon the then current gaming action displayed upon the portable gaming system 130 at the time the collision is detected. For example, the simulated game action can display a hidden treasure to a user if the mobile toy vehicle 110 collides with a wall or other real-world surface in a correct direction and at a speed that exceeds a particular threshold. As another example, the simulated game action can collect a piece of treasure, causing it to disappear and incrementing the player's score, if the mobile toy vehicle 110 collides with a wall or other real-world surface in a correct location or correct direction or at a speed that exceeds a particular threshold. In this way simulated gaming action is moderated or updated based upon real-world interactions between the mobile toy vehicle 110 and the real physical space in which it operates.

Gaming Scores

[0124] Another novel aspect of the present invention is that the computer generated gaming score or scores, as computed by the gaming software running upon the portable gaming system 130, are dependent upon the simulated gaming action running upon the portable gaming system 130 as well as the real-world motion of, and real-world feedback from, the mobile toy vehicle 110.
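To ground the collision-driven updates of paragraphs [0121] through [0124], the following is a minimal Python sketch of how a collision report from the vehicle's sensors might adjust simulated damage, the game score, and the simulated game action. The thresholds, scaling factors, and field names are illustrative assumptions rather than values taken from the disclosure.

# Hypothetical sketch of paragraphs [0121]-[0124]: a collision report received
# from the vehicle's bumper or acceleration sensors updates simulated damage,
# the game score, and the simulated game action, with the size of each change
# depending on the reported collision magnitude.

from dataclasses import dataclass

@dataclass
class CollisionReport:
    magnitude: float      # e.g. peak acceleration reported by the vehicle
    direction_deg: float  # collision direction relative to the vehicle heading
    speed: float          # vehicle speed at the moment of the collision

class GameState:
    def __init__(self):
        self.damage = 0.0
        self.score = 0
        self.events = []

    def on_collision(self, report, near_hidden_treasure=False):
        # Damage change scales with collision magnitude (paragraph [0122]).
        self.damage += 5.0 * report.magnitude
        # Score change depends on magnitude and the current gaming action.
        if near_hidden_treasure and report.speed > 0.5:
            self.score += 100
            self.events.append("reveal_hidden_treasure")   # paragraph [0123]
        else:
            self.score -= int(10 * report.magnitude)       # penalty for a crash
        # A sound effect is also queued in coordination with the collision.
        self.events.append("play_crash_sound")

if __name__ == "__main__":
    state = GameState()
    state.on_collision(CollisionReport(magnitude=1.2, direction_deg=10.0, speed=0.8),
                       near_hidden_treasure=True)
    print(state.damage, state.score, state.events)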
[0125] As described previously, scoring can be computed based upon the imagery collected from a camera or cameras aboard the mobile toy vehicle 110, or sensor readings from other sensors aboard the mobile toy vehicle 110, or the motion of the mobile toy vehicle 110, combined with simulated gaming action that occurs at the same time as the imagery is collected, the sensor readings are taken, or the motion of the mobile toy vehicle 110 is imparted.

[0126] For example, as described previously, scoring can be incremented, decremented, or otherwise modified based upon the robotic toy vehicle contacting or otherwise colliding with a real world physical object, the scoring also dependent upon the contacting or colliding occurring in coordination with simulated gaming action, such as in coordination with a displayed image of a graphical target, treasure, barrier, obstacle, or weapon. As another example, as described previously, scoring can be incremented, decremented, or otherwise modified based upon the robotic toy vehicle targeting and firing a simulated weapon upon (and hitting) another real vehicle, simulated vehicle, or some other real or simulated object or target that appears upon the portable gaming system 130 display. As another example, as described previously, scoring can be incremented, decremented, or otherwise modified based upon the robotic toy vehicle being targeted and fired upon (and hit) by simulated weapons fire from another real vehicle controlled by another player through another portable gaming system 130, or by a simulated vehicle or other simulated opponent generated within the simulation run upon the portable gaming system 130.

[0127] In addition to the methods described in the paragraph above, other factors can be used to increment or decrement scoring variables upon the portable gaming system 130. For example, a clock or timer upon the portable gaming system 130 can be used to determine how much time elapsed during a period in which the mobile toy vehicle 110 was required to perform a certain task or achieve a certain objective. The elapsed time, as monitored by software running upon the portable gaming system 130, adds to the challenge of the gaming experience and provides additional metrics by which to determine the gaming performance of a user.

The User and Mobile Toy Vehicle Interaction

[0128] A particular advantage provided by the use of a portable gaming system 130 is that a user can walk around, following his or her mobile toy vehicle 110 as it traverses a particular local space. This could involve the user walking from room to room as his or her vehicle moves about his or her house. This could involve a user walking around a park, school yard, field, or other outside environment as his or her robotic toy vehicle traverses an outside space. The user can employ both direct visual sighting of his or her mobile toy vehicle 110 as well as first person video feedback collected from his or her mobile toy vehicle 110 (as displayed upon the screen of the portable gaming system 130) when engaging in the unique on-screen off-screen gaming experience.

[0129] When multiple users are engaged in a joint gaming experience that includes multiple portable gaming systems 130 and multiple mobile toy vehicles 110, the multiple users can walk around in the same shared physical space while at the same time being privy only to the displayed feedback from their own portable gaming system 130. In this way the users can experience both shared and private aspects of the joint gaming experience. For example, a second player may not know how much simulated fuel a first player has left, and vice versa, for each of their fuel displays is only provided upon their respective portable gaming system 130.

[0130] In some embodiments a non-portable gaming system can be used alone or in combination with the portable gaming systems 130, the non-portable gaming system acting as a stationary gaming station for mobile toy vehicle 110 control or as a central server for coordinating the portable gaming systems 130.

User Gaming Scenario

[0131] The unique methods and apparatus disclosed herein enable a wide variety of gaming scenarios that merge simulated gaming action with real world motion and feedback from robotic toy vehicles. The gaming scenarios can be single player or multi player.

[0132] As one simple example of such gaming action, a game scenario is enabled upon a portable gaming system 130 by software running upon the portable gaming system 130 that functions as follows: two users compete head to head in a task to gather the most simulated treasure (cubes of gold) while battling each other for dominance using the simulated weapons aboard their vehicles. Each user has a portable gaming system 130 connected by a wireless communication link to a mobile toy vehicle 110. The two portable gaming systems 130 are also in communication with each other by wireless communication links. In this case, all wireless communication links use Bluetooth technology. The game begins by each user placing their vehicles in different rooms of a house and selecting the "start game" option on the user interface of their portable gaming system 130. An image appears upon each player's portable gaming system 130, the image a composite of the video feedback from the camera mounted upon the mobile toy vehicle 110 being controlled by that user combined with overlaid graphical imagery of a vehicle cockpit (including windows and dashboard meters and readouts). The overlaid graphical imagery includes a score for each user, currently set to zero. The overlaid graphical imagery also includes a distance traveled value for each user, currently set to zero. The overlaid graphical imagery also includes a damage value for each user, currently set to zero. The overlaid graphical imagery also includes a fuel level value and an ammunition level value, both presented as graphical bar meters as shown in FIG. 13. The full fuel level is represented by the red bar along the top of the display and the full ammunition level is represented by the green bar along the top of the display. The fuel level bar and ammunition level bar are displayed at varying lengths during the game as the simulated fuel and simulated ammunition are used, the length of the displayed red and green bars decreasing proportionally to simulated fuel usage and simulated ammunition usage respectively. When there is no fuel left in the simulated tank, the red bar will disappear from the display. When there is no ammunition left in the simulated weapon, the green bar will disappear from the display. Also drawn upon the screen is a green crosshair in the center of the screen. This crosshair represents the current targeting location of the simulated weapons of the simulated vehicle that is being controlled by this portable gaming system 130, the targeting location being shown relative to the real physical environment of the mobile toy vehicle 110. In this way simulated vehicle information, including simulated targeting information, is merged with the real physical space of the mobile toy vehicle 110, creating a merged on-screen off-screen gaming scenario.
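As a small illustration of the heads-up display described in paragraph [0132], the following Python sketch draws fuel and ammunition bar meters whose lengths shrink in proportion to simulated usage. The text-based "renderer" is an assumption standing in for whatever graphics layer a real portable gaming system would use.

# Hypothetical sketch of the HUD in paragraph [0132]: fuel and ammunition bars
# whose drawn lengths are proportional to the simulated levels, plus the score,
# all intended to be overlaid on the live camera image.

def bar(label, level, full, width=20, fill_char="#"):
    """Return a text bar whose filled length is proportional to level/full."""
    level = max(0.0, min(level, full))
    filled = int(round(width * level / full)) if full > 0 else 0
    return f"{label:<6}[{fill_char * filled}{' ' * (width - filled)}]"

def draw_hud(score, fuel, fuel_max, ammo, ammo_max):
    lines = [
        bar("FUEL", fuel, fuel_max),   # the red bar in the described scenario
        bar("AMMO", ammo, ammo_max),   # the green bar in the described scenario
        f"SCORE {score}",
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    print(draw_hud(score=250, fuel=35.0, fuel_max=100.0, ammo=4, ammo_max=12))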
[0133] Once the game has been started by both users, they press buttons upon their portable gaming systems 130 to move their mobile toy vehicles 110 about the real physical space of their house. As they move the vehicles, the camera feedback is updated, giving each player a real-time first person view of the local space as seen from the perspective of their mobile toy vehicle 110. They are now playing the game: their gaming goal, as moderated by the gaming software running on each portable gaming system 130, is for each player to move his or her mobile toy vehicle 110 about the real physical space of the house, searching for simulated targets that will be overlaid onto the video feedback from their vehicle camera by the software running on their portable gaming system 130. If and when they encounter their opponent (the mobile toy vehicle 110 controlled by the other player), they must either avoid that vehicle or engage it in battle, damaging that vehicle before it damages them. In this particular gaming embodiment, the simulated targets are treasure (cubes of gold) to be collected by running their vehicle into the location of the treasure.

[0134] The software running upon each portable gaming system 130 decides when and where to display such treasure based upon the accrued distance traveled by each mobile toy vehicle 110 (as determined by optical encoders measuring the accrued rotation and orientation of the wheels of the vehicle). As the gold cubes are found and collided with, the score of that user is increased and displayed upon the portable gaming system 130. Also displayed throughout the game are other targets, including additional fuel and additional ammunition, also acquired by driving the real vehicle into the location that appears to collide with the simulated image of the fuel or ammo. When simulated fuel or simulated ammo is found and collided with by a vehicle, the simulated fuel levels or simulated ammo levels are updated for that vehicle in the simulation software accordingly. The game ends when the time runs out (in this embodiment when 10 minutes of playing time has elapsed), as determined using a clock or timer within one or both portable gaming systems 130, or when one of the vehicles destroys the other of the vehicles in battle. The player with the highest score at the end of the game is the winner.

Advanced Tracking

[0135] In an advanced embodiment of the present invention, an absolute spatial position or orientation sensor 218 is included upon both the portable gaming system 130 and the mobile toy vehicle 110 such that the software running upon the portable gaming system 130 can compute the relative location or orientation between the player (who is holding the portable gaming system 130) and the robotic toy vehicle he is controlling.

[0136] In one embodiment the absolute spatial position sensor is a GPS sensor. A first GPS sensor is incorporated within or connected to the portable gaming system 130. For example, if the portable gaming system 130 is a Sony PlayStation Portable, a commercially available GPS sensor (and optional magnetometer) can be plugged into a port of the device and is thereby affixed locally to the device. A second GPS sensor (and optional magnetometer) is incorporated within or connected to the mobile toy vehicle 110. Spatial position and/or motion and/or orientation data derived from the GPS sensor (and optional magnetometer) is transmitted back to the portable gaming system 130 over the bi-directional communication link. In this way the portable gaming system 130 software has two sets of locative data (i.e. positions and optional orientations): a first set of locative data that indicates the spatial position and/or motion and/or orientation of the portable gaming system 130 itself, and a second set of locative data that indicates the spatial position and/or motion and/or orientation of the mobile toy vehicle 110. The portable gaming system 130 can then use these two sets of data and compute the difference between them, thereby generating the relative distance between the portable gaming system 130 and the mobile toy vehicle 110, the relative orientation between the portable gaming system 130 and the mobile toy vehicle 110, the relative speed between the portable gaming system 130 and the mobile toy vehicle 110, or the relative direction of motion between the portable gaming system 130 and the mobile toy vehicle 110.

[0137] Such difference information can then be used to update gaming action. Such difference information can also be displayed to the user in numerical or graphical form. For example, the relative distance between the portable gaming system 130 and the mobile toy vehicle 110 can be displayed as a numerical distance (in feet or meters) upon the display of the portable gaming system 130. In addition, an arrow can be displayed upon the screen of the portable gaming system 130, the arrow pointing in the direction from the portable gaming system 130 to the mobile toy vehicle 110. In addition, a different colored arrow can be displayed upon the screen of the portable gaming system 130 indicating the direction of motion (relative to the portable gaming system 130) in which the mobile toy vehicle 110 is then currently moving. Using such display information, as derived from the plurality of spatial position or orientation sensors 218, the player of the gaming system can keep track of the relative position or orientation or motion of the mobile toy vehicle 110 during gaming action.

[0138] For embodiments of the current invention that include a plurality of mobile toy vehicles 110, each of the mobile toy vehicles 110 equipped with a spatial position sensor such as a GPS sensor and an optional magnetometer, additional advanced features can be enabled.

[0139] For example, in some embodiments the locative sensor data from the plurality of mobile toy vehicles 110 is sent to a particular one (or more) of the portable gaming systems 130. In other words, a portable gaming system 130 being used by a first player will receive locative data from a first mobile toy vehicle 110 over the bi-directional communication link, that mobile toy vehicle 110 being the one the first player is controlling.
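The following is a minimal Python sketch of the relative-distance and relative-direction computation described in paragraphs [0136] and [0137], taking one GPS fix from the portable gaming system and one from the mobile toy vehicle. The equirectangular approximation used here is an assumption suitable for the short ranges involved; the patent does not prescribe a particular formula.

# Hypothetical sketch of paragraphs [0136]-[0137]: compute the relative distance
# and the bearing of the vehicle from the player, given two GPS fixes.

import math

EARTH_RADIUS_M = 6371000.0

def relative_distance_and_bearing(player_lat, player_lon, vehicle_lat, vehicle_lon):
    """Return (distance in meters, bearing in degrees clockwise from north)."""
    lat0 = math.radians((player_lat + vehicle_lat) / 2.0)
    dx = math.radians(vehicle_lon - player_lon) * math.cos(lat0) * EARTH_RADIUS_M
    dy = math.radians(vehicle_lat - player_lat) * EARTH_RADIUS_M
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return distance, bearing

if __name__ == "__main__":
    # Player standing in the yard, vehicle a short distance to the northeast.
    d, b = relative_distance_and_bearing(35.1400, -120.6410, 35.1401, -120.6409)
    print(f"vehicle is {d:.1f} m away, bearing {b:.0f} degrees")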
[0140] In addition, the portable gaming system 130 being used by the first player will also receive locative data from a second mobile toy vehicle 110 over a bi-directional communication link, that mobile toy vehicle 110 being one that a second player is controlling. Also, the portable gaming system 130 being used by the first player will also receive locative data from a third mobile toy vehicle 110 over a bi-directional communication link, that mobile toy vehicle 110 being one that a third player is controlling. Using the data from the first, second, and third locative sensors aboard the first, second, and third mobile toy vehicles 110, the gaming software upon the first portable gaming system 130 can update the gaming action as displayed upon the screen of that gaming system. For example, the gaming software upon the first portable gaming system 130 computes and displays the relative distance, or orientation, or motion between the first mobile toy vehicle 110 and the second mobile toy vehicle 110. This may be displayed, for example, as simulated radar upon the display of the first portable gaming system 130, again mixing real-world gaming action with simulated gaming action.

[0141] The gaming software upon the first portable gaming system 130 also computes and displays the relative distance, or orientation, or motion between the first mobile toy vehicle 110 and the third mobile toy vehicle 110. In this way the first player can be displayed information upon his portable gaming system 130 that indicates the relative position or motion or orientation between the mobile toy vehicle 110 that he is controlling (the first vehicle) and the mobile toy vehicle 110 another player is controlling (the second vehicle). In addition, the first player can be displayed information upon his portable gaming system 130 that indicates the relative position or motion or orientation between the mobile toy vehicle 110 that he is controlling (the first vehicle) and the mobile toy vehicle 110 a third player is controlling (the third vehicle). And if additional mobile toy vehicles 110 were being used, each with additional position sensors, the displayed information could include the relative position or motion or orientation between the first vehicle and each of the additional vehicles as well. In this way the first player can know the position, motion, or orientation of one or more of the other mobile toy vehicles 110 that are participating in the game. In some cases those other mobile toy vehicles 110 are opponents in the gaming scenario. In other cases those other mobile toy vehicles 110 are teammates in the gaming scenario. In some embodiments the position, motion, or orientation of only certain mobile toy vehicles 110 are displayed, for example only of those mobile toy vehicles 110 that are teammates in the gaming scenario.

[0142] In other embodiments the position, motion, or orientation of only certain other mobile toy vehicles 110 are displayed, for example only those mobile toy vehicles 110 that are within a certain range of the portable gaming system 130 of the first player, or only the mobile toy vehicles 110 that are within a certain range of the first mobile toy vehicle 110, or only the mobile toy vehicles 110 that are opponents of the first player, or only the mobile toy vehicles 110 that do not then currently have a simulated cloaking feature enabled, or only the mobile toy vehicles 110 that do not have a simulated radar-jamming feature enabled, or only the mobile toy vehicles 110 that do not have a shield feature enabled, or only the mobile toy vehicles 110 that are not obscured by a simulated terrain feature such as a mountain, hill, or barrier.

[0143] In the embodiment above including a plurality of mobile toy vehicles 110, each with a spatial position sensor aboard, the user of the first portable gaming system 130 can be shown either the position, motion, or orientation of the plurality of mobile toy vehicles 110 relative to the first portable gaming system 130, or the position, motion, or orientation of the plurality of mobile toy vehicles 110 relative to the first mobile toy vehicle 110. The display can be numerical, for example indicating a distance between each of the mobile toy vehicles 110 and the first portable gaming system 130, or indicating a distance between each of the mobile toy vehicles 110 and the first mobile toy vehicle 110. The display can also be graphical, for example plotting a graphical icon such as a dot or a circle upon a displayed radar map, the displayed radar map representing the relative location of each of the plurality of mobile toy vehicles 110. The color of the dot or circle can be varied to allow the user to distinguish between the plurality of mobile toy vehicles 110. For example, in one embodiment all teammate vehicles are displayed in one color and all opponent vehicles are displayed in another color, and the vehicle that is being controlled by the player who is wielding that particular portable gaming system 130 is displayed brighter than all others. In this way that player can know the location of his or her own vehicle, the locations of his or her teammate vehicles, and the location of his or her opponent vehicles. Also, if there are entirely simulated vehicles operating alongside the mobile toy vehicles 110 in the current gaming scenario, the locations of the simulated vehicles can optionally be displayed as well. In some embodiments the simulated vehicles are displayed in a visually distinct manner such that they can be distinguished from real vehicles, for example being displayed in a different color, different shape, or different brightness.

[0144] While the description above focused upon the display of the first player upon the first portable gaming system 130, it should be understood that a similar display can be created upon the portable gaming systems 130 of the other users, each of their displays being generated relative to their portable gaming system 130 or relative to their mobile toy vehicle 110. In this way all players (or a selective subset of users) can be provided with spatial information about other users with respect to their own location or the location of the mobile toy vehicle 110 that they are personally controlling.

User to User Sensor Data Interaction

[0145] For embodiments such as the ones described above, in which a single portable gaming system 130 receives data (such as GPS data and magnetometer data) from a plurality of different mobile toy vehicles 110 over bi-directional communication links, a unique ID can be associated with each stream or packet of data such that the single portable gaming system 130 can determine from which mobile toy vehicle 110 the received data came or with which vehicle it is associated. It should also be noted that in some embodiments the data from the plurality of different mobile toy vehicles 110 is not communicated directly to the first portable gaming system 130 but instead is communicated via others of the portable gaming systems 130.

[0146] In such an embodiment, each mobile toy vehicle 110 may be configured to communicate only with a single one of the portable gaming systems 130, the sensor data from the plurality of mobile toy vehicles 110 being exchanged among the portable gaming systems 130 to enable the features described above. In this way a portable gaming system 130 can selectively send data about the location of its mobile toy vehicle 110 to others of the portable gaming systems 130, the selective sending of the data depending upon the simulated gaming action as controlled by software running upon the portable gaming system 130.
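The following Python sketch illustrates the ID tagging and selective relaying described in paragraphs [0145] and [0146]. The packet layout and the cloaking/terrain rule shown here are illustrative assumptions; the patent specifies only that packets carry unique vehicle IDs and that location sharing depends on the simulated gaming action.

# Hypothetical sketch of paragraphs [0145]-[0146]: each locative packet carries a
# unique vehicle ID, and a gaming system relays its own vehicle's location to the
# other gaming systems only when the simulated gaming action allows it.

def make_locative_packet(vehicle_id, x, y, heading):
    """Tag a locative report with the unique ID of the vehicle it came from."""
    return {"vehicle_id": vehicle_id, "x": x, "y": y, "heading": heading}

def relay_to_peers(packet, peers, cloaking_enabled, hidden_behind_terrain):
    """Selectively forward this system's vehicle location to peer gaming systems."""
    if cloaking_enabled or hidden_behind_terrain:
        return []                      # withhold location data from the other players
    return [(peer, packet) for peer in peers]

if __name__ == "__main__":
    pkt = make_locative_packet(vehicle_id=1, x=2.4, y=-1.1, heading=90.0)
    print(relay_to_peers(pkt, peers=["gaming_system_2", "gaming_system_3"],
                         cloaking_enabled=False, hidden_behind_terrain=False))
    print(relay_to_peers(pkt, peers=["gaming_system_2"],
                         cloaking_enabled=True, hidden_behind_terrain=False))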
[0147] For example, if a particular mobile toy vehicle 110 has a simulated cloaking feature or a simulated radar jamming feature enabled at a particular time, the portable gaming system 130 that is controlling that mobile toy vehicle 110 can, based upon such current gaming action, selectively determine not to send location information about the mobile toy vehicle 110 to some or all of the other portable gaming systems 130 currently engaged in the game. Similarly, if a particular mobile toy vehicle 110 is hidden behind a simulated mountain or barrier, the portable gaming system 130 that is controlling that mobile toy vehicle 110 can, based upon such current gaming action, selectively determine not to send location information about the mobile toy vehicle 110 to some or all of the other portable gaming systems 130 currently engaged in the game.

Alternate Mobile Vehicle Tracking Methods

[0148] While the features described above that use the relative or absolute position, motion, or orientation of vehicles or gaming systems are described with respect to GPS sensors, other sensors or other sensing methods can be used. For example, optical encoders can be used aboard the mobile toy vehicle 110 to track the rotation of the wheels as well as the steering angle. By tracking the rotation of the wheels and the steering direction during the rotations of the wheels, the relative position, motion, or orientation of a vehicle can be tracked over time. This method has the advantage of being cheaper than GPS and works better indoors than GPS, but is susceptible to errors if the wheels of a vehicle slip with respect to the ground surface and thereby distort the accrued distance traveled or direction traveled information.

[0149] An alternative sensing method that is inexpensive and accurate on indoor floor surfaces is hereby disclosed herein as a novel method of tracking the location, motion, or orientation of a mobile toy vehicle 110 with respect to a ground surface. This sensing method uses one or more optical position sensors on the undersurface of the mobile toy vehicle 110, aimed down at the floor. Such sensors, as commonly used in optical computer mice, illuminate a small surface area with an LED and take optical pictures of that surface at a rapid rate (such as 1500 pictures per second) using a silicon optical array called a Navigation Chip. Integrated electronics then determine the relative motion of the surface with respect to the sensor. As described in the paper "Silicon Optical Navigation" by Gary Gordon, John Corcoran, Jason Hartlove, and Travis Blalock of Agilent Technology (the maker of the Navigation Chip), the paper hereby incorporated by reference, this sensing method is fast, accurate, and inexpensive. For these reasons such sensors are hereby proposed in the novel application of tracking the changing position or orientation of a mobile toy vehicle 110. To get accurate orientation sensing, two of the Navigation Chip sensors can be used upon the undersurface of the vehicle with a known distance disposed between them. By comparing the differing position change data from each of the two sensors, the changing rotation of the vehicle can be derived in software.

[0150] Another novel method for tracking the position or orientation changes of the mobile toy vehicle 110 is hereby disclosed, the method also using the Navigation Chip technology from Agilent. In this embodiment the Navigation Chip is not mounted on the undersurface of the mobile toy vehicle 110 and aimed at the floor as described in the example above, but instead is aimed outward toward the room within which the mobile toy vehicle 110 is maneuvering. This chip takes rapid low resolution snapshots of the room the way a camera would and uses integrated electronics to compute the relative motion (offset) of the snapshots. Because it is assumed that the room itself is stationary and the mobile toy vehicle 110 is that which is moving, the motion between snapshots (i.e. the offset) can be used to determine the relative motion of the vehicle over time (changing position or orientation). Multiple Navigation Chips can be used in combination to get more accurate change information, for example two sensors: one sensor pointed along the forward motion of the vehicle and one sensor pointed to the left (at a right angle to the forward sensor). Or, as another example, four sensors: one sensor pointed in each of four directions (forward, back, left, and right).

[0151] Another method for tracking the position or orientation changes of the mobile toy vehicle 110 is to use the camera mounted on the vehicle (as discussed throughout this disclosure) and compare subsequent camera images to determine the motion of the vehicle from image offset data. The technique is similar to that used by the Agilent sensor described above. The advantage of using the camera instead of the Agilent sensor is that the more accurate visual data yields greater resolution in position and orientation change information. The disadvantage of using the camera is the need for more expensive processing electronics to get a rapid update rate. A rapid update rate is critical for accurate position or orientation change data for a mobile toy vehicle 110 that is moving or turning quickly over time.

Storing and Displaying Trajectory Information

[0152] Another feature enabled by the methods and apparatus disclosed herein is the storing and displaying of trajectory information. Position or orientation or motion data related to a mobile toy vehicle 110 is captured and transmitted to a portable gaming system 130 as disclosed previously. This data is then stored in memory local to the portable gaming system 130 along with time information indicating the absolute or relative time when the position or orientation or motion data was captured. This yields a stored time-history of the mobile toy vehicle 110 position or orientation or motion within the memory of the portable gaming system 130. The time history is used to update gaming action.

[0153] In some embodiments the user can request to view a graphical display of the time history, the graphical display for example being a plot of the position of the mobile toy vehicle 110 during a period of time. If, for example, the user had commanded the mobile toy vehicle 110 to traverse a large oval trajectory, an oval shape is plotted upon the portable gaming system 130.

[0154] In other embodiments the scoring of the game is based in whole or in part upon the stored time-history of the mobile toy vehicle 110 position or orientation or motion. For example, the game might require a player to command his or her mobile toy vehicle 110 to perform a "figure eight". The software running upon the portable gaming system 130 can score the user's ability to perform the "figure eight" by processing the time-history data and comparing the data with the characteristic figure eight shape. In this way a user's ability to command a robot to perform certain trajectories can be scored as part of the gaming action.
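To illustrate paragraphs [0152] through [0154], the following Python sketch stores timestamped positions and scores a "figure eight" task by comparing the recorded path against a reference figure-eight template. The Lissajous reference curve and the nearest-point scoring rule are illustrative assumptions; the patent states only that the time history is compared with the characteristic shape.

# Hypothetical sketch: a stored time-history of vehicle positions and a simple
# shape-matching score for the figure-eight task of paragraph [0154].

import math

class TrajectoryHistory:
    def __init__(self):
        self.samples = []                      # list of (time, x, y)

    def record(self, t, x, y):
        self.samples.append((t, x, y))

def figure_eight_template(scale=1.0, points=200):
    # A simple figure eight: x = sin(t), y = sin(t) * cos(t).
    return [(scale * math.sin(a), scale * math.sin(a) * math.cos(a))
            for a in (2 * math.pi * i / points for i in range(points))]

def score_figure_eight(history, scale=1.0):
    """Score 0-100: higher when recorded positions hug the reference curve."""
    template = figure_eight_template(scale)
    if not history.samples:
        return 0.0
    total = 0.0
    for (_, x, y) in history.samples:
        total += min(math.hypot(x - tx, y - ty) for tx, ty in template)
    mean_error = total / len(history.samples)
    return max(0.0, 100.0 * (1.0 - mean_error / scale))

if __name__ == "__main__":
    history = TrajectoryHistory()
    for i in range(100):                       # a vehicle that drove a decent figure eight
        a = 2 * math.pi * i / 100
        history.record(t=i * 0.1, x=math.sin(a) + 0.02, y=math.sin(a) * math.cos(a))
    print(f"figure-eight score: {score_figure_eight(history):.1f}")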
Storing and Displaying Trajectory Information

[0152] Another feature enabled by the methods and apparatus disclosed herein is the storing and displaying of trajectory information. Position or orientation or motion data related to a mobile toy vehicle 110 is captured and transmitted to a portable gaming system 130 as disclosed previously. This data is then stored in memory local to the portable gaming system 130 along with time information indicating the absolute or relative time when the position or orientation or motion data was captured. This yields a stored time-history of the mobile toy vehicle 110 position or orientation or motion within the memory of the portable gaming system 130. The time-history is used to update gaming action.

[0153] In some embodiments the user can request to view a graphical display of the time-history, the graphical display for example being a plot of the position of the mobile toy vehicle 110 during a period of time. If for example the user had commanded the mobile toy vehicle 110 to traverse a large oval trajectory, an oval shape is plotted upon the portable gaming system 130.
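By way of illustration only, a minimal sketch of such a time-history store follows. The record layout, class names, and units are assumptions made for this example rather than details taken from the disclosure.

    import time
    from dataclasses import dataclass, field

    @dataclass
    class PoseSample:
        t: float        # capture time (s)
        x: float        # position (m)
        y: float
        heading: float  # orientation (rad)

    @dataclass
    class TrajectoryLog:
        """Stores the timestamped pose history of one mobile toy vehicle."""
        samples: list[PoseSample] = field(default_factory=list)

        def record(self, x: float, y: float, heading: float, t: float | None = None) -> None:
            # Each sample is stored with its capture time so the game can
            # later reason about both the path and its timing.
            self.samples.append(PoseSample(time.time() if t is None else t, x, y, heading))

        def path(self) -> list[tuple[float, float]]:
            """(x, y) points in capture order, ready to be drawn as a plot."""
            return [(s.x, s.y) for s in self.samples]

        def window(self, t0: float, t1: float) -> list[PoseSample]:
            """Samples captured between t0 and t1, for time-based scoring."""
            return [s for s in self.samples if t0 <= s.t <= t1]

Drawing the points returned by path() on the display of the portable gaming system 130 yields the kind of trajectory plot described above, such as the oval in the example.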
[0154] In other embodiments the scoring of the game is based in whole or in part upon the stored time-history of the mobile toy vehicle 110 position or orientation or motion. For example the game might require a player to command his or her mobile toy vehicle 110 to perform a "figure eight". The software running upon the portable gaming system 130 can score the user's ability to perform the "figure eight" by processing the time-history data and comparing the data with the characteristic figure eight shape. In this way a user's ability to command a robot to perform certain trajectories can be scored as part of the gaming action.

[0155] In other embodiments, the engagement of simulated elements within the gaming action is dependent upon the time-history data. For example, certain simulated treasures within a gaming scenario might only be accessible when reaching that treasure from a certain direction (for example, when coming upon the treasure from the north). To determine how the robotic vehicle comes upon a certain location, as opposed to just determining if the vehicle is at that certain location, the software running upon the portable gaming system 130 can use the time-history of data.
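One way such direction-dependent engagement might be evaluated from the stored time-history is sketched below. It reuses the hypothetical TrajectoryLog from the previous sketch, and the capture radius and angular tolerance are illustrative defaults only.

    import math

    def approached_from(log: TrajectoryLog, target: tuple[float, float],
                        required_bearing: float, capture_radius: float = 0.3,
                        tolerance: float = math.radians(45)) -> bool:
        """True if the vehicle reached `target` travelling along `required_bearing`.

        `required_bearing` is the world-frame direction of travel the game
        demands (for example, heading south corresponds to "coming upon the
        treasure from the north").
        """
        samples = log.samples
        for i in range(1, len(samples)):
            s = samples[i]
            if math.hypot(s.x - target[0], s.y - target[1]) > capture_radius:
                continue
            prev = samples[i - 1]
            travel = math.atan2(s.y - prev.y, s.x - prev.x)
            # Smallest signed difference between the actual and required bearings.
            error = math.atan2(math.sin(travel - required_bearing),
                               math.cos(travel - required_bearing))
            return abs(error) <= tolerance
        return False

A similar pass over the stored samples could score trajectory exercises such as the figure eight by comparing the recorded path against a template shape.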
Mobile Toy Vehicle Communication Channel
[0156] A bidirectional communication channel is established between the portable gaming system 130 and the mobile toy vehicle 110, the communication connection for transmitting control signals 150 from the portable gaming system 130 to the mobile toy vehicle 110 and for transmitting sensor data from the mobile toy vehicle 110 to the portable gaming system 130.

[0157] In some embodiments the mobile toy vehicle 110 can transmit the sensor data to a plurality of portable gaming system 130 devices, each of the portable gaming system 130 devices updating software-controlled gaming action in response to the data.

[0158] In some embodiments a single portable gaming system 130 can selectively transmit control signals 150 to a plurality of mobile toy vehicles 110, each of the mobile toy vehicles 110 identifiable by a unique ID.

[0159] In some embodiments a single portable gaming system 130 can receive sensor data from a plurality of mobile toy vehicles 110, the sensor data from each of the mobile toy vehicles 110 being associated with a unique ID for that vehicle.

[0160] In some embodiments a portable gaming system 130 can communicate with a plurality of other portable gaming systems 130, each of the portable gaming systems 130 identifiable by a unique ID, the portable gaming systems 130 exchanging data related to the real or simulated status of a plurality of vehicles being controlled by a plurality of users. In some embodiments the bidirectional communication channel is established using a digital wireless communication means such as a Bluetooth communication connection. In such digital embodiments the control signals 150 sent from the portable gaming system 130 to the mobile toy vehicle 110 are digital commands.
[0161] In some embodiments the digital commands follow a command protocol of a variety of commands, each of the commands including a command identifier and command data. For example, a digital command identifier is sent from the portable gaming system 130 to the mobile toy vehicle 110 that indicates a "move forward" command and the command data includes a value representing the speed at which the mobile toy vehicle 110 is to move. Alternative command data can include the distance by which the mobile toy vehicle 110 is to move. Alternative command data can include the time for which the mobile toy vehicle 110 should move at a particular speed. Other command identifiers include a "turn left" command and a "turn right" command and a "headlights on" command and a "headlights off" command and a "move backward" command and a "sound effect" command and a "zoom camera" command and a "pan camera" command and a "fire weapon" command and a "report GPS data" command and a "report ultrasound sensor" command and a "report distance traveled" command and a "spin in place" command. Such commands may or may not include command data. If command data is used along with a particular command identifier, the command data may include but is not limited to magnitude values, direction values, duration values, distance values, or time delay values. In addition a command can include a device ID that indicates which of multiple mobile toy vehicles 110 the command is intended for.

[0162] In general, electronics within each of the mobile toy vehicles 110 interpret the received control signals 150 that are intended for it (as optionally identified by the device ID) and then control sensors or actuators or lights or speakers or cameras accordingly.
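By way of illustration only, the command protocol described in paragraphs [0161] and [0162] could be carried in a compact binary packet such as the sketch below. The opcode numbers, field sizes, and helper names are assumptions made for this example; the disclosure does not fix a particular wire format.

    import struct

    # Illustrative opcode assignments; the specification does not fix numeric values.
    COMMANDS = {
        "move_forward": 0x01, "move_backward": 0x02,
        "turn_left": 0x03, "turn_right": 0x04,
        "headlights_on": 0x05, "headlights_off": 0x06,
        "zoom_camera": 0x07, "pan_camera": 0x08,
        "fire_weapon": 0x09, "sound_effect": 0x0A,
        "report_gps": 0x0B, "report_ultrasound": 0x0C,
        "report_distance": 0x0D, "spin_in_place": 0x0E,
    }

    # One packet: device ID, command identifier, and a signed 16-bit command
    # datum (speed, distance, duration, etc.); unused data is sent as zero.
    PACKET = struct.Struct(">BBh")

    def encode(device_id: int, name: str, value: int = 0) -> bytes:
        """Build the byte string sent over the wireless link."""
        return PACKET.pack(device_id, COMMANDS[name], value)

    def decode(packet: bytes) -> tuple[int, str, int]:
        """Inverse of encode(), run by the electronics on each vehicle."""
        device_id, opcode, value = PACKET.unpack(packet)
        name = {v: k for k, v in COMMANDS.items()}[opcode]
        return device_id, name, value

    # Vehicle #2: move forward at speed 40 (arbitrary units).
    pkt = encode(device_id=2, name="move_forward", value=40)
    assert decode(pkt) == (2, "move_forward", 40)

On receipt, each vehicle would compare the decoded device ID against its own and act only on packets addressed to it, as described above.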
[0163] During implementation, Bluetooth is a preferred wireless communication technology for transmitting control signals 150 from portable gaming system 130 to mobile toy vehicle 110, for transmitting sensor data sent from mobile toy vehicle 110 to portable gaming system 130, and for exchanging game-related data between and among portable gaming systems 130 consistent with the features and functions of this invention. Other communication technologies can be used, digital or analog. For example, other digital wireless communication methodologies such as WiFi and WLAN can be used. Also, purely analog communication methods can be used in some embodiments for certain appropriate features; for example, analog radio frequency communication can be used to convey camera images from the mobile toy vehicle 110 to the portable gaming system 130 or to convey motor power levels from the portable gaming system 130 to the mobile toy vehicle 110.

Camera Zoom Control

[0164] Another feature enabled in some embodiments of the current invention is a zoom control that adjusts the zoom focus of the camera lens upon the mobile toy vehicle 110.

[0165] This is achieved by sending control signals 150 related to camera lens zoom focus from the portable gaming system 130 to the mobile toy vehicle 110 in response to user interactions with buttons (or other manual controls) upon the portable gaming system 130. For example, a zoom lever is incorporated upon one embodiment of the portable gaming system 130 such that when a user pushes the zoom lever forward, control signals 150 are sent from the portable gaming system 130 to the mobile toy vehicle 110 to cause the camera to zoom in. Alternatively, when the user pushes the zoom lever backward, control signals 150 are sent from the portable gaming system 130 to the mobile toy vehicle 110 to cause the camera to zoom out.

[0166] Electronics upon the mobile toy vehicle 110 receive and interpret the control signals 150 from the portable gaming system 130 and control actuators that adjust the camera zoom appropriately.
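A small sketch of how zoom-lever input might be translated into such control signals follows. It reuses the hypothetical encode() helper and opcode table from the earlier packet example, and the dead zone and scaling are illustrative choices only.

    def on_zoom_lever(position: float, device_id: int, send) -> None:
        """Translate lever deflection into a zoom command.

        `position` is the lever reading in [-1.0, 1.0]; pushing the lever
        forward (positive) zooms in, pulling it back zooms out.  `send` is
        whatever function hands bytes to the wireless link.
        """
        if abs(position) < 0.1:          # dead zone: ignore lever noise
            return
        step = int(position * 100)       # scaled zoom step; the sign gives direction
        send(encode(device_id, "zoom_camera", step))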
Physical Space Targeting and Overlaid Graphics

[0167] One of the valuable features enabled by the methods and apparatus disclosed herein is the ability for a player of a computer game to target real physical locations or real physical objects or other real robotic devices by adjusting the position, orientation, or focus of a robotically controlled video camera within a real physical space such that an overlaid graphical image such as a graphical crosshair is positioned upon the video image of the location, object, or real robotic device. In one embodiment the method functions as follows: a video image of a remote physical space is received from a remote camera mounted upon the mobile toy vehicle 110, the direction and orientation of the camera dependent upon the direction and orientation of the mobile toy vehicle 110 with respect to real physical space as well as the direction and orientation of the camera with respect to the mobile toy vehicle 110. The video image from the remote camera is displayed upon the screen of the portable gaming system 130 for a user to view. A graphical image of a crosshair is drawn overlaid upon the video image, the graphical image of the crosshair being drawn at a fixed location upon the screen of the portable gaming system 130, for example at or near the center of the screen, as shown in FIG. 8 and FIG. 13 herein. The user presses buttons (or engages other manual controls) upon the portable gaming system 130, the particular buttons or other controls associated with a desired physical motion of the mobile toy vehicle 110. In response to the user button presses (or other manual control manipulations), the portable gaming system 130 sends control signals 150 to the mobile toy vehicle 110, causing the mobile toy vehicle 110 to move in position or orientation with respect to the real physical space by energizing appropriate motors within the vehicle. Meanwhile updated video images continue to be received by the portable gaming system 130 from the camera upon the mobile toy vehicle 110, the images displayed upon the screen of the portable gaming system 130. Also the graphical image of the crosshairs continues to be drawn overlaid upon the updated video image, the location of the crosshairs being drawn at the fixed location upon the screen of the portable gaming system 130. Because the crosshairs are displayed at a fixed location upon the screen while the video image is changing based upon the motion of the mobile toy vehicle 110, the player is given the sense that the crosshairs are moving about the real physical space (even though the crosshairs are really being displayed at a fixed location upon the screen of the portable gaming system 130). In this way a user can position the crosshairs at different locations or upon different objects within the remote space, thereby performing gaming actions. For example, by moving the position or orientation of the mobile toy vehicle 110 as described herein, a player can position the crosshairs upon a particular object within the real physical space. Then by pressing another particular button (or by adjusting some other particular manual control) upon the portable gaming system 130, the user identifies that object, selects that object, fires upon that object, or otherwise engages that object within the simulated gaming action. In this way the mobile camera affixed to the mobile toy vehicle 110, by sending images with changing perspective to the portable gaming system 130, the images combined by gaming software with overlaid graphical crosshairs, the graphical crosshairs drawn at a fixed location while the video image is changing in perspective with respect to the real physical space, allows the player to target, select, or otherwise engage a variety of real physical locations or real physical objects or other real physical mobile toy vehicles 110 while playing a simulated gaming scenario. This creates a combined on-screen off-screen gaming experience in which a user can use a portable gaming system 130 to move a real physical toy about a real physical space while engaging software-generated gaming actions relative to that real physical toy and that real physical space.
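By way of illustration only, the fixed-position crosshair logic described in paragraph [0167] might look like the sketch below. The frame representation, the Overlay class, and the bounding-box hit test are assumptions introduced for this example.

    from dataclasses import dataclass

    @dataclass
    class Overlay:
        width: int
        height: int

        def crosshair(self) -> tuple[int, int]:
            # The crosshair never moves on screen; the world appears to move
            # beneath it as the vehicle (and therefore the camera) is driven.
            return self.width // 2, self.height // 2

        def draw(self, frame):
            cx, cy = self.crosshair()
            # `frame` is assumed to be a mutable 2-D pixel buffer (rows of
            # pixel values); mark a simple cross centered on (cx, cy).
            for dx in range(-5, 6):
                frame[cy][cx + dx] = 255
            for dy in range(-5, 6):
                frame[cy + dy][cx] = 255
            return frame

        def targeted(self, objects) -> object | None:
            """Return the object whose screen bounding box contains the crosshair."""
            cx, cy = self.crosshair()
            for obj in objects:
                x0, y0, x1, y1 = obj["bbox"]
                if x0 <= cx <= x1 and y0 <= cy <= y1:
                    return obj
            return None

Pressing a fire or select button would then call targeted() and apply the simulated gaming action to whatever object, if any, lies under the crosshair.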
What is claimed is:

1. An apparatus for combined on-screen and off-screen user entertainment, said apparatus comprising:

a mobile toy vehicle that varies its position and orientation within the physical world in response to received control commands, the mobile toy vehicle including a drive system, an on-board camera, and a wireless communication link;

a portable gaming system running gaming software, the portable gaming system including a visual display, user input controls, and a wireless communication link, said portable gaming system operative to receive real-time camera data from said mobile toy vehicle over said communication link and display a representation of said camera data upon said visual display, said portable gaming system also operative to send control commands to said mobile toy vehicle over said communication link in response to user manipulation of said user input control; and

gaming software running upon said portable gaming system, said gaming software operative to monitor game play and provide the user with a simulated vehicle, the simulated vehicle combining the real-world functions and features of the mobile toy vehicle with simulated features and functions that are overlaid upon the visual display of the camera data and/or introduced into the control interface between the user and the mobile toy vehicle.

2. The apparatus as in claim 1, wherein the mobile toy vehicle further comprises:

a mock weapons system;

a software configurable vehicle computer control system;

wherein said software configurable vehicle computer control system operatively controls the drive system, the weapons system, the video camera, and the communications link.

3. The apparatus as in claim 1 wherein said drive system includes an electronically controlled motor that powers one or more wheels.

4. The apparatus as in claim 1 wherein the maximum speed of the drive system is limited by one or more simulated vehicle parameters maintained by the gaming software and affected by the status of game play.

5. An apparatus as in claim 4 wherein said one or more simulated vehicle parameters includes a simulated terrain parameter for the environment of the simulated vehicle.

6. The apparatus as in claim 1 wherein the mobile toy vehicle further comprises:

a vehicle location system;

wherein said vehicle location system is connected to a software configurable vehicle computer control system.

7. The apparatus as in claim 1 wherein the mobile toy vehicle further comprises:

a microphone;

wherein said microphone is connected to a software configurable vehicle computer control system.

8. The apparatus as in claim 1 wherein one or more display qualities of said camera data is modified in response to one or more simulated vehicle parameters maintained by the gaming software and affected by the status of game play.

9. The apparatus as in claim 8 wherein one of said one or more display qualities is the brightness of the display of said camera data.

10. An apparatus as in claim 9 wherein said one or more simulated vehicle parameters includes a simulated time of day parameter for the environment of the simulated vehicle.

11. An apparatus as in claim 9 wherein said one or more simulated vehicle parameters includes a simulated weather parameter for the environment of the simulated vehicle.

12. An apparatus as in claim 8 wherein said one or more simulated vehicle parameters includes a status parameter for a simulated shield of the simulated vehicle.

13. The apparatus as in claim 1 wherein the mobile toy vehicle further comprises a light, wherein said light is connected to a software configurable vehicle computer control system.

14. The apparatus as in claim 13 wherein the signal amplitude of the light is modified by the vehicle computer control system in response to one or more parameters maintained by the gaming software and affected by the status of game play.

15. The apparatus as in claim 6 wherein the vehicle location system includes one or more of a GPS sensor, a magnetometer, or an optical sensor.

16. The apparatus as in claim 1 wherein the gaming software is further operative to:

maintain a list of physical object images; and

maintain a list of virtual objects, with the virtual objects being identified with the physical object images, and with the virtual objects being displayed as overlays upon said video image data.

17. The apparatus as in claim 1 wherein the gaming software is further operative to display upon said screen a simulated ammunition level for the simulated vehicle.

18. The apparatus as in claim 1 wherein the gaming software is further operative to display upon said screen a simulated fuel and/or power level for the simulated vehicle.

19. The apparatus as in claim 1 wherein the gaming software is further operative to display upon said screen a simulated shield strength level for a simulated shield of the simulated vehicle, the simulated shield being operative to reduce the simulated damage imparted upon the simulated vehicle by certain

23. The method according to claim 22 wherein the mobile toy vehicle stops when hitting a simulated barrier.

24. The method according to claim 22 wherein the user's ability to control the mobile toy vehicle drive system and/or steering system is modified by a simulated terrain feature maintained by said portable gaming system.

25. The method according to claim 22 wherein the user's ability to control the mobile toy vehicle drive system and/or vehicle steering system is modified by a simulated fuel level and/or damage level maintained by said portable gaming system.

26. The method according to claim 22, wherein the portable gaming system emits a sound when said mobile toy vehicle has a real-world collision.

27. The method according to claim 22, wherein the mobile toy vehicle emits a sound based upon simulated gaming action determined by said portable gaming system.

28. The method according to claim 22 wherein the portable gaming system maintains and displays a score upon said screen, said score being based at least in part upon real-world actions of said mobile toy vehicle.

29. The method according to claim 22 wherein the score is modified based at least in part upon a measured time.

30. The method according to claim 22 wherein said portable gaming system is operative to display overlaid crosshairs upon said real-time camera image, said crosshairs showing the location within the real physical world at which a simulated weapon of said mobile toy vehicle is aimed.

31. The method according to claim 22 wherein the relative location of the mobile toy vehicle to the user of the portable gaming system is computed by:

reading the location sensor on the portable gaming system;

reading the location sensor on the mobile toy vehicle; and

computing the difference between the two values.

32. The method according to claim 31 wherein the relative location is graphically displayed on the screen.

33. The method according to claim 22 further comprising: recording the orientation and position of the mobile toy vehicle on a periodic basis.

34. The method according to claim 22 wherein the screen displays crosshairs over said real-time camera image, and the user identifies a real-world object using the crosshairs with manual interaction.

35. A method for an on-screen/off-screen gaming experience, said method comprising:

enabling a first user to control the position and orientation of a first mobile toy vehicle by manipulating manual input controls upon a first portable gaming system, said first portable gaming system communicating with said first mobile toy vehicle over a first wireless communication link;

enabling a second user to control the position and orientation of a second mobile toy vehicle by manipulating the manual input control upon a second portable gaming system, said second portable gaming system communicating with said second mobile toy vehicle over a second wireless communication link; and

enabling said first portable gaming system to exchange gaming information with said second portable gaming system over a third wireless communication link.

36. A method as recited in claim 35 wherein said first portable gaming system runs gaming software, said gaming software operative to moderate a simulated gaming experience that is updated at least in part based upon manual input provided by said first user through said manual input control of said first portable gaming system and upon gaming information received from said second portable gaming system over said third wireless communication link.
37. A method as recited in claim 36 wherein said second portable gaming system also runs gaming software, said gaming software operative to moderate a simulated gaming experience that is updated at least in part based upon manual input provided by said second user through said manual input control of said second portable gaming system and upon gaming information received from said first portable gaming system over said third wireless communication link.

38. A method as recited in claim 36 wherein said first user's ability to control the position of said first vehicle using the manual input control of said first portable gaming system is dependent at least in part upon one or more simulation parameters updated within said gaming software.

39. A method as recited in claim 38 wherein said one or more simulation parameters includes a simulated damage parameter.

40. A method as recited in claim 38 wherein said one or more simulation parameters includes a simulated terrain parameter.

41. A method as recited in claim 38 wherein said one or more simulation parameters includes a fuel level and/or power level parameter.

42. A method as recited in claim 35 wherein said first mobile toy vehicle includes a first camera mounted upon it and operative to capture image data, said image data transmitted to said first portable gaming system over a wireless communication link and displayed upon a display screen of said first portable gaming system.

43. A method as recited in claim 42 wherein said second mobile toy vehicle includes a second camera mounted upon it and operative to capture image data, said image data transmitted to said second portable gaming system over a wireless communication link and displayed upon a display screen of said second portable gaming system.

44. A method as recited in claim 35 wherein said first portable gaming system displays a score to said first user, said score based at least in part upon said gaming information received from said second portable gaming system over said third communication link.

45. A method as recited in claim 35 wherein said first portable gaming system displays status information related to said second mobile toy vehicle, said status information based at least in part upon said gaming information received from said second portable gaming system over said third communication link.