[FIG. 1 (Sheet 1 of 12): block diagram of a system 1000, showing light sensors 1300, processor(s) 1400, memory 1450, and an IMU 1500 with gyroscope(s) 1600, magnetometer(s) 1700, and accelerometer(s) 1800]
[FIG. 2A (Sheet 2 of 12): block diagram of a system 2000, showing processor 2100, power supply 2200, light sensors 1300, and IMU components]
[FIG. 2B (Sheet 3 of 12): block diagram of a system 2000, showing processor 2100, gyroscope(s) 1600, and device 2400]
[FIG. 2C (Sheet 4 of 12): block diagram of a system 2000, showing processor 2100 with memory 2150, microcontroller 2300 with memory 2350, power supplies 2200, IMU 1500, and devices 2400 and 2500]
[FIG. 3A (Sheet 5 of 12): schematic of an object 3000 with light sensors 1300 disposed on a removable faceplate 3200]
[FIG. 3B (Sheet 6 of 12): schematic of a band-like object 3000 with light sensors 1300]
[FIG. 4 (Sheet 7 of 12): flow chart of method 4000: Start; 4400 Detect Light at Sensor / Store Timing and/or Intensity Data; 4500 Time End of Sweep / Start of Sync Flash; End]
[FIG. 5 (Sheet 8 of 12): waveform of timing pulses received at a light sensor, with period T. FIG. 6: coordinate system of light sensor positions]
[FIG. 7 (Sheet 9 of 12): flow chart of method 7000: Start; 7100 Receive Light Sensor Data; 7500 Determine Position and/or Pose of Other Sensors; End]
[FIG. 8 (Sheet 10 of 12): flow chart of method 8000: Start through End]
[FIG. 9 (Sheet 11 of 12): flow chart of method 9000: Start through End]
[FIG. 10 (Sheet 12 of 12): block diagram of timing module 10000, showing a clock, counter(s) 10100, sync flash detector 10200, sensor bus decoder 10300, a sensor data bus, and a data/address bus]
SYSTEMS AND METHODS FOR POSITION AND POSE DETERMINATION AND TRACKING

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/029,414 filed Jul. 6, 2018, published as U.S. Patent Application Publication 2019/0012801, and now U.S. Pat. No. 10,460,469, which claims the benefit of U.S. Provisional Patent Application No. 62/530,058 filed on Jul. 7, 2017. The contents of all of the above identified applications, patents, and patent publications are hereby incorporated by reference herein in their entireties.

FIELD OF THE INVENTION

The present invention relates to the field of determining the position of and tracking objects using transmitted light. Specifically, the invention relates to position and pose determination and the tracking of objects, such as headsets and controllers for use in an environment (e.g., Virtual Reality (VR) and/or Augmented Reality (AR)), using the capture and/or derivation of light timing data.

BACKGROUND

Current solutions for position and pose determination and tracking of objects suffer from several deficiencies. For example, devices intended for a mobile environment are confined to 3 Degree-of-Freedom (3 DoF) tracking solutions. In other words, these systems are only able to track a user's motion through three axes of movement: yaw (normal axis), pitch (lateral axis), and roll (longitudinal axis) about a fixed position (i.e., there is no translation). Because these systems lack the 6 Degrees-of-Freedom (6 DoF) present in the physical world, developers using these systems are restricted in the level of immersion they can provide within their applications. In addition, these current solutions are only able to provide limited precision, which may be acceptable for gaming but is inadequate for certain simulations (e.g., surgical simulation), education, and other use cases where true immersion is required.

On the other hand, current 6 DoF positional tracking technology generally requires a powerful computer to derive and calculate the relative position of a user while inside a VR or AR environment. For example, this technology may consist of an external infrared lamp, which may be referred to as a "Base Station" or "Beacon." These Base Stations broadcast horizontal and vertical sweeps of infrared (IR) light into a physical space about 60 times a second. The IR light sweeps are received by IR sensors (e.g., diodes) integrated into objects within the physical space. The IR diodes interact with the IR sweeps and interpret them as timing data. The timing data from all of the IR diodes is transmitted via a tethered or wireless connection to an x86 (or similarly powerful) machine. The x86 machine then calculates the time differences between all the individual IR diodes (e.g., on the user's headset and input controllers) and can then calculate a sub-millimeter XYZ position for the user, 60 times a second. Thus, 6 DoF tracking has been restricted to high-powered and relatively non-mobile (tethered) solutions, such as x86-based architectures.

SUMMARY OF THE INVENTION

Accordingly, there is a need to provide an integrated mobile solution for determining position and pose as well as for tracking objects in a physical environment for use in VR/AR technology that will enable developers of mobile VR applications to increase user immersion. Moreover, this integrated mobile solution should provide 6 DoF capabilities as well as sub-millimeter precision without the need to tether to or employ an architecture such as the x86.

In addition, there is a need to provide robust position sensing/tracking in order to enhance the user experience in the AR/VR environment. For example, there is a need to reduce the effect of judder in the mapping to the AR/VR environment, which is caused by measurement noise in the position estimation within the physical environment. At the same time, there is a need for high responsiveness, such that the AR/VR environment is able to respond to the sudden motion of tracked objects.

Systems and methods for determining the position and pose of an object are provided. In certain embodiments, the object comprises at least four light sensors. The systems and methods derive angular position relative to an emitter for at least one of the four light sensors based on light received from the emitter at the at least one light sensor, the angular position including azimuth and elevation. The systems and methods also determine a number (N) of light sensors that are used to solve a system of equations using the derived angular position, the system of equations comprising at least (N) simultaneous equations, the solution of which provides estimates of the ranges of the (N) light sensors, the number (N) of light sensors being at least three. The systems and methods further determine which of the at least four light sensors are used to solve the system of equations and, using the system of equations, solve for a range of each of the (N) light sensors. The systems and methods also use a rigid-body transform and at least one of the solved-for ranges or the derived angular position to determine a rigid-body position for any of the four or more light sensors that were not used to solve the system of equations.

In certain embodiments, the number of the at least (N) simultaneous equations is equal to (N). In certain embodiments, in order to determine the number (N) of light sensors that are used to solve a system of equations, the systems and methods further observe a system state; determine how many of the at least four light sensors received light exceeding a threshold; or determine how many sensors are needed to cover a minimum area or volume on the object. In certain embodiments, the system states comprise at least one of: battery power, battery usage rate, CPU loading, I/O loading, temperature, and frame rate. In certain embodiments, in order to determine which of the at least four light sensors are used to solve the system of equations, the systems and methods further determine a maximum area or volume covered on the object by the (N) sensors; determine a time difference between when at least two of the at least four light sensors received light; determine which of the at least four light sensors received the most light; select from a group of previously used sensors; or use a state in a virtual reality/augmented reality environment.
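To further illustrate how a number (N) of sensors might be chosen from observed system state, a minimal sketch follows. The threshold values and state fields are illustrative assumptions, not values taken from this disclosure.

```python
# Hypothetical policy for choosing how many light sensors (N >= 3) to use
# when solving the system of equations, driven by observed system state
# (battery, CPU load, frame rate). All thresholds are illustrative.

def choose_sensor_count(state: dict, visible_sensors: int) -> int:
    n = visible_sensors                   # start from every sensor that saw light
    if state["battery_pct"] < 20 or state["cpu_load"] > 0.85:
        n = min(n, 4)                     # shed work when resources are constrained
    if state["frame_rate"] < 60:
        n = min(n, 6)
    return max(3, n)                      # never fewer than three sensors

print(choose_sensor_count(
    {"battery_pct": 55, "cpu_load": 0.90, "frame_rate": 72},
    visible_sensors=15))                  # prints 4
```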
In certain embodiments, the systems and methods further observe a system state; select a filter from an ordered filter lattice based on the observed system state; and update the selected filter with the solved-for range, derived angular position, or determined rigid-body position of one of the four or more light sensors. In certain embodiments, the systems and methods further receive inertial measurement unit data and, using a predetermined threshold, determine if the received inertial measurement unit data represents a motion event; and in response to determining that the received inertial measurement unit data represents a motion event, forward time project the output of the selected filter using the received inertial measurement unit data; and use the forward time projected output as an initial solution to the system of at least (N) simultaneous equations.

In certain embodiments, in order to derive angular position, the systems and methods further detect the start of a sync flash emitted by the emitter when a certain number of the at least four light sensors receive light within a given period of time; and receive timing data associated with one or more of the at least four light sensors that detect light during a sweep following the detected sync flash. In certain embodiments, in order to derive angular position, the systems and methods further determine an offset between the frequency of light received at the at least one of the four light sensors and a base frequency of light emitted by the emitter.

In certain embodiments, the object comprises a power source and a microcontroller coupled to the four or more light sensors, where the microcontroller is configured to transmit data associated with the at least one of the four light sensors that received light to the processor, and wherein the processor is located on a second device distinct from the object that is also wired or wirelessly coupled to the microcontroller. In certain embodiments, the object comprises a microcontroller coupled to the four or more light sensors, where the microcontroller is configured to transmit data associated with the at least one of the four light sensors that received light to the processor, and wherein the processor is located on a second device distinct from the object that is coupled to the object via a cable that supplies power to the four or more light sensors and/or the microcontroller.

BRIEF DESCRIPTION OF THE DRAWINGS

To facilitate further description of the embodiments, the following drawings are provided, in which like references are intended to refer to like or corresponding parts, and in which:

FIG. 1 is a block diagram illustrating a system in accordance with certain embodiments;
FIG. 2A is a block diagram illustrating a system in accordance with certain embodiments;
FIG. 2B is a block diagram illustrating a system in accordance with certain embodiments;
FIG. 2C is a block diagram illustrating a system in accordance with certain embodiments;
FIG. 3A is a schematic diagram illustrating an object in accordance with certain embodiments;
FIG. 3B is a schematic diagram illustrating an object in accordance with certain embodiments;
FIG. 4 is a flow diagram illustrating a method for determining azimuth and elevation for a light sensor in relation to a light source in accordance with certain embodiments;
FIG. 5 is a waveform illustrating timing pulses received at a light sensor in accordance with certain embodiments;
FIG. 6 is a coordinate system illustrating the position of light sensors in accordance with certain embodiments;
FIG. 7 is a flow diagram illustrating a method for determining position and pose for an object outfitted with an arbitrary number of light sensors in accordance with certain embodiments;
FIG. 8 is a flow diagram illustrating a method for filtering position and pose data using a filter lattice in accordance with certain embodiments;
FIG. 9 is a flow diagram illustrating a method for employing IMU data to counterbalance certain effects of filtering the position and pose estimates of a tracked object; and
FIG. 10 is a block diagram illustrating a system in accordance with certain embodiments.

DESCRIPTION OF THE INVENTION

FIG. 1 depicts an exemplary system 1000 for determining an object's (e.g., object 1100) position and pose and for tracking such objects within a physical space in accordance with certain embodiments. Such an object may be a VR/AR headset, controller, or generally any tool (such as a scalpel, paintbrush, or sword) that is capable of being manipulated or positioned in a physical environment. Determination of position and pose as well as tracking of such objects has applicability in the fields of gaming, procedure simulation/emulation, as well as VR/AR generally. For example, a positioned and tracked VR headset may be used to orient the camera in a VR/AR environment, while a tracked controller may allow interaction in the VR/AR environment. Additionally, a tracked scalpel may be used in medical simulations or training, while a tracked paintbrush may be used to create digital artwork or allow the precise re-creation of a piece originally created via the tracked object.

System 1000 may include one or more light (e.g., IR) emitters (1200). These emitters may be capable of performing one or more sweeps of a physical environment. In certain embodiments, these sweeps may be orthogonal to each other (e.g., vertical, horizontal, and/or diagonal sweeps). In certain embodiments, a light emitter may also be capable of transmitting a "sync flash," which is a wide area saturation of light. A sync flash may be used to indicate external timing and/or the timing (e.g., beginning/end) of another type of operation such as a sweep (horizontal/vertical/diagonal), which emits a more narrow band of light when compared to a sync flash. As an example, a Lighthouse base station developed by Valve may be used as an IR emitter.

In certain embodiments, the system includes one or more light sensors (1300) capable of detecting light emitted from one or more light emitters. As an example, the light sensors may be a Triad Semiconductor TS3633-CM1 Castellated Module. In certain embodiments, the light sensors are embedded within or on an object (e.g., object 1100) such as a VR headset, controller, or tool. In certain embodiments, the positions of the light sensors are fixed relative to each other. In these embodiments, a rigid-body transform may be used to determine the positions of the light sensors in the environment when less than all light sensors receive light.

In certain embodiments, the system includes one or more processors (1400) and a memory (1450). In certain embodiments, the processor is capable of receiving data from the one or more light sensors. For example, the processor may be capable of receiving information indicating that a light sensor may have received light emitted from a light emitter. In certain embodiments, by using a hardware interrupt, the processor may detect the transition of a light sensor from a high-to-low state. As an example, the processor may be an Nvidia Tegra, Qualcomm Snapdragon, PowerVR, or an Exynos ARM device. Light sensors may be wired or wirelessly connected to the one or more processors.

In certain embodiments, the system includes one or more inertial measurement units (IMU) (1500), which may be embedded within or on an object (e.g., object 1100). An IMU may include one or more gyroscopes (1600), magnetometers (1700), and/or accelerometers (1800), which may be disposed in one or more axes. For example, a tri-axial IMU may include three sets of gyroscopes, magnetometers, and/or accelerometers disposed in three orthogonal axes. Accordingly, the one or more processors may receive readings representing yaw rate, magnetic field strength, and/or acceleration in one or more axes from an IMU.
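The light sensor data and IMU readings described above can be pictured as simple records. The following sketch is illustrative; the field names and units are assumptions rather than structures defined by this disclosure.

```python
from dataclasses import dataclass

# Illustrative records for data the processor(s) may receive; names and
# units are hypothetical.

@dataclass
class PulseEvent:
    sensor_id: int       # which light sensor (e.g., one of sensors 1300)
    timestamp_us: int    # when the pulse edge was detected
    intensity: float     # brightness of the received light, if available

@dataclass
class ImuSample:
    gyro_dps: tuple      # yaw/pitch/roll rates, degrees per second
    mag_uT: tuple        # magnetic field strength per axis, microtesla
    accel_g: tuple       # acceleration per axis, in g

event = PulseEvent(sensor_id=3, timestamp_us=1250, intensity=0.8)
imu = ImuSample(gyro_dps=(0.1, 0.0, 2.4),
                mag_uT=(21.0, 4.2, 43.5),
                accel_g=(0.0, 0.01, 1.0))
```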
FIGS. 2A, 2B, and 2C depict various exemplary architectures of a system 2000 for determining an object's (e.g., object 1100 of FIG. 1) position and pose and for tracking such objects within a physical space in accordance with certain embodiments. These architectures may be used to determine an object's position (e.g., object 1100 of FIG. 1) using any of the methods discussed in accordance with FIGS. 4 and 7-9.

FIG. 2A depicts an embodiment of system 2000 using a single processor (2100) coupled to a memory (2150) for determining an object's position and pose and for tracking such objects within a space. System 2000 may also include a power source (2200) (e.g., a DC source such as a battery, AC source, or AC/DC converter), one or more light sensors (e.g., light sensors 1300 of FIG. 1), and an IMU (e.g., IMU 1500 of FIG. 1). For example, the system of FIG. 2A may be incorporated into a standalone VR headset.

FIG. 2B depicts an embodiment of system 2000 using multiple processors (e.g., processor (2100) coupled to a memory (2150) and/or microcontroller (2300) coupled to a memory (2350)) for determining an object's position and pose and for tracking such objects within a space. System 2000 may also include a power source (2200) (e.g., a DC source such as a battery, AC source, or AC/DC converter), one or more light sensors (e.g., light sensors 1300 of FIG. 1), and an IMU (e.g., IMU 1500 of FIG. 1). In certain embodiments, one or more processors (e.g., microcontroller 2300) may interface with light sensors to receive light sensor data. For example, microcontroller (2300) may be connected or coupled to one or more light sensors wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In turn, microcontroller (2300) may be coupled via wire (e.g., USB) or wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) to one or more secondary processors (e.g., processor 2100) for determining a position for and tracking objects using the sensor data. In certain embodiments, one processor (e.g., microcontroller 2300) and light sensors (e.g., light sensors 1300 of FIG. 1) are part of a first device (e.g., device 2400), while a second processor (e.g., processor 2100) and power source (2200) are part of a second device (e.g., device 2500). In certain embodiments, an IMU (e.g., IMU 1500 of FIG. 1) may be part of the first device and connected or coupled to microcontroller (2300) wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In certain embodiments, an IMU (e.g., IMU 1500 of FIG. 1) may be part of the second device and connected or coupled to processor (2100) wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In certain embodiments, an IMU (e.g., IMU 1500 of FIG. 1) may be part of both first and second devices. In certain embodiments, when IMUs are located on both first and second devices, the system may fuse together IMU measurement data received from both IMU sources or may select between using IMU measurement data from the first or second device based on a variety of factors. For example, IMU measurement data may be selected based on the relative accuracy of each IMU in the aggregate or of the individual components (e.g., gyroscope, magnetometer, accelerometer). In certain embodiments, IMU measurement data may be selected based on the relative power requirements of each IMU. In certain embodiments, the unselected (e.g., high power or lower accuracy) IMU is shut down (e.g., disconnected from its power source). First and second devices may also be connected or coupled to one another wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable).
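The selection between two IMU sources described above may be pictured as follows. This is a minimal sketch assuming simple per-IMU accuracy and power figures; the descriptors and the policy are illustrative, not the specific logic of this disclosure.

```python
# Minimal sketch of selecting between the first-device and second-device
# IMU based on relative accuracy or relative power draw; the unselected
# IMU could then be shut down. Numbers are illustrative.

def select_imu(imu_a: dict, imu_b: dict, low_power: bool):
    """Return (selected, unselected) from two IMU descriptors."""
    if low_power:
        key = lambda imu: imu["power_mw"]      # prefer the cheaper IMU
    else:
        key = lambda imu: -imu["accuracy"]     # prefer the more accurate IMU
    selected, unselected = sorted([imu_a, imu_b], key=key)
    return selected, unselected

imu_first = {"name": "first-device IMU", "accuracy": 0.9, "power_mw": 12.0}
imu_second = {"name": "second-device IMU", "accuracy": 0.7, "power_mw": 4.0}

chosen, idle = select_imu(imu_first, imu_second, low_power=True)
print("using", chosen["name"], "; shutting down", idle["name"])
```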
In certain embodiments, power is supplied to the microcontroller, light sensors, and/or IMU via a wired connection (e.g., USB) from another device such as a device (2500) including processor (2100) and power source (2200). In certain embodiments, the one or more processors (e.g., microcontroller 2300) which interface with the light sensors transmit the light sensor data received from the light sensors to the one or more processors (e.g., processor 2100) for determining the position of and tracking objects using the sensor data. In certain embodiments, the one or more processors (e.g., microcontroller 2300 or processor 2100) which interface with the IMU transmit the IMU measurement data received from the IMU to one or more other processors (e.g., processor 2100 or microcontroller 2300) for determining the position of and tracking objects using the sensor data. In certain embodiments, processor 2100 for determining position of and tracking objects may be included in a smartphone or other similar device. The smartphone may then be coupled to the light sensors and microcontroller (2300) via a USB connection.

To further illustrate, certain VR and/or AR devices (e.g., headsets) may use a connection to a standalone processor platform, which may be a mobile device, such as a mobile phone, phablet, or tablet. Here, the mobile device's processor (e.g., processor 2100, which may be an ARM processor) controls the user experience (e.g., provides display and calculates/tracks position of the AR/VR device). The standalone processor platform may be further integrated with one or more plates on which light sensors may be disposed. Light sensor/timing data may be transmitted to the mobile device's processor by a second processor (e.g., microcontroller 2300) which interfaces with light sensors (e.g., light sensors 1300 of FIG. 1) to receive light sensor data. Transmission of light sensor data to processor 2100 may occur over Wi-Fi, USB, etc. In this case, power for the light sensors and the interfacing light sensor processor/microcontroller may be supplied by the standalone processor platform via a connection provided by the standalone processor platform (e.g., USB).

FIG. 2C depicts an embodiment of system 2000 using multiple processors (e.g., processor (2100) coupled to a memory (2150) and/or microcontroller (2300) coupled to a memory (2350)) for determining an object's position and pose and for tracking such objects within a space. System 2000 may also include a power source (2200) (e.g., a DC source such as a battery, AC source, or AC/DC converter), one or more light sensors (e.g., light sensors 1300 of FIG. 1), and an IMU (e.g., IMU 1500 of FIG. 1). In certain embodiments, one or more processors (e.g., microcontroller 2300) may interface with light sensors to receive light sensor data. For example, microcontroller (2300) may be connected or coupled to one or more light sensors wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In turn, microcontroller (2300) may be coupled via wire (e.g., USB) or wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) to one or more secondary processors (e.g., processor 2100) for determining a position for and tracking objects using the sensor data. In certain embodiments, one processor (e.g., microcontroller 2300), power source (2200), and light sensors (e.g., light sensors 1300 of FIG. 1) are part of a first device (e.g., device 2400), while a second processor (e.g., processor 2100) and power source (2200) are part of a second device (e.g., device 2500). In certain embodiments, an IMU (e.g., IMU 1500 of FIG. 1) may be part of the first device and connected or coupled to microcontroller (2300) wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In certain embodiments, an IMU (e.g., IMU 1500 of FIG. 1) may be part of the second device and connected or coupled to processor (2100) wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In certain embodiments, an IMU (e.g., IMU 1500 of FIG. 1) may be part of both first and second devices. In certain embodiments, when IMUs are located on both first and second devices, the system may fuse together IMU measurement data received from both IMU sources or may select between using IMU measurement data from the first or second device based on a variety of factors. For example, IMU measurement data may be selected based on the relative accuracy of each IMU in the aggregate or of the individual components (e.g., gyroscope, magnetometer, accelerometer). In certain embodiments, IMU measurement data may be selected based on the relative power requirements of each IMU. In certain embodiments, the unselected (e.g., high power or lower accuracy) IMU is shut down (e.g., disconnected from its power source). First and second devices may also be connected or coupled to one another wirelessly (e.g., Wi-Fi, Zigbee, Bluetooth) or via a wired connection (e.g., USB, I²C, SPI, a connector, or other cable). In certain embodiments, the one or more processors (e.g., microcontroller 2300) which interface with the light sensors transmit the light sensor data received from the light sensors to the one or more processors (e.g., processor 2100) for determining the position of and tracking objects using the sensor data. In certain embodiments, the one or more processors (e.g., microcontroller 2300 or processor 2100) which interface with the IMU transmit the IMU measurement data received from the IMU to one or more other processors (e.g., processor 2100 or microcontroller 2300) for determining the position of and tracking objects using the sensor data. In certain embodiments, processor 2100 for determining position of and tracking objects may be included in a smartphone or other similar device. The smartphone may then be coupled to the light sensors and microprocessor (2300) via a USB connection.

FIG. 2C is an exemplary embodiment depicting how passive/ordinary devices (e.g., devices with no/insufficient onboard processing and/or power), such as a scalpel, may become track-able in a VR/AR environment. Here, the system includes a battery-powered processor that can be attached, along with an array of light sensors, to any passive device to enable that device to be tracked. For example, a microcontroller, light sensors, and/or battery may be coupled together and incorporated into/on the passive device. The microcontroller may communicate light sensor data to a second processor (e.g., ARM processor) via wire or wirelessly (Wi-Fi, Bluetooth, and/or RF). The second processor may then determine position and pose of as well as track the passive object using the received light data.

FIGS. 3A and 3B are exemplary embodiments of a system 3000 in accordance with certain embodiments of the invention. For example, the systems of FIGS. 3A and/or 3B may be used as an object 1100 in the system of FIG. 1. In addition, any of the processing architectures described in relation to FIGS. 2A, 2B, and/or 2C may be incorporated into the system of FIGS. 3A and/or 3B such that its position may be determined and tracked using any of the methods discussed in accordance with FIGS. 4 and 7-9. FIG. 3A depicts a system including an array of light sensors disposed in a plurality of planes. As depicted in FIG. 3A, light sensors (1300) may be disposed on a faceplate, top, bottom, sides of the headset, and/or one or more securing straps. Disposing sensors in more than one plane provides the ability for systems (e.g., system of FIG. 1) and methods (e.g., methods of FIGS. 4 and 7-9) to more accurately determine position using a single light emitter even when one or more sensors may be occluded. In certain embodiments, as shown in FIG. 3A, sensors (and processing architectures) may be incorporated as part of a removable faceplate (3200), which can be further attached and secured to a pre-existing VR/AR headset to add improved position and pose determination as well as tracking. In certain embodiments, faceplate 3200 may include various slits, fans, or louvres to improve ventilation and cooling. In certain embodiments, as shown in FIG. 3B, sensors (and processing architectures) may be incorporated into a band-like device (e.g., headband, wristband, armband).

FIG. 4 is an exemplary method 4000 for determining position of an object in accordance with certain embodiments. The method of FIG. 4 may be used in accordance with any of the systems described above with respect to FIGS. 1, 2A, 2B, 2C, 3A, and/or 3B. For example, processors 1400 of FIG. 1 and/or processor 2100 and/or microcontroller 2300 of FIGS. 2A, 2B, 2C may be configured to perform the steps of the method of FIG. 4 described herein. In certain embodiments, one or more processors (e.g., processor 2100 or microcontroller 2300 of FIGS. 2A, 2B, 2C) are used to determine a type of operation (e.g., sync flash or sweep) being performed by one or more light emitters (e.g., light emitter 1200 of FIG. 1). In step 4100, the presence of a sync flash may be detected. In certain embodiments, a sync flash is a wide area saturation of light. In certain embodiments, a sync flash is determined to have occurred when a certain number (e.g., 3, 8, 12, or more) or a certain proportion of light sensors attached to an object detect a light transmission simultaneously or within a certain time period. In certain embodiments, a sync flash is detected based upon how long one or more sensors is illuminated. For example, a sync flash may be determined when one or more light sensors is illuminated for longer or shorter than a given time period. In certain embodiments, a sync flash is detected based on the level of illumination (brightness) of light received at one or more light sensors. For example, if a sync flash emission is brighter than a sweep, a sensor receiving such higher amplitude light may be indicative of a sync flash.

In certain embodiments, detection of a sync flash is performed by disabling interrupts and polling each light sensor. Polling of light sensors may be performed serially and/or in parallel. For example, a number of n-bit sensors may be connected to an N-bit data bus of a processor such that N/n sensors may be polled simultaneously. As a specific example, 32 1-bit sensors may be polled simultaneously with a 32-bit bus. In this example, detection of a sync flash may be accomplished by counting the number of bits read from the data bus which indicate light reception by a light sensor, as sketched below. In certain embodiments, a sync flash is detected by determining how long one or more bits on the data bus remain indicative of light being received at the associated sensor. In certain embodiments, detection of a light pulse will trigger an interrupt causing a processor to determine if a sync pulse has been detected.
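The 32-bit bus example above can be sketched directly: read one word from the data bus and count how many bits indicate light reception. The threshold and the sample words are illustrative.

```python
# Sketch of the bus-polling example: 32 one-bit sensors on a 32-bit data
# bus, with a sync flash declared when enough bits indicate light at once.

SYNC_FLASH_MIN_SENSORS = 12      # e.g., 3, 8, 12, or more

def is_sync_flash(bus_word: int) -> bool:
    """Count set bits in one 32-bit bus read ('1' = light received)."""
    return bin(bus_word & 0xFFFFFFFF).count("1") >= SYNC_FLASH_MIN_SENSORS

print(is_sync_flash(0x0000FFFF))   # 16 sensors lit -> True
print(is_sync_flash(0x00000007))   # 3 sensors lit -> False at this threshold
```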
In step 4200, the end of a sync flash may be determined. In certain embodiments, the end of a sync flash is detected when, following the detection of a sync flash, fewer than a certain number (e.g., 12, 8, 3, or less) or less than a certain proportion of light sensors detect a light transmission simultaneously or within a certain time period. In step 4300, the beginning of a light sweep (e.g., vertical, horizontal, diagonal) of a light emitter may be timed. In certain embodiments, a sweep is a narrow band of light when compared to a sync flash. Timing of the sweep may occur following the detection of the beginning or the end of a sync flash. In certain embodiments, a processor may re-enable interrupts and begin waiting for sweep completion.

In step 4400, light may be detected at one or more light sensors. In response to light detection, timing information may be stored. For example, the time between the beginning of the sweep and the detection of light may be stored in a memory. In certain embodiments, timing information is used to derive angular position (azimuth (β) and elevation (θ)) for the light sensor relative to the emitter. In certain embodiments, light intensity data indicating the brightness of the received light is also stored. Light intensity data (along with timing data) may be used to differentiate (reject) reflected and/or multipath signals from light received directly. In certain embodiments, the angular position (e.g., rotation degree/offset) of a light sweep (e.g., vertical, horizontal, diagonal) may be encoded in the light transmission itself (e.g., by modulating the light pulse). By encoding sweep position in the light transmission itself, in certain embodiments, a sync flash may no longer be required. In certain embodiments, position information may be encoded in the emitted light itself by offsetting the frequency of the emitted light and associating the frequency offset with an angular position. For example, as a sweep progresses from 0 to 360 degrees, the frequency of light emitted by the emitter may also be increased or decreased in accordance with the progress of the sweep from 0 to 360 degrees. Accordingly, angular position (azimuth (β) and elevation (θ)) for the light sensor relative to the emitter may be derived based on the determination of an offset between the frequency of the light received at the sensor and a base frequency of light emitted by the emitter.

In step 4500, the end of a light sweep or beginning of a sync flash may be timed. For example, a processor may use internal processor timing to detect the expected end of a sweep or the expected beginning of the next sync flash. In certain embodiments, a processor may disable interrupts and once again begin polling light sensors for a sync flash.

Using the exemplary systems of FIGS. 1, 2A, 2B, 2C, 3A, and/or 3B as well as the methods of FIG. 4, it is possible to determine the position and pose of as well as track objects in a physical environment. This position and pose data may then be used in an AR/VR environment. Position/pose of the object is determined based upon streams of light sensor data received from one or more light sensors. Prior solutions for determining position of tracked objects using light emitters and sensors have been previously described in Islam et al., "Indoor Positional Tracking Using Dual-Axis Rotating Laser Sweeps," IEEE International Instrumentation and Measurement Technology Conference Proceedings, pp. 1315-21, Taipei, Taiwan (2016).

FIG. 5 depicts an exemplary pulse train emitted from a light emitter (e.g., light emitter 1200 of FIG. 1) and received by a light sensor (e.g., light sensor 1300 of FIG. 1) present on a tracked object (e.g., object 1100 of FIG. 1). In FIG. 5, pulses P1 and P2 are sync pulses emitted by a light emitter base station. These sync pulses may be emitted when the azimuth (horizontal) angle is 0° for a vertical sweep. Barring occlusion, these pulses should be received by all light sensors on the tracked object. Pulse P3 is received by a light sensor when the vertical sweep passes it. The time between the rising edges of pulses P1 and P3 provides a direct measurement of the azimuth angle (β) of the light sensor in the target constellation, according to the following equation, where T is the period between scans:

Azimuth, β = ((time(P3) − time(P1)) / T) × 360°

In FIG. 5, pulses P4 and P5 are also sync pulses emitted by a light emitter base station. These sync pulses may be emitted when the elevation (vertical) angle is 0° for a horizontal sweep. Barring occlusion, these pulses should be received by all light sensors on the tracked object. Pulse P6 is received by a light sensor when the horizontal laser line passes it. The time between the rising edges of pulses P4 and P6 provides a direct measurement of the elevation angle (θ) of the light sensor in the target constellation, according to the following equation:

Elevation, θ = ((time(P6) − time(P4)) / T) × 360°

In FIG. 5, pulses P7 and P8 are similar to P1 and P2 and indicate the start of a new revolution. Generally, pulses P1 and P2 are emitted from the base station in such a way that the laser line pulse P3 does not interfere with pulse P1 or P2. Likewise, pulses P4 and P5 are sent out from the base station in such a way that the laser line pulse P6 does not interfere with P4 or P5.

According to the spherical coordinate system (range (R), azimuth (β), and elevation (θ)), light sensors A, B, and C will be located at coordinates (R_A, θ1, β1), (R_B, θ2, β2), and (R_C, θ3, β3). FIG. 6 depicts the spherical coordinate system and the relationship amongst sensors A, B, and C. As discussed above, azimuth and elevation are measured using the base station and received light signals. The ranges of the sensors may be solved for via a system of non-linear equations:

f(R_A, R_B) = R_A² + R_B² − 2·R_A·R_B·cos α_AB − AB² = 0
f(R_B, R_C) = R_B² + R_C² − 2·R_B·R_C·cos α_BC − BC² = 0
f(R_A, R_C) = R_A² + R_C² − 2·R_A·R_C·cos α_AC − AC² = 0

where

cos α_AB = sin β1 cos θ1 sin β2 cos θ2 + sin β1 sin θ1 sin β2 sin θ2 + cos β1 cos β2
cos α_BC = sin β2 cos θ2 sin β3 cos θ3 + sin β2 sin θ2 sin β3 sin θ3 + cos β2 cos β3
cos α_AC = sin β1 cos θ1 sin β3 cos θ3 + sin β1 sin θ1 sin β3 sin θ3 + cos β1 cos β3

When the lengths of the sides AB, BC, and AC are known in advance, the system of equations can be solved using a root finding method (e.g., Newton's root finding method). The lengths of the sides AB, BC, and AC may be known in advance when the light sensors are fixed relative to each other, or can otherwise be determined via a measurement. For example, when two or more light sensors are on a flexible or spooled band, a potentiometer, spring, or similar tool could be used to measure the distance between them.
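The computation above can be sketched end to end. In the sketch below, synthetic sensor positions stand in for measured pulse timings (in practice β and θ would come from the P1/P3 and P4/P6 timing formulas), and SciPy's general-purpose root finder stands in for the Newton-style method named above; all values are illustrative.

```python
import numpy as np
from scipy.optimize import fsolve

# Sketch of the FIG. 5/6 math: the three law-of-cosines equations are
# solved for the ranges R_A, R_B, R_C. The spherical convention matches
# the cos(alpha) expansion above (beta from the pole, theta around it).

def spherical(p):                      # point -> (range R, beta, theta)
    R = np.linalg.norm(p)
    return R, np.arccos(p[2] / R), np.arctan2(p[1], p[0])

def cos_alpha(b1, t1, b2, t2):         # angle between two sensor directions
    return (np.sin(b1)*np.cos(t1)*np.sin(b2)*np.cos(t2)
            + np.sin(b1)*np.sin(t1)*np.sin(b2)*np.sin(t2)
            + np.cos(b1)*np.cos(b2))

# Three sensors on a rigid object, roughly 2 m from the emitter at origin.
A, B, C = (np.array([1.90, 0.30, 0.50]),
           np.array([2.00, 0.10, 0.60]),
           np.array([1.95, 0.20, 0.40]))
(Ra, b1, t1), (Rb, b2, t2), (Rc, b3, t3) = map(spherical, (A, B, C))
AB, BC, AC = (np.linalg.norm(A - B), np.linalg.norm(B - C),
              np.linalg.norm(A - C))
cAB, cBC, cAC = (cos_alpha(b1, t1, b2, t2), cos_alpha(b2, t2, b3, t3),
                 cos_alpha(b1, t1, b3, t3))

def f(R):                              # the three equations above
    RA, RB, RC = R
    return [RA**2 + RB**2 - 2*RA*RB*cAB - AB**2,
            RB**2 + RC**2 - 2*RB*RC*cBC - BC**2,
            RA**2 + RC**2 - 2*RA*RC*cAC - AC**2]

# A rough, unequal initial guess; the text below suggests seeding the
# root finder with a linear approximation or a prior solution.
print(fsolve(f, x0=[1.5, 2.5, 2.0]))   # approximately [Ra, Rb, Rc]
print(Ra, Rb, Rc)                      # ground truth for comparison
```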
FIG. 7 is an exemplary method 7000 for determining position of an object on which more than three light sensors are disposed in accordance with certain embodiments. The method of FIG. 7 may be used in accordance with any of the systems described above with respect to FIGS. 1, 2A, 2B, 2C, 3A, and/or 3B as well as the method of FIG. 4. For example, processors 1400 of FIG. 1 and/or processor 2100 and/or microcontroller 2300 of FIGS. 2A, 2B, 2C may be configured to perform the steps of the method of FIG. 7 described herein. Any number of sensors may be fitted to a tracked object, thus reducing the effects of possible occlusion, while still having enough sensors to accurately determine position. Generally, it is possible to solve for the ranges of any number of sensors. For example, a VR/AR headset may be outfitted with 20 or more light sensors. In certain embodiments, multi-planar light sensor placement along with the linear-system property discussed herein may be used to heuristically limit the number of iterations in the root finding method used to solve for light sensor ranges.

In step 7100, timing and/or light intensity data may be received for one or more light sensors. In certain embodiments, light timing and/or intensity data was previously stored as part of the methods of FIG. 4. In step 7200, the received light sensor data may be associated with one or more light sensors. In certain embodiments, light sensor data may be filtered as part of step 7200 or other steps.

In step 7300, a determination is made as to the number of light sensors (for which data was received) that are to be used to solve a system of equations for determining the object's position and pose. For example, an object may be outfitted with twenty light sensors, and light sensor data may be received for any number of those sensors, such as fifteen. In this case, position and pose of the object may be determined using systems of equations using any number of sensors between three and fifteen. In certain embodiments, the number of light sensors used to solve the system of equations is based upon various observed system states. Examples of such system states are: remaining battery power, battery usage rate, loading (e.g., CPU, I/O, etc.), temperature, frame rate, or other application/user settings. In certain embodiments, the n-brightest sensors based on light intensity data are used to solve the system of equations. For example, the n-brightest sensors may be those n-sensors that received light exceeding a threshold. In certain embodiments, the number of sensors used is based on covering a minimum area/volume framed by the physical placement of the light sensors on the object. For example, if light data is received for fifteen of twenty sensors disposed on an object, the number of sensors used to solve the system of equations may be the minimum number of sensors which cover at least a determined number of square or cubic inches. In embodiments where sensor geometry is known in advance, mappings between sensor combinations and covered surface area may be computed and stored in advance. In other embodiments, covered area or volume for sensor combinations may be computed as needed based on a known geometry. In certain embodiments, the previous examples may be combined and any of the factors weighted to arrive at an appropriate number of sensors to use in solving the system of equations.

In step 7400, a determination may be made as to which of n-sensors to use in solving a system of equations for determining the object's position and pose. For example, in step 7300, it may have been determined to use four sensors out of a possible fifteen sensors for which light sensor data was received to determine position and pose of the object. In certain embodiments, it may be desirable to select the sensors in order to maximize/minimize the area or volume covered by the n (in this case 4) sensors used to solve the system of equations. In certain embodiments, sensors may be selected based on their planar orthogonality. For example, n-sensors may be selected based on maximizing or minimizing the number of planes covered by the sensors. In certain embodiments, the n-sensors may be selected based on the relative time differences of when they received light. For example, the n-sensors may be chosen to maximize or minimize the timing difference amongst them. To further illustrate, in certain embodiments, a sensor which received light the earliest during a sweep and the sensor which received light the latest during a sweep may be used to solve the system of equations. In certain embodiments, selection of which of n-sensors to use for solving the system of equations is based on which sensors were previously used to solve the system. For example, if sensors A, B, and C were used as part of a previous solution, any combination (e.g., one, two, all) of A, B, C may be selected to solve the system of equations. To further illustrate, if four sensors are being used to solve the current system of equations, sensors A, B, C, E may be used as part of a solution or sensors B, C, E, F may be used instead. In certain embodiments, the previous examples may be combined and any of the factors weighted to arrive at a grouping of which sensors to use in solving the system of equations. In certain embodiments, selection of which of n-sensors to use for solving the system of equations is based on a state in a VR/AR environment (e.g., position of a camera, location of a point of interest, setting, etc.).

In step 7500, the positions and/or pose of the other sensors (e.g., those sensors not used to solve the system of equations) is determined. For example, a rigid body transform may be used to locate the other sensors based on the known or measured geometry of the sensors disposed on the object as well as the positions for the sensors as determined by the solution to the system of equations, as sketched below.
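One common way to realize such a rigid-body transform is a least-squares (Kabsch/SVD) fit, sketched below on synthetic data; the Kabsch method is a standard choice here and not necessarily the specific transform used by this disclosure.

```python
import numpy as np

# Sketch of step 7500: fit a rigid-body transform (Kabsch/SVD) mapping the
# known body-frame sensor geometry onto the solved sensor positions, then
# place a sensor that was not part of the solution. Data are synthetic.

def rigid_transform(model_pts, world_pts):
    """Least-squares R, t such that world ~= R @ model + t."""
    mc, wc = model_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (model_pts - mc).T @ (world_pts - wc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, wc - R @ mc

# Body-frame sensor geometry, known or measured in advance.
model = np.array([[0.00, 0.00, 0.00],
                  [0.10, 0.00, 0.00],
                  [0.00, 0.08, 0.00],
                  [0.05, 0.04, 0.06]])    # sensor D, not solved this sweep

# Synthetic ground-truth pose used to fabricate solved positions for A-C.
yaw = np.radians(20.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([1.9, 0.3, 0.5])
solved_world = model[:3] @ R_true.T + t_true   # ranges solved for A, B, C

R, t = rigid_transform(model[:3], solved_world)
print(R @ model[3] + t)               # estimated position of sensor D
print(R_true @ model[3] + t_true)     # ground truth; the two match
```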
As discussed above, the ranges of the various sensors represent the unknown variables, the light sensor timing data represents the azimuth and elevation of the sensors, while the sensor geometry describes the number of simultaneous equations. For example, in the case of four sensors there are four unknown ranges, one for each sensor, and a corresponding set of six equations. The six equations include the three above for f(R_A, R_B), f(R_B, R_C), f(R_A, R_C) as well as three equations which include the geometry for the fourth sensor (D) relative to the other three: f(R_D, R_A), f(R_D, R_B), f(R_D, R_C). Accordingly, in certain embodiments, less than the entire number of simultaneous equations is used. For example, in addition to f(R_A, R_B), f(R_B, R_C), f(R_A, R_C), only one of f(R_D, R_A), f(R_D, R_B), f(R_D, R_C) may be used (solved for). Other combinations are acceptable. Using only one of f(R_D, R_A), f(R_D, R_B), f(R_D, R_C) makes the number of variables equal to the number of equations. Similarly, for each additional sensor only one additional equation may be solved for (e.g., 8 sensors, 8 equations describing their geometry). In certain embodiments, a pseudo-inverse is used to solve for the system of linear or non-linear equations. For example, when using an overdetermined system (more equations than unknowns), a pseudoinverse may be used as a part of a root-finding method used to solve the system of equations.

In certain embodiments, the computational complexity of solving for the non-linear system can be reduced by solving a linear system approximating the solution to the non-linear system. In certain embodiments, azimuth and elevation measurements derived from three or more identified light sensors are used to solve the linear system. The linear solution may then be applied as initial values for the solution to the non-linear system (e.g., as initial values for the root finding method). When starting from a linear solution, in certain embodiments, when the non-linear system converges quickly (i.e., within a small number of iterations for a given tolerance), the linear solution may be returned as a solution to the non-linear system. In certain other embodiments, a solution provided by an iteration of the non-linear root finding method may be used. In certain embodiments, a prior solution, filtered output (discussed below in relation to FIG. 8), and/or predicted output (discussed below in relation to FIG. 9), or a weighted combination thereof, may be applied as initial values for the solution to the non-linear system (e.g., as initial values for the root finding method).

Solving the system of equations as discussed above provides an accurate estimate of the tracked object's position (range, azimuth, elevation). The position estimates, however, are generally subject to noise. Such noise in the tracked object's position may show up as judder in an AR/VR environment. Simple low-pass filtering of the tracked object's position may be used to reduce judder; however, this increases the system's response times, correspondingly reduces immersion, and may lead to user sickness.

FIG. 8 depicts an exemplary method 8000 for reducing judder while maintaining system response times in accordance with certain embodiments. The method of FIG. 8 may be used in accordance with any of the systems described above with respect to FIGS. 1, 2A, 2B, 2C, 3A, and/or 3B as well as the methods of FIGS. 4 and 7. For example, processors 1400 of FIG. 1 and/or processor 2100 and/or microcontroller 2300 of FIGS. 2A, 2B, 2C may be configured to perform the steps of the method of FIG. 8 described herein. In certain embodiments, a hierarchy of filters is employed in order to smooth the estimate of the tracked object's position. In step 8100, a hierarchy of filters may be arranged. For example, the hierarchy of filters may be arranged based on output quality and/or computational cost. In certain embodiments, the ordering may result in a totally ordered lattice, where F = {f1, f2, ..., fn} is the set of filters and < is a total order over F.

In step 8200, one or more system states may be observed and a filter may be selected. For example, if fi < fj, then the adaptive filtering system may transition from filter fi to filter fj on the occurrence of an observation threshold. An observation threshold is a tuple of values related to various system states. For example, system states may be related to: remaining battery power, battery usage rate, loading (e.g., CPU, I/O, etc.), temperature, frame rate, or other application/user settings. In certain embodiments, when one or more threshold values is exceeded, the system transitions from filter fi to filter fj. However, when no or only limited/certain threshold values are exceeded, the system may choose to use the highest-quality filter available. In certain embodiments, system states are individually weighted, such that the chosen filter will be based on the weighted summation of the various observed system states. Any suitable filter may be used for adaptive filtering (e.g., Kalman, Chebyshev, Butterworth, moving average, etc.). In step 8300, the selected filter may be updated with one or more measurement values. For example, sensor position and pose estimates previously determined by a solution to a system of equations as discussed above with respect to FIG. 7 may be used to update filtered position estimates for one or more sensors on an object (e.g., object 1100 of FIG. 1).

To further illustrate, consider a moving average filter and a Kalman filter which are both run over the same raw data input. The Kalman filter is perceived to be of higher quality as it removes more visual vibrations from the user's AR/VR experience. However, the Kalman filter requires significantly more floating point operations than the moving average filter. For example, if CPU usage drops below 25%, then the system may use the higher-quality Kalman filter. However, if frame rate decreases below a threshold when using the Kalman filter, then the system may prefer the computationally cheaper moving average filter. In certain embodiments, switching amongst filters is supported by maintaining a buffer of previous filtered positions that may be accessed by the next chosen filter. In certain embodiments, a filter is maintained for each sensor on the object. In certain embodiments, a filter is maintained for a maximum number of sensors that is less than the number of sensors on the object. For example, the maximum number of filters may be based on the maximum number of sensors for which a system of equations may be solved for as discussed above with respect to the method of FIG. 4. In certain embodiments, the number of filters corresponds to the number of sensors for which a system of equations was solved for as discussed with respect to the method of FIG. 4.

FIG. 9 depicts an exemplary method 9000 for counterbalancing certain effects of filtering the position and pose of a tracked object. The method of FIG. 9 may be used in accordance with any of the systems described above with respect to FIGS. 1, 2A, 2B, 2C, 3A, and/or 3B as well as the methods of FIGS. 4, 7, and 8. For example, processors 1400 of FIG. 1 and/or processor 2100 and/or microcontroller 2300 of FIGS. 2A, 2B, 2C may be configured to perform the steps of the method of FIG. 9 described herein. In certain embodiments, accelerometer, magnetometer, and/or gyroscopic (IMU) data streams may be employed to counterbalance any over-smoothing effects of the judder smoothing filters discussed above. In certain embodiments, use of IMU data allows the AR/VR system to maintain responsiveness and maintain or increase immersion. In step 9100, one or more sets of IMU measurement data may be received. In step 9200, a determination may be made as to whether the received IMU data indicates a motion event. For example, when the received IMU data exceeds certain thresholds (in magnitude) which indicate fast movement, the IMU data may be used instead of smoothed position/pose data. In certain embodiments, IMU and smoothed/filtered position sensor data may be weighted and combined based on the magnitude of the received IMU data. In certain embodiments, the choice of observation threshold values and/or weights is configurable by the application programmer or user. In step 9300, filtered/smoothed sensor position data and/or IMU data may be manipulated (e.g., time-forward projected) and fed back as an initial solution to the system of non-linear equations discussed above.
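Combining the filter lattice of FIG. 8 with the motion gate of FIG. 9 might look like the sketch below. The two filters, the state fields, and all thresholds are illustrative assumptions; an exponential smoother stands in for a costlier filter such as the Kalman filter discussed above.

```python
from collections import deque
import numpy as np

# Illustrative sketch: an ordered pair of filters selected from observed
# system state (FIG. 8), plus an IMU-magnitude gate that bypasses the
# smoothing on fast motion (FIG. 9).

class MovingAverage:                     # cheap, lower-quality filter
    def __init__(self, n=8):
        self.buf = deque(maxlen=n)
    def update(self, x):
        self.buf.append(np.asarray(x))
        return np.mean(self.buf, axis=0)

class ExpSmoother:                       # stand-in for a costlier filter
    def __init__(self, a=0.3):
        self.a, self.y = a, None
    def update(self, x):
        x = np.asarray(x)
        self.y = x if self.y is None else self.a * x + (1 - self.a) * self.y
        return self.y

LATTICE = [MovingAverage(), ExpSmoother()]   # ordered lattice: f1 < f2

def choose_filter(state: dict):
    # Use the highest-quality filter unless an observation threshold trips.
    if state["cpu_load"] > 0.75 or state["frame_rate"] < 60:
        return LATTICE[0]
    return LATTICE[1]

def step(raw_pos, imu_magnitude, state, motion_threshold=4.0):
    if imu_magnitude > motion_threshold:     # motion event: favor raw data
        return np.asarray(raw_pos)
    return choose_filter(state).update(raw_pos)

print(step([2.0, 0.3, 0.5], imu_magnitude=0.2,
           state={"cpu_load": 0.4, "frame_rate": 72}))
```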
US 11,010,917 B2
15 16
In certain embodiments, a sensor data bus represents the combined n-bit signals from N or more light sensors. In certain embodiments, the timing module includes a sync flash detector 10200, which is used to detect the presence of a sync flash emitted from a light emitter. In certain embodiments, a sync flash is determined to have occurred when a certain number (e.g., 3, 8, 12, or more) or a certain proportion of the N light sensors connected to the sensor data bus detect a light transmission simultaneously or within a certain time period. For example, in the case of 1-bit light sensors, a sync flash may be detected when three or more light sensors indicate a bit value consistent with the presence of the detection of light (e.g., '0' or '1'). In the case of n-bit light sensors, a sync flash may be detected when three or more light sensors indicate a value that exceeds or falls below a threshold consistent with the presence of light detection. In certain embodiments, when a particular light sensor detects light, the current value of a counter is stored in a memory 10400 in a location associated with the light sensor. In certain embodiments, in order to capture light duration, when a particular light sensor stops detecting light following the detection of light, the current value of a counter is stored in a memory 10400 in a location associated with the light sensor. In certain embodiments, a sync flash is detected when the difference between counter values for one or more sensors exceeds a threshold.
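By way of non-limiting illustration, the quorum rule described above might be sketched as follows; the 16-sensor bus width, the three-sensor quorum, and the n-bit intensity threshold are assumed values:

    #include <stdint.h>
    #include <stdbool.h>

    #define N_SENSORS 16
    #define QUORUM     3    /* assumed: at least 3 sensors must agree  */
    #define LIGHT_LVL  200  /* assumed n-bit intensity threshold       */

    /* 1-bit sensors: each bit of the bus word is one sensor. A sync
     * flash is flagged when at least QUORUM bits indicate light. */
    bool sync_flash_1bit(uint16_t sensor_bus) {
        int lit = 0;
        for (int i = 0; i < N_SENSORS; i++)
            lit += (sensor_bus >> i) & 1u;
        return lit >= QUORUM;
    }

    /* n-bit sensors: one sample per sensor; light is "present" when
     * the value exceeds a threshold consistent with light detection. */
    bool sync_flash_nbit(const uint8_t samples[N_SENSORS]) {
        int lit = 0;
        for (int i = 0; i < N_SENSORS; i++)
            if (samples[i] > LIGHT_LVL) lit++;
        return lit >= QUORUM;
    }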
Once a sync flash is detected, the sync flash detector may output a sync flash signal. In certain embodiments, a sync flash signal is provided from a processor (e.g., processors 1400 of FIG. 1 and/or processor 2100 and/or microcontroller 2300 of FIGS. 2A, 2B, 2C) when it detects a sync flash. In certain other embodiments, the sync flash signal is provided to a processor to indicate that a sync flash was detected by timing module 10000.
In certain embodiments, counter(s) 10100 receive a sync flash signal, which causes the counter(s) to return to a known value (e.g., reset). In certain embodiments, counter(s) or memory receive a sync flash signal, which causes the counter to store its current count value in a memory. In certain embodiments, for example when counters are shared amongst two or more light sensors, during a light sweep following a sync flash, sensor bus decoder 10300 determines which of any of the N light sensors connected to the sensor data bus have received light. In certain embodiments, when a particular light sensor detects light, the current value of a counter is stored in a memory 10400 in a location associated with the light sensor. In certain embodiments, in order to capture light duration, when a particular light sensor stops detecting light following the detection of light, the current value of a counter is stored in a memory 10400 in a location associated with the light sensor. In certain embodiments, timing module 10000 includes logic for correlating a light sensor with its timing data indicated by a counter value. In certain embodiments, for example when memory locations in memory are not associated with a particular light sensor (e.g., in a FIFO), the identity of the light sensor receiving light is also stored in a memory alongside, or pointing to, its associated counter/timing data. In certain embodiments, memory receives a sync flash signal, which may cause the invalidation of its contents (e.g., writing an invalid bit or an invalid count value).
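By way of non-limiting illustration, the latching and invalidation behavior described above might be sketched as follows; the record layout, FIFO depth, reset value, and edge-driven software interface are assumptions (in practice this would likely be register-level hardware):

    #include <stdint.h>
    #include <stdbool.h>

    #define FIFO_DEPTH 64

    /* One timestamp record: the sensor identity is stored alongside
     * its counter value, as in the FIFO arrangement described above. */
    typedef struct {
        uint8_t  sensor_id;
        bool     rising;   /* light began (true) or ended (false) */
        uint32_t count;    /* free-running counter at the event   */
    } timing_record;

    static timing_record fifo[FIFO_DEPTH];
    static uint32_t head;

    /* Latch the counter whenever a sensor's light state changes.
     * Capturing both edges yields light duration as well as onset. */
    void on_sensor_edge(uint8_t sensor_id, bool rising,
                        uint32_t counter_now) {
        if (head < FIFO_DEPTH) {
            fifo[head].sensor_id = sensor_id;
            fifo[head].rising    = rising;
            fifo[head].count     = counter_now;
            head++;
        }
    }

    /* On a sync flash, return the counter to a known value and
     * invalidate previously stored contents. */
    void on_sync_flash(uint32_t *counter) {
        *counter = 0;  /* reset to a known value     */
        head = 0;      /* invalidate stored records  */
    }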
Following the completion of one or more sweeps (e.g., vertical, horizontal) or sweep pairs, a processor (e.g., processors 1400 of FIG. 1 and/or processor 2100 and/or microcontroller 2300 of FIGS. 2A, 2B, 2C) may read memory via the address/data bus to retrieve stored timing values from memory. These timing values may be used to determine the position, pose, and/or range of the various light sensors as discussed with respect to the methods of FIGS. 4 and 7-9. In certain embodiments, reading the memory causes the invalidation of its contents.
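By way of non-limiting illustration, a host processor reading the timing module might convert latched counts into sweep angles as sketched below; because the counter is reset at the sync flash, a latched count measures the time from the sync flash to light detection. The tick count per sweep period T, the full-rotation sweep, and the invalidation sentinel are all assumptions:

    #include <stdint.h>

    #define COUNTS_PER_SWEEP 400000u  /* assumed counter ticks per period T */
    #define PI 3.14159265358979323846

    /* Convert a count latched between the sync flash (count 0) and the
     * sensor's light detection into a sweep angle in radians, assuming
     * the emitter sweeps one full rotation per period T. */
    double count_to_angle(uint32_t count) {
        return (2.0 * PI * (double)count) / (double)COUNTS_PER_SWEEP;
    }

    /* Destructive read, mirroring "reading the memory causes the
     * invalidation of its contents": returns the count and marks the
     * location invalid (here, an assumed all-ones sentinel). */
    uint32_t read_and_invalidate(volatile uint32_t *loc) {
        uint32_t v = *loc;
        *loc = UINT32_MAX;  /* invalid count value */
        return v;
    }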
While there have been shown and described and pointed out various novel features of the invention as applied to particular embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the systems and methods described and illustrated may be made by those skilled in the art without departing from the spirit of the invention. Those skilled in the art will recognize, based on the above disclosure and an understanding therefrom of the teachings of the invention, that the particular hardware and devices described, and the general functionality provided by and incorporated therein, may vary in different embodiments of the invention. Accordingly, the particular system components shown in the various figures are for illustrative purposes to facilitate a full and complete understanding and appreciation of the various aspects and functionality of particular embodiments of the invention as realized in system and method embodiments thereof. Those skilled in the art will appreciate that the invention can be practiced in other than the described embodiments, which are presented for purposes of illustration and not limitation.

What is claimed is:
1. A system for determining position and pose of a physical object, the system comprising:
an emitter configured to provide a light sweep and a sync flash;
at least four light sensors disposed on the object, wherein the at least four light sensors are configured to receive the light sweep and the sync flash from the emitter;
a processor coupled to the light sensors and configured to:
distinguish between the light sweep and the sync flash provided by the emitter;
derive angular position relative to the emitter for at least one of the four light sensors based on light received from the emitter at the at least one light sensor during the light sweep;
determine a number (N) of light sensors that are used to solve a system of equations using the derived angular position, the system of equations comprising at least (N) simultaneous equations, the solution of which provides range estimates for the (N) light sensors, the number (N) of light sensors being at least three;
determine which of the at least four light sensors are used to solve the system of equations, the determination being at least partly based upon which of the at least four light sensors received light from the emitter during the light sweep;
using the system of equations, solve for a range of each of the (N) light sensors; and
using a rigid body transform and at least one of the solved-for ranges or the derived angular position, determine a rigid-body position for any of the at least four light sensors that were not used to solve the system of equations.
2. The system of claim 1, wherein the at least (N) simultaneous equations is equal to (N).
3. The system of claim 1, wherein in order to determine the number (N) of light sensors that are used to solve a system of equations, the processor is further configured to:
observe a system state;
determine how many of the at least four light sensors received light exceeding a threshold; or
determine how many sensors are needed to cover a minimum area or volume on the object.
4. The system of claim 3, wherein the system states comprise at least one of:
battery power, battery usage rate, CPU loading, I/O loading, temperature, and frame rate.
5. The system of claim 1, wherein in order to determine which of the at least four light sensors are used to solve the system of equations, the processor is further configured to:
determine a maximum area or volume covered on the object by the (N) sensors;
determine a time difference between when at least two of the at least four light sensors received light;
select from a group of previously used sensors; or
use a state in a virtual reality/augmented reality environment.
6. The system of claim 1, wherein the processor is further configured to:
observe a system state;
select a filter from an ordered filter lattice based on the observed system state; and
update the selected filter with the solved-for range, derived angular position, or determined rigid-body position of one of the four or more light sensors.
7. The system of claim 6, wherein the system states comprise at least one of:
battery power, battery usage rate, CPU loading, I/O loading, temperature, and frame rate.
8. The system of claim 6, wherein the processor is further configured to:
receive inertial measurement unit data;
using a predetermined threshold, determine if the received inertial measurement unit data represents a motion event;
in response to determining that the received inertial measurement unit data represents a motion event, forward time project the output of the selected filter using the received inertial measurement unit data; and
use the forward time projected output at least as part of an initial solution to the system of at least (N) simultaneous equations.
9. The system of claim 1, wherein to distinguish between the light sweep and the sync flash provided by the emitter, the processor is further configured to:
detect a start of the sync flash when a certain number of the at least four light sensors receive light from the emitter within a given period of time.
10. The system of claim 1, wherein in order to derive angular position, the processor is further configured to:
determine an offset between the frequency of light received at the at least one of the four light sensors and a base frequency of light emitted by the emitter.
11. The system of claim 1, wherein the object comprises a power source and a microcontroller coupled to the at least four light sensors, the microcontroller configured to:
transmit data associated with at least one of the four light sensors that received light to the processor, and
wherein the processor is located on a second device distinct from the object that is also wired or wirelessly coupled to the microcontroller.
12. The system of claim 1, wherein the object comprises a microcontroller coupled to the at least four light sensors, the microcontroller configured to:
transmit data associated with at least one of the four light sensors that received light to the processor; and
wherein the processor is located on a second device distinct from the object that is coupled to the object via a cable that supplies power to the at least four light sensors and the microcontroller.
13. A method for determining position and pose of an object in a physical environment, the physical environment comprising:
an emitter configured for providing a light sweep and a sync flash;
at least four light sensors disposed on the object, wherein the at least four light sensors are configured for receiving the light sweep and the sync flash from the emitter;
a processor coupled to the light sensors, the method, operable with the emitter, the light sensors, and the processor, comprising:
distinguishing by the processor between the light sweep and the sync flash provided by the emitter;
deriving by the processor angular position relative to the emitter for at least one of the four light sensors based on light received from the emitter at the at least one light sensor during the light sweep;
determining by the processor a number (N) of light sensors that are used to solve a system of equations using the derived angular position, the system of equations comprising at least (N) simultaneous equations, the solution of which provides range estimates for the (N) light sensors, the number (N) of light sensors being at least three;
determining by the processor which of the at least four light sensors are used to solve the system of equations, the determination being at least partly based upon which of the at least four light sensors received light from the emitter during the light sweep;
using the system of equations, solving by the processor for a range of each of the (N) light sensors; and
using a rigid body transform and at least one of the solved-for ranges or the derived angular position, determining by the processor a rigid-body position for any of the four or more light sensors that were not used to solve the system of equations.
14. The method of claim 13, wherein the at least (N) simultaneous equations is equal to (N).
15. The method of claim 13, wherein in order to determine the number (N) of light sensors that are used to solve a system of equations, the method further comprises:
observing a system state;
determining how many of the at least four light sensors received light exceeding a threshold; or
determining how many sensors are needed to cover a minimum area or volume on the object.
16. The method of claim 15, wherein the system states comprise at least one of:
battery power, battery usage rate, CPU loading, I/O loading, temperature, and frame rate.
17. The method of claim 13, wherein in order to determine which of the at least four light sensors are used to solve the system of equations, the method further comprises:
determining a maximum area or volume covered on the object by the (N) sensors;
determining a time difference between when at least two of the at least four light sensors received light;
selecting from a group of previously used sensors; or
using a state in a virtual reality/augmented reality environment.
18. The method of claim 13, wherein the method further comprises:
observing a system state;
selecting a filter from an ordered filter lattice based on the observed system state; and
updating the selected filter with the solved-for range, derived angular position, or determined rigid-body position of one of the four or more light sensors.
19. The method of claim 18, wherein the system states comprise at least one of:
battery power, battery usage rate, CPU loading, I/O loading, temperature, and frame rate.
20. The method of claim 18, wherein the method further comprises:
receiving inertial measurement unit data;
using a predetermined threshold, determining if the received inertial measurement unit data represents a motion event;
in response to determining that the received inertial measurement unit data represents a motion event, forward time projecting the output of the selected filter using the received inertial measurement unit data; and
using the forward time projected output at least as part of an initial solution to the system of at least (N) simultaneous equations.
21. The method of claim 13, wherein to distinguish between the light sweep and the sync flash provided by the emitter, the method further comprises:
detecting a start of the sync flash emitted by the emitter when a certain number of the at least four light sensors receive light from the emitter within a given period of time.
22. The method of claim 13, wherein in order to derive angular position, the method further comprises:
determining an offset between the frequency of light received at the at least one of the four light sensors and a base frequency of light emitted by the emitter.