Object Direction Using Video Input Combined With Tilt Angle Information

Patent No. USRE48417 (titled "Object Direction Using Video Input Combined With Tilt Angle Information") is a reissue patent; the reissue application was filed by Sony Interactive Entertainment Inc. on Jul 14, 2016.

What is this patent about?

’417 relates to interactive computer entertainment systems, specifically methods and apparatuses for obtaining input data from an object such as a game controller. Traditional game controllers rely largely on button presses and joystick movements. This patent addresses the problem of enhancing user interaction by incorporating the controller's spatial orientation and movement into the control scheme.

The underlying idea behind ’417 is to improve the accuracy and efficiency of object tracking by fusing data from multiple sensors. Specifically, it uses tilt angle information from an inertial sensor (like an accelerometer or gyroscope) to guide the image processing of a video camera. This allows the system to predict the orientation of the object, reducing the computational burden of searching through all possible orientations during image recognition.

The claims of ’417 focus on a method, apparatus, and storage medium for obtaining input data from an object. The core process involves capturing a live image of the object, receiving tilt angle information from a non-image sensor, using this tilt information to generate a rotated reference image, comparing the live image to the rotated reference image, and generating an indication when a match is found.

In practice, the invention works by first capturing a live video feed of the controller. Simultaneously, an accelerometer within the controller measures its tilt angle. This tilt angle is then used to select or generate a reference image that is pre-rotated to match the controller's current orientation. The system then compares the live image to this rotated reference image, looking for a match. When a match is found, it indicates the controller's position and orientation, which can then be translated into game commands.
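The pipeline described above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the helper names (`rotate_image`, `match_score`, `detect`), the nearest-neighbor rotation, the normalized cross-correlation comparison, and the 0.8 match threshold are all assumptions chosen for the sketch.

```python
import numpy as np

def rotate_image(img, angle_deg):
    """Nearest-neighbor rotation of a square grayscale image about its center."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    theta = np.deg2rad(angle_deg)
    ys, xs = np.indices(img.shape)
    # Inverse-map each output pixel back into the source image.
    sx = np.cos(theta) * (xs - cx) + np.sin(theta) * (ys - cy) + cx
    sy = -np.sin(theta) * (xs - cx) + np.cos(theta) * (ys - cy) + cy
    # Clip to the image bounds (border pixels are replicated at the edges).
    sx = np.clip(np.round(sx).astype(int), 0, w - 1)
    sy = np.clip(np.round(sy).astype(int), 0, h - 1)
    return img[sy, sx]

def match_score(live, ref):
    """Normalized cross-correlation between two equal-sized images."""
    a = live - live.mean()
    b = ref - ref.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def detect(live_img, reference_img, tilt_deg, threshold=0.8):
    """Rotate the stored reference by the reported tilt, then compare once."""
    rotated_ref = rotate_image(reference_img, tilt_deg)
    score = match_score(live_img, rotated_ref)
    return score >= threshold, score
```

In a real system the live image would come from the camera and the tilt angle from the controller's inertial sensor; the synthetic images here stand in for both.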

This approach differs from prior art that relies solely on image processing or inertial sensors. Image-only systems are computationally expensive because they must search through all possible orientations. Inertial-only systems suffer from drift, accumulating errors over time. By combining these two sources of information, ’417 achieves a more robust and efficient tracking system. The fusion of inertial and visual data allows for faster and more accurate object tracking, leading to a more responsive and immersive gaming experience.
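The computational contrast with image-only tracking can be made concrete. In this self-contained sketch (the compact `rotate`/`ncc` helpers, the 5-degree search step, and the 0.8 threshold are illustrative assumptions), the image-only path must score every candidate orientation, while the tilt-guided path performs a single comparison:

```python
import numpy as np

def rotate(img, deg):
    """Nearest-neighbor rotation about the image center."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    t = np.deg2rad(deg)
    ys, xs = np.indices(img.shape)
    sx = np.clip(np.round(np.cos(t) * (xs - cx) + np.sin(t) * (ys - cy) + cx).astype(int), 0, w - 1)
    sy = np.clip(np.round(-np.sin(t) * (xs - cx) + np.cos(t) * (ys - cy) + cy).astype(int), 0, h - 1)
    return img[sy, sx]

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized images."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d else 0.0

def search_all_orientations(live, ref, step=5):
    """Image-only tracking: score every candidate orientation (360/step correlations)."""
    return max(range(0, 360, step), key=lambda a: ncc(live, rotate(ref, a)))

def search_with_tilt(live, ref, tilt_deg, threshold=0.8):
    """Fused tracking: the tilt sensor pins the orientation, so one correlation suffices."""
    return tilt_deg if ncc(live, rotate(ref, tilt_deg)) >= threshold else None
```

With a 5-degree step, the exhaustive scan costs 72 correlations per frame versus one for the tilt-guided path, which is the efficiency gain the patent claims from sensor fusion.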

How does this patent fit into the bigger picture?

Technical landscape at the time

When the original application underlying ’417 was filed in the mid-2000s, motion capture for gaming was typically implemented using dedicated camera systems or specialized controllers. Systems commonly relied on accelerometers or gyroscopes for orientation data rather than fusing it with video input, and hardware and software constraints made real-time image processing and sensor fusion non-trivial.

Novelty and Inventive Step

The examiner approved the application because the prior art did not teach or suggest using tilt angle or motion information from a sensor other than the image capture device to rotate a reference image, comparing the live image with the rotated reference image, and generating an indication when they match.

Claims

This patent contains 43 claims, of which claims 1, 11, 20, 30, and 39 are independent. The independent claims generally focus on methods, apparatuses, and storage mediums for obtaining input data from an object or tracking a device using image capture and sensor data, particularly tilt angle information. The dependent claims generally elaborate on the specific components, features, and steps of the methods, apparatuses, and storage mediums described in the independent claims.

Key Claim Terms

Definitions of key terms used in the patent claims.

Term (Source): Inertial sensor data (Claim 39)
Support from Specification: “Detecting and tracking a user's manipulations of a game controller body may be implemented in different ways. For example, in some embodiments a camera peripheral can be used with the computer entertainment system to detect motions of the hand-held controller body and transfer them into actions in a game.”
Interpretation: Data received from one or more sensors that provide information related to a tilt angle of the housing used to obtain at least one rotated reference image.

Term (Source): Infrared (IR) light emitting diodes (LEDs) (Claim 30, Claim 39)
Support from Specification: “Detecting and tracking a user's manipulations of a game controller body may be implemented in different ways. For example, in some embodiments a camera peripheral can be used with the computer entertainment system to detect motions of the hand-held controller body and transfer them into actions in a game.”
Interpretation: Light emitting diodes that emit infrared light, which are attached to the body of the device and used to detect changes in position or orientation of the body in three-dimensional space.

Term (Source): Live image of the object (Claim 1, Claim 11, Claim 20)
Support from Specification: “The user or player of a video game typically holds the game controller with one or both hands in order to operate the buttons, joy stick, etc., located on the controller. Often times while playing the game the user will also move the entire controller itself around in the air as he or she simultaneously operates the buttons, joy stick, etc. Some users tend to get excited while playing the game and attempt to control actions or aspects of the game by moving the entire controller itself around in the air.”
Interpretation: A real-time image of the object captured by an image capture device.

Term (Source): Rotated reference image (Claim 1, Claim 11, Claim 20, Claim 30, Claim 39)
Support from Specification: “Various embodiments of the methods, apparatus, schemes and systems described herein provide for the detection, capture and tracking of the movements, motions and/or manipulations of the entire controller body itself by the user. The detected movements, motions and/or manipulations of the entire controller body by the user may be used as additional commands to control various aspects of the game or other simulation being played.”
Interpretation: An image of the object that has been adjusted based on tilt angle information received from sensors other than the image capture device. This image is used for comparison with a live image of the object.

Term (Source): Tilt angle of the object (Claim 1, Claim 11, Claim 20, Claim 30, Claim 39)
Support from Specification: “Detecting and tracking a user's manipulations of a game controller body may be implemented in different ways. For example, in some embodiments a camera peripheral can be used with the computer entertainment system to detect motions of the hand-held controller body and transfer them into actions in a game. The camera can be used to detect many different types of motions of the controller, such as for example up and down movements, twisting movements, side to side movements, jerking movements, wand-like motions, plunging motions, etc.”
Interpretation: The angular orientation of the object, as determined by one or more sensors distinct from the image capture device. This angle is used to generate a rotated reference image.

Patent Family

File Wrapper

The dossier documents provide a comprehensive record of the patent's prosecution history (filings, correspondence, and decisions made by patent offices) and are crucial for understanding the patent's legal journey and any challenges it faced during examination.


USRE48417

Assignee: Sony Interactive Entertainment Inc.
Application Number: US15210816
Filing Date: Jul 14, 2016
Status: Granted
Expiry Date: Sep 28, 2026
External Links: Slate, USPTO, Google Patents