US20090325710A1 - Dynamic Selection Of Sensitivity Of Tilt Functionality - Google Patents

Dynamic Selection Of Sensitivity Of Tilt Functionality

Info

Publication number
US20090325710A1
Authority
US
United States
Prior art keywords
input device
remote input
sensitivity range
motion sensor
processing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/163,345
Inventor
Eric P. Filer
Loren Douglas Reas
Vasco Rubio
Dennis W. Tom
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Priority to US12/163,345 (US20090325710A1)
Assigned to MICROSOFT CORPORATION (assignment of assignors' interest). Assignors: FILER, ERIC P.; REAS, LOREN DOUGLAS; RUBIO, VASCO; TOM, DENNIS W.
Priority to TW098117854A (TW201002400A)
Priority to BRPI0915060A (BRPI0915060A2)
Priority to PCT/US2009/048874 (WO2009158628A2)
Priority to KR1020107028907A (KR20110031925A)
Priority to EP09771159.2A (EP2291819A4)
Priority to CA2724855A (CA2724855A1)
Priority to JP2011516715A (JP2011526192A)
Priority to MX2010013570A (MX2010013570A)
Priority to CN2009801248707A (CN102077234A)
Priority to RU2010153354/08A (RU2504008C2)
Publication of US20090325710A1
Priority to IL209049A (IL209049A0)
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC (assignment of assignors' interest). Assignors: MICROSOFT CORPORATION

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/22Setup operations, e.g. calibration, key configuration or button assignment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/23Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1018Calibration; Key and button assignment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/206Game information storage, e.g. cartridges, CD ROM's, DVD's, smart cards
    • A63F2300/208Game information storage, e.g. cartridges, CD ROM's, DVD's, smart cards for storing personal settings or data of the player


Abstract

Disclosed is a gaming system having a processing device and a remote input device that is operationally coupled to the processing device. The remote input device may include a motion sensor. The resolution of the motion sensor may be set dynamically from the game software, such that both gross and fine gestures can have the maximum effect. By enabling the game software to assess and control the resolution requirements, and enabling the input device to adjust and respond accordingly, relatively fine gestures, as well as relatively gross gestures, can be discerned and depicted with better accuracy and precision.

Description

    BACKGROUND
  • Gaming systems are known in which the gestures of a player are mimicked in an animated depiction of the player. As used herein, the term “gestures” may refer to movements of the player, or corresponding movements of the animated depiction of the player. Examples of such gestures include movements of all or part of a body, which may include movements of a body member, such as a hand, arm, head, face, etc.
  • In such a system, gestures are typically detected by a motion sensor in a remote gaming input device handled by the player, and communicated from the remote device to the gaming system processor. Examples of such motion sensors include gyros, magnetometers, and accelerometers. The palette of supported gestures is typically limited by the pre-set resolution of the motion sensor. That is, sensitivity to gestures is typically limited to the resolution to which the motion sensor has been set.
  • To achieve the fullest range of a particular gesture input for a gaming input device, the player typically needs to manually change the sensitivity of the motion sensor. However, if the player selects a fine sensor (i.e., a sensor with relatively high sensitivity) and performs a gross gesture, then the sensor may tend to clip. Conversely, if the player selects a gross sensor (i.e., a sensor with relatively low sensitivity) and performs a fine gesture, then depiction of the fine motion tends to get blurred in noise. In either scenario, data may be lost.
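  • As a concrete illustration of this trade-off (a sketch under assumed numbers, not taken from the patent), the following Python snippet models an accelerometer with a fixed range and fixed bit depth. A narrow "fine" range saturates on a hard swing, while a wide "gross" range rounds a slow tilt down to nothing:

      # Hypothetical model: clip a reading to the sensor's full-scale range,
      # then round it to the nearest of 2**bits quantization steps.
      def quantize(value_g, full_scale_g, bits=10):
          clipped = max(-full_scale_g, min(full_scale_g, value_g))
          step = (2 * full_scale_g) / (2 ** bits)
          return round(clipped / step) * step

      gross_gesture_g = 7.5    # hard swing
      fine_gesture_g = 0.004   # slow tilt

      # Fine range (+/-2 g): the hard swing saturates at full scale ("clips").
      print(quantize(gross_gesture_g, full_scale_g=2))   # 2.0 -> data lost
      # Gross range (+/-16 g): the slow tilt rounds to 0.0, lost in the
      # ~0.031 g quantization step (i.e., "blurred in noise").
      print(quantize(fine_gesture_g, full_scale_g=16))   # 0.0 -> data lost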
  • It would be desirable, therefore, to have a gaming system where the resolution could be set dynamically from the game software, such that both gross and fine gestures can have the maximum effect. By enabling the game software to assess and control the resolution requirements, and enabling the input device to adjust and respond accordingly, relatively fine gestures, as well as relatively gross gestures, could be discerned and depicted with better accuracy and precision.
  • SUMMARY
  • As described herein, a computer system, such as a gaming system, for example, may include a processing device and a remote input device. The remote input device may be operationally coupled to provide input to the processing device. The remote input device may be wirelessly coupled to the processing device.
  • The remote input device may include one or more motion sensors, each having one or more sensitivity ranges. For example, the remote input device may include one or more motion sensors, each having a plurality of selectable sensitivity ranges. Alternatively or additionally, the remote input device may include a plurality of motion sensors, each having at least one sensitivity range.
  • The processing device may include a context-determining module, a sensitivity-determining module, and a communications module. The context-determining module may be configured to ascertain a current context in an application, such as a game-playing application, for example, executing on a computing device. For example, the context-determining module may be configured to ascertain a current scripted situation within the game-playing application, or to ascertain a user profile.
  • The sensitivity-determining module may be configured to receive information from the context-determining module, and to determine a desired sensitivity range for a remote input device. The sensitivity-determining module may be configured to determine the desired sensitivity range based at least in part on the user profile.
  • The communications module may be configured to communicate information indicative of the desired sensitivity range to the remote input device. For example, the processing device may signal the remote input device to select one of the sensors from the plurality of sensors, and/or to select one of the sensitivity ranges from the plurality of sensitivity ranges.
  • The remote input device may be configured to receive communicated information that is indicative of a desired sensitivity range. The remote input device may be configured to respond to the received information by operating in the desired sensitivity range. For example, the remote input device may be configured to respond to the received information by activating a particular physical sensor having a sensitivity range corresponding to the desired sensitivity range.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 depicts an image of a gross gesture.
  • FIGS. 2A-C depict images of fine gestures at various times in tilt mode.
  • FIG. 3 is a functional block diagram of an example computing system.
  • FIG. 4 is a flowchart of an example method for use in a computing system as depicted in FIG. 3.
  • FIG. 5 is a block diagram of an example computing environment in which example embodiments and aspects of the present invention may be implemented.
  • FIG. 6 is an example network configuration in which aspects of the invention may be implemented.
  • DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
  • Overview; Example Scenarios
  • An example scenario in which the systems and methods described herein may be used is now presented in the context of a gaming system. It should be understood, however, that a game system is described for illustrative purposes only, and that the systems and methods described herein are not limited to implementation in gaming systems.
  • A typical gaming system may include a game console. A processing device, on which the game's operational software application may be executed, may be housed in the game console. The gaming system may also include a remote input device, the nature of which may be based on the actual game the player is playing. The remote input device may communicate to the game console information corresponding to the player's gestures using the remote input device. The game console may cause a depiction of the player's gestures, or the effects thereof, to be presented on a video display, such as a television, computer monitor, or dedicated video display to which the game console is operationally coupled.
  • Consider an example scenario wherein a player is playing a game of golf. Accordingly, the remote input device may represent a golf club. The player's gestures may be characterized by the player's swinging of the golf club. The effects of the player's gestures may be characterized by the depicted golf club being swung.
  • In an example scenario, there may be three types of golf swings, i.e., driving, chipping, and putting. It should be understood that, in general, a player will tend to swing harder (i.e., faster and over a greater angle) when driving than when chipping. Similarly, a player will tend to swing harder when chipping than when putting. Accordingly, to present accurate and precise depictions of both driving and putting, greater motion sensitivity may be desirable during a putting gesture than during a driving gesture.
  • The system may be capable of recognizing gestures used in such a game-play scenario, and adjusting the hardware sensitivity dynamically in response to such recognition. For example, the game software may be configured to decide when to switch resolution, and to determine the resolution to which to switch. Because a video game is typically a scripted interaction, the game software typically knows the context of the current situation. For example, in a golf scenario, the game software can recognize that, if the ball is at the tee, the player is likely to be driving rather than putting. Similarly, if the ball is on the green, the player is likely to be putting rather than driving. Or the context could be identified based on club selection. For example, if the player selects a driver, he is likely about to drive. If he selects a putter, he is likely about to putt.
  • The game software can recognize the context, and determine a desired sensitivity from the context. As play moves from a driving context to a chipping context to a putting context, the processing device may signal the remote input device to select progressively more sensitive sensors. Thus, driving gestures, chipping gestures, and putting gestures, along with the effects thereof, can be depicted accurately and precisely.
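  • As a sketch of this decision logic (the contexts, ranges, and function names here are illustrative assumptions, not taken from the patent), the golf scenario might map scripted game state to a desired full-scale range as follows:

      # Progressively more sensitive (narrower) ranges, in g's.
      SENSITIVITY_BY_CONTEXT = {
          "driving": 16,   # gross, fast swing: wide range
          "chipping": 4,
          "putting": 2,    # fine, slow stroke: narrow range
      }

      def infer_context(ball_location, selected_club):
          # The scripted situation is recoverable from game state.
          if selected_club == "putter" or ball_location == "green":
              return "putting"
          if selected_club == "driver" or ball_location == "tee":
              return "driving"
          return "chipping"

      def desired_range_g(ball_location, selected_club):
          context = infer_context(ball_location, selected_club)
          return SENSITIVITY_BY_CONTEXT[context]

      assert desired_range_g("tee", "driver") == 16    # about to drive
      assert desired_range_g("green", "putter") == 2   # about to putt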
  • FIG. 1 depicts an example image of a gross gesture. As shown, an image of a person holding a microphone is presented. The gross gesture corresponds to the singer swinging her arms over about a 60-degree angle rather quickly. To produce a clear image of such a gesture, relatively low motion sensitivity would be desirable.
  • FIGS. 2A-C depict example images of fine gestures at various times in tilt mode. As shown, the singer is now tilting the microphone relatively slowly over a relatively small angle (e.g., at a rate of 10 degrees every 7 seconds). To produce a clear image of such a gesture, relatively high motion sensitivity would be desirable.
  • A detailed description of example systems and methods follows.
  • Dynamic Selection of Sensitivity of Tilt Functionality
  • FIG. 3 is a functional block diagram of an example computing system 10. As shown, the system 10 may include a computing or processing device 20, and a remote input device 30. The processing device 20 may be housed in a game console, for example. The remote input device 30 may be operationally coupled to provide input to the processing device 20. The remote input device 30 may be wired to the processing device 20, or wirelessly coupled to the processing device 20.
  • The remote input device 30 may include a human-interface device, such as a ball, bat, drumstick, fishing rod, or microphone, for example, including any type of game controller, such as a joystick, headset, helmet, heads-up display, or the like. The remote input device 30 may include gesture recognition hardware. The gesture recognition hardware may include one or more sensors, which may be motion sensors, thermal sensors, or pressure sensors, for example, or any combination of such sensors. The remote input device 30 may be a robotic device, of a type that might be used in manufacturing, for example.
  • The remote input device 30 may be operable over a plurality of sensitivity ranges. The remote input device 30 may include one or more physical motion sensors 32A-C. Examples of such motion sensors include gyros, accelerometers, and magnetometers. Typically, a single motion sensor or type of motion sensor will not provide absolute positioning of a moving object. Accordingly, multiple, different sensors may be employed. For example, an accelerometer may be used for measuring movement, while an additional sensor (e.g., a gyro) may be employed for determining position.
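  • One common way to combine two such sensors (an assumption for illustration; the patent does not prescribe a fusion method) is a complementary filter, which trusts the gyro's angular rate over short intervals and the accelerometer's gravity vector over long ones:

      import math

      def complementary_tilt(prev_deg, gyro_dps, accel_x_g, accel_z_g,
                             dt_s, alpha=0.98):
          # Blend the integrated gyro rate with the tilt angle derived
          # from the accelerometer's measurement of gravity.
          gyro_deg = prev_deg + gyro_dps * dt_s
          accel_deg = math.degrees(math.atan2(accel_x_g, accel_z_g))
          return alpha * gyro_deg + (1 - alpha) * accel_deg

      # The fine gesture of FIGS. 2A-C: ~10 degrees over 7 seconds,
      # sampled at 100 Hz.
      angle = 0.0
      for _ in range(700):
          angle = complementary_tilt(angle, gyro_dps=10 / 7,
                                     accel_x_g=0.17, accel_z_g=0.985,
                                     dt_s=0.01)
      print(round(angle, 1))   # ~10.5, near the true ~10-degree tilt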
  • Each of the one or more physical motion sensors 32A-C may be operable over a plurality of selectable sensitivity ranges. The remote input device 30 may include a plurality of physical motion sensors 32A-C, each of which is operable in at least one sensitivity range. It should be understood that the systems and methods described herein are not limited to the use of motion sensors. For example, thermal or pressure sensors could be employed.
  • The processing device 20 may include a context-determining module 22, a sensitivity-determining module 24, and a communications module 26. The context-determining module 22 may be configured to ascertain a current context in an application 28 executing on the processing device 20. For example, the application 28 may be a game-playing application. The context-determining module 22 may be configured to ascertain a current scripted situation within the game-playing application 28.
  • The context-determining module 22 may be configured to ascertain a user profile 27. The processing device 20 may include a memory 25 in which the user profile 27 is stored. An example user profile may include one or more predefined preferences of a specific user. Examples of such preferences include presets of default settings for gain, sensitivity, and personalization. A profile may be accessed through a password or biometric sensor, for example. Multiple profiles may be stored at once.
  • The sensitivity-determining module 24 may be configured to receive information from the context-determining module 22, and to determine a desired sensitivity range for the remote input device 30. The desired sensitivity range may be determined based at least in part on the current context in the application 28 executing on the processing device 20. For example, the desired sensitivity range may be determined based at least in part on the current scripted situation within a game-playing application. The sensitivity-determining module 24 may be configured to determine the desired sensitivity range based at least in part on the user profile 27.
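  • A minimal sketch of this arrangement follows (the module interfaces and names are assumptions; the patent does not define them):

      from dataclasses import dataclass

      @dataclass
      class UserProfile:                  # cf. user profile 27
          name: str
          sensitivity_scale: float = 1.0  # stored preset; <1 = finer

      def determine_sensitivity(context_range_g, profile=None):
          # Sensitivity-determining module 24: start from the range
          # implied by the context, then apply the user's stored preset.
          if profile is not None:
              return context_range_g * profile.sensitivity_scale
          return context_range_g

      player = UserProfile("player1", sensitivity_scale=0.5)
      print(determine_sensitivity(16, player))   # 8.0: preset asks for finer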
  • The communications module 26 may be configured to communicate information indicative of the desired sensitivity range to the remote input device 30, for use in selecting a sensitivity range for the remote input device 30. The processing device 20 may signal the remote input device 30 to operate in the desired sensitivity range. The signal may be transmitted over a wired or wireless connection between the processing device 20 and the remote input device 30. Thus, the processing device 20 can send a control signal to the sensor in the remote input device 30 to set the sensitivity of the sensor.
  • Such a signal may include a field that informs the remote input device 30 of the desired sensitivity range. For example, the signal may include a number of bits (e.g., two) that correspond to the desired scale setting. The number of bits (and, accordingly, the range of sensitivities) may be a parametric value that may be adjustable via the processing device.
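  • A hedged sketch of such a signal (the message layout, opcode, and field width are invented for illustration) might pack a two-bit scale index into a single command byte:

      SCALE_SETTINGS_G = [2, 4, 8, 16]   # four ranges addressable by 2 bits

      def encode_sensitivity_command(range_g, opcode=0x5, field_bits=2):
          # High nibble: opcode; low bits: index of the desired range.
          index = SCALE_SETTINGS_G.index(range_g)
          assert index < (1 << field_bits)
          return bytes([(opcode << 4) | index])

      def decode_sensitivity_command(payload, field_bits=2):
          # Device side: mask off the sensitivity field.
          return SCALE_SETTINGS_G[payload[0] & ((1 << field_bits) - 1)]

      msg = encode_sensitivity_command(4)
      print(hex(msg[0]))                       # 0x51
      print(decode_sensitivity_command(msg))   # 4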
  • The remote input device 30 may receive the signal from the processing device 20, and thus receive from the processing device 20 information indicative of the desired sensitivity range. The remote input device 30 may respond to receiving the information communicated from the processing device 20 by operating in the desired sensitivity range.
  • For example, the remote input device 30 may respond to receiving the communicated information by causing a selected one of the physical motion sensors to operate in a selected one of the plurality of sensitivity ranges. Where the remote input device 30 includes a plurality of physical motion sensors, the remote input device 30 may cause to operate a selected one of the physical motion sensors that is operable in the desired sensitivity range. Where the remote input device 30 includes a physical motion sensor that is operable over a plurality of selectable sensitivity ranges, the remote input device 30 may cause the physical motion sensor to operate in the desired sensitivity range by selecting the desired sensitivity range from the plurality of sensitivity ranges over which the motion sensor is operable.
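  • The two arrangements just described might be handled on the device side as follows (class and function names are hypothetical):

      class FixedRangeSensor:
          # One of a plurality of physical sensors (cf. 32A-C), each
          # operable in a single sensitivity range.
          def __init__(self, range_g):
              self.range_g = range_g
              self.active = False

      def select_among_sensors(sensors, desired_range_g):
          # Activate the sensor whose range matches; deactivate the rest.
          for sensor in sensors:
              sensor.active = (sensor.range_g == desired_range_g)

      class MultiRangeSensor:
          # A single sensor operable over several selectable ranges.
          def __init__(self, selectable_ranges_g):
              self.selectable_ranges_g = selectable_ranges_g
              self.range_g = selectable_ranges_g[0]

          def select_range(self, desired_range_g):
              if desired_range_g in self.selectable_ranges_g:
                  self.range_g = desired_range_g

      sensors = [FixedRangeSensor(2), FixedRangeSensor(8), FixedRangeSensor(16)]
      select_among_sensors(sensors, 8)
      print([s.active for s in sensors])   # [False, True, False]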
  • To summarize, FIG. 4 provides a flowchart of an example method 60 for use in a computing system as depicted in FIG. 3. At 62, a desired sensitivity range for the remote input device may be determined at the processing device. As shown at 64, the determination at 62 may be based, at least in part, on a current context in an application executing on the system. As shown at 66, the determination at 62 may be based, at least in part, on a user profile.
  • At 68, the processing device may signal the remote input device to operate in the desired sensitivity range. The processing device may signal the remote input device to cause a motion sensor in the remote input device to operate in a selected sensitivity range, as at 70. Alternatively or additionally, the processing device may signal the remote input device to cause a selected one of a plurality of motion sensors to operate in the desired sensitivity range, as at 72.
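  • Tying the steps of method 60 together with the hypothetical sketches above (62-66 determine the range; 68-72 signal the device):

      def nearest_setting(range_g):
          # Snap a computed range to the nearest supported scale setting.
          return min(SCALE_SETTINGS_G, key=lambda s: abs(s - range_g))

      def method_60(ball_location, selected_club, profile, sensors):
          range_g = desired_range_g(ball_location, selected_club)    # 62, 64
          range_g = determine_sensitivity(range_g, profile)          # 66
          msg = encode_sensitivity_command(nearest_setting(range_g)) # 68
          # On the remote input device: decode the field and activate
          # the matching sensor (70/72).
          select_among_sensors(sensors, decode_sensitivity_command(msg))

      method_60("green", "putter", profile=None, sensors=sensors)
      print([s.range_g for s in sensors if s.active])   # [2]: finest range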
  • Example Computing Environment
  • FIG. 5 shows an exemplary computing environment in which example embodiments and aspects may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
  • Numerous other general purpose or special purpose computing system environments or configurations may be used. Examples of well known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, embedded systems, distributed computing environments that include any of the above systems or devices, and the like.
  • Computer-executable instructions, such as program modules executed by a computer, may be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules and other data may be located in both local and remote computer storage media including memory storage devices.
  • With reference to FIG. 5, an exemplary system includes a general purpose computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The processing unit 120 may represent multiple logical processing units such as those supported on a multi-threaded processor. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus (also known as Mezzanine bus). The system bus 121 may also be implemented as a point-to-point connection, switching fabric, or the like, among the communicating devices.
  • Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
  • The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 5 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
  • The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 5 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156, such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
  • The drives and their associated computer storage media discussed above and illustrated in FIG. 5, provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 5, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.
  • The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 5. The logical connections depicted in FIG. 5 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 5 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • Example Distributed Computing Frameworks or Architectures
  • Various distributed computing frameworks have been and are being developed in light of the convergence of personal computing and the Internet. Individuals and business users alike are provided with a seamlessly interoperable and web-enabled interface for applications and computing devices, making computing activities increasingly browser- and network-oriented.
  • For example, MICROSOFT®'s .NET platform includes servers, building-block services, such as web-based data storage, and downloadable device software. Generally speaking, the .NET platform provides (1) the ability to make the entire range of computing devices work together and to have user information automatically updated and synchronized on all of them, (2) increased interactive capability for web sites, enabled by greater use of XML rather than HTML, (3) online services that feature customized access and delivery of products and services to the user from a central starting point for the management of various applications, such as e-mail, for example, or software, such as Office .NET, (4) centralized data storage, which will increase efficiency and ease of access to information, as well as synchronization of information among users and devices, (5) the ability to integrate various communications media, such as e-mail, faxes, and telephones, (6) for developers, the ability to create reusable modules, thereby increasing productivity and reducing the number of programming errors, and (7) many other cross-platform integration features as well.
  • While example embodiments herein are described in connection with software residing on a computing device, one or more portions of the invention may also be implemented via an operating system, API, or middleware software between a coprocessor and requesting object, such that services may be performed by, supported in, or accessed via all of .NET's languages and services, and in other distributed computing frameworks as well.
  • Network Environment
  • FIG. 6 illustrates an example network environment in which the present invention may be employed. Of course, actual network and database environments may be arranged in a variety of configurations; however, the example environment shown here provides a framework for understanding the type of environment in which an embodiment may operate.
  • The example network may include one or more client computers 200a, a server computer 200b, data source computers 200c, and/or databases 270, 272a, and 272b. The client computers 200a and the data source computers 200c may be in electronic communication with the server computer 200b by way of the communications network 280 (e.g., an intranet, the Internet or the like). The client computers 200a and data source computers 200c may be connected to the communications network by way of communications interfaces 282. The communications interfaces 282 can be any type of communications interface, such as an Ethernet connection, a modem connection, a wireless connection, and so on.
  • The server computer 200b may provide management of the database 270 by way of database server system software such as MICROSOFT®'s SQL SERVER or the like. As such, server 200b may act as a storehouse of data from a variety of data sources and provide that data to a variety of data consumers.
  • In the example network environment of FIG. 6, a data source may be provided by data source computer 200c. Data source computer 200c may communicate data to server computer 200b via communications network 280, which may be a LAN, WAN, intranet, Internet, or the like. Data source computer 200c may store data locally in database 272a, which may be a database server or the like. The data provided by data source 200c can be combined and stored in a large database such as a data warehouse maintained by server 200b.
  • Client computers 200a that desire to use the data stored by server computer 200b can access the database 270 via communications network 280. Client computers 200a access the data by way of, for example, a query, a form, etc. It will be appreciated that any configuration of computers is equally compatible with an embodiment of the present invention.
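  • By way of a concrete illustration (an editorial sketch, not part of the original specification), the data flow of FIG. 6 can be summarized as follows: data source computers 200c publish data to the central store managed by server computer 200b, and client computers 200a retrieve it by way of a query. The sketch below uses Python's standard sqlite3 module as a stand-in for database server software such as SQL SERVER; the table and column names (telemetry, device_id, reading) are illustrative assumptions.

    import sqlite3

    # Stand-in for database 270 as managed by server computer 200b.
    warehouse = sqlite3.connect(":memory:")
    warehouse.execute("CREATE TABLE telemetry (device_id TEXT, reading REAL)")

    def publish(rows):
        """Role of a data source computer 200c: communicate locally stored
        data to the data warehouse maintained by server 200b."""
        warehouse.executemany(
            "INSERT INTO telemetry (device_id, reading) VALUES (?, ?)", rows)
        warehouse.commit()

    def query(device_id):
        """Role of a client computer 200a: access the data by way of a query."""
        cursor = warehouse.execute(
            "SELECT reading FROM telemetry WHERE device_id = ?", (device_id,))
        return [row[0] for row in cursor]

    publish([("tilt-controller-1", 0.25), ("tilt-controller-1", 0.75)])
    print(query("tilt-controller-1"))  # -> [0.25, 0.75]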

Claims (20)

1. A method for use in a computing system, the computing system comprising a processing device and a remote input device, the remote input device being operable over a plurality of sensitivity ranges, the method comprising:
determining, at the processing device, a desired sensitivity range for the remote input device; and
signaling by the processing device to the remote input device to operate in the desired sensitivity range.
2. The method of claim 1, wherein the computing system comprises a game playing system, and wherein determining the desired sensitivity range is based at least in part on a current context in an application executing on the game playing system.
3. The method of claim 1, wherein the computing system comprises a game playing system, and wherein determining the desired sensitivity range is based at least in part on a user profile.
4. The method of claim 1, wherein the remote input device comprises a motion sensor that is operable over a plurality of selectable sensitivity ranges, and wherein signaling by the processing device to the remote input device comprises signaling by the processing device to the remote input device to cause the motion sensor to operate in a sensitivity range selected from the plurality of sensitivity ranges.
5. The method of claim 1, wherein the remote input device comprises a plurality of motion sensors, each said motion sensor being operable in at least one sensitivity range, and wherein signaling by the processing device to the remote input device comprises signaling by the processing device to operate a sensor selected from the plurality of motion sensors.
6. The method of claim 1, wherein the remote input device comprises at least one of a gyro, an accelerometer, or a magnetometer.
7. The method of claim 1, wherein the remote input device is wirelessly coupled to the processing device.
8. A system, comprising:
a context-determining module configured to ascertain a current context in an application executing on a computing device;
a sensitivity-determining module configured to receive information from the context-determining module, and to determine a desired sensitivity range for a remote input device, wherein the remote input device is operationally coupled to provide input to the computing device, and wherein the remote input device is operable over a plurality of sensitivity ranges; and
a communications module configured to communicate information indicative of the desired sensitivity range to the remote input device for use in selecting a sensitivity range for the remote input device.
9. The system of claim 8, wherein the remote input device is wirelessly coupled to the computing device.
10. The system of claim 8, wherein the application executing on the computing device is a game-playing application, and the context-determining module is configured to ascertain a current scripted situation within the game-playing application.
11. The system of claim 8, wherein the context-determining module is configured to ascertain a user profile, and the sensitivity-determining module is configured to determine the desired sensitivity range based at least in part on the user profile.
12. The system of claim 8, wherein the remote input device comprises a physical motion sensor having a plurality of sensitivity ranges, and is configured to respond to communicated information indicative of the desired sensitivity range by operating in the desired sensitivity range.
13. The system of claim 8, wherein the remote input device comprises a plurality of physical motion sensors, each said motion sensor having at least one sensitivity range, and wherein the remote input device is configured to respond to communicated information indicative of the desired sensitivity range by activating at least one of the physical motion sensors, the at least one physical motion sensor being operable in a sensitivity range corresponding to the desired sensitivity range.
14. The system of claim 8, wherein the remote input device comprises at least one of a gyro, an accelerometer, or a magnetometer.
15. A computer-implemented game-playing system, comprising:
a remote input device comprising a physical motion sensor that is operable over a plurality of sensitivity ranges; and
a processing device that ascertains a current scripted situation within a game playing application executing on the processing device, determines a desired sensitivity range for the remote input device based at least in part on the current scripted situation, and communicates information indicative of the desired sensitivity range to the remote input device,
wherein the remote input device receives from the processing device the communicated information indicative of the desired sensitivity range, and responds to receiving the communicated information by causing the physical motion sensor to operate in the desired sensitivity range.
16. The system of claim 15, wherein the processing device determines the desired sensitivity range based at least in part on a user profile.
17. The system of claim 15, wherein the physical motion sensor is operable over a plurality of selectable sensitivity ranges, and wherein the remote input device responds to receiving the communicated information by causing the physical motion sensor to operate in a selected one of the plurality of sensitivity ranges.
18. The system of claim 15, wherein the remote input device comprises a plurality of physical motion sensors, each said physical motion sensor operable in at least one respective sensitivity range, and wherein the remote input device responds to receiving the communicated information by causing a selected one of the plurality of physical motion sensors to operate in the desired sensitivity range.
19. The system of claim 15, wherein the physical motion sensor includes at least one of a gyro, an accelerometer, or a magnetometer.
20. The system of claim 15, wherein the remote input device is wirelessly coupled to the processing device.
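
The following is an editorial sketch, not part of the claims, illustrating the pattern recited in claims 1 through 5: the processing device determines a desired sensitivity range from the current application context (claim 2) or a user profile (claim 3) and signals it to the remote input device, which either reconfigures a multi-range motion sensor (claim 4) or activates whichever of several motion sensors is operable in that range (claim 5). All class, method, and parameter names, and the specific range values, are assumptions made for illustration only.

    from dataclasses import dataclass

    # Hypothetical selectable sensitivity ranges (full scale, in g); the
    # claims do not recite particular values.
    SENSITIVITY_RANGES = {"fine": 1.5, "coarse": 6.0}

    @dataclass
    class MotionSensor:
        """A physical motion sensor (e.g., an accelerometer) and the
        sensitivity range(s) in which it is operable."""
        supported_ranges: tuple
        active_range: float = None

    class RemoteInputDevice:
        """Device side: responds to the signaled desired sensitivity range."""
        def __init__(self, sensors):
            self.sensors = sensors

        def on_sensitivity_signal(self, desired_range):
            # Activate a sensor operable in the desired range (claim 5); with
            # a single multi-range sensor this reduces to reconfiguring that
            # sensor among its own ranges (claim 4).
            for sensor in self.sensors:
                if desired_range in sensor.supported_ranges:
                    sensor.active_range = desired_range
                    return sensor
            raise ValueError("no sensor supports the requested range")

    class ProcessingDevice:
        """Host side: determines the desired range and signals the device."""
        def __init__(self, device, user_profile=None):
            self.device = device
            self.user_profile = user_profile or {}

        def determine_desired_range(self, game_context):
            # Context-based determination (claim 2), optionally overridden
            # by a user profile (claim 3).
            key = self.user_profile.get("preferred_sensitivity") or (
                "fine" if game_context == "precision-aiming" else "coarse")
            return SENSITIVITY_RANGES[key]

        def signal(self, game_context):
            desired = self.determine_desired_range(game_context)
            return self.device.on_sensitivity_signal(desired)

    device = RemoteInputDevice([MotionSensor(supported_ranges=(1.5,)),
                                MotionSensor(supported_ranges=(6.0,))])
    host = ProcessingDevice(device)
    print(host.signal("precision-aiming").active_range)  # -> 1.5
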
US12/163,345 2008-06-27 2008-06-27 Dynamic Selection Of Sensitivity Of Tilt Functionality Abandoned US20090325710A1 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
US12/163,345 US20090325710A1 (en) 2008-06-27 2008-06-27 Dynamic Selection Of Sensitivity Of Tilt Functionality
TW098117854A TW201002400A (en) 2008-06-27 2009-05-27 Dynamic selection of sensitivity of tilt functionality
RU2010153354/08A RU2504008C2 (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity when executing tilt function
CA2724855A CA2724855A1 (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity of tilt functionality
PCT/US2009/048874 WO2009158628A2 (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity of tilt functionality
KR1020107028907A KR20110031925A (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity of tilt functionality
EP09771159.2A EP2291819A4 (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity of tilt functionality
BRPI0915060A BRPI0915060A2 (en) 2008-06-27 2009-06-26 dynamic selection of tilt functionality sensitivity
JP2011516715A JP2011526192A (en) 2008-06-27 2009-06-26 Dynamic selection of tilt function sensitivity
MX2010013570A MX2010013570A (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity of tilt functionality.
CN2009801248707A CN102077234A (en) 2008-06-27 2009-06-26 Dynamic selection of sensitivity of tilt functionality
IL209049A IL209049A0 (en) 2008-06-27 2010-11-01 Dynamic selection of sensitivity of tilt functionality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/163,345 US20090325710A1 (en) 2008-06-27 2008-06-27 Dynamic Selection Of Sensitivity Of Tilt Functionality

Publications (1)

Publication Number Publication Date
US20090325710A1 true US20090325710A1 (en) 2009-12-31

Family

ID=41445349

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/163,345 Abandoned US20090325710A1 (en) 2008-06-27 2008-06-27 Dynamic Selection Of Sensitivity Of Tilt Functionality

Country Status (12)

Country Link
US (1) US20090325710A1 (en)
EP (1) EP2291819A4 (en)
JP (1) JP2011526192A (en)
KR (1) KR20110031925A (en)
CN (1) CN102077234A (en)
BR (1) BRPI0915060A2 (en)
CA (1) CA2724855A1 (en)
IL (1) IL209049A0 (en)
MX (1) MX2010013570A (en)
RU (1) RU2504008C2 (en)
TW (1) TW201002400A (en)
WO (1) WO2009158628A2 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6065550B2 (en) * 2012-12-03 2017-01-25 船井電機株式会社 Video equipment
EP2793105A1 (en) * 2013-04-19 2014-10-22 Alcatel Lucent Controlling a user interface of an interactive device
US9871544B2 (en) 2013-05-29 2018-01-16 Microsoft Technology Licensing, Llc Specific absorption rate mitigation
US10893488B2 (en) 2013-06-14 2021-01-12 Microsoft Technology Licensing, Llc Radio frequency (RF) power back-off optimization for specific absorption rate (SAR) compliance
US20150141080A1 (en) * 2013-11-21 2015-05-21 Microsoft Corporation Object Detection and Characterization
US9813997B2 (en) 2014-01-10 2017-11-07 Microsoft Technology Licensing, Llc Antenna coupling for sensing and dynamic transmission
US10044095B2 (en) 2014-01-10 2018-08-07 Microsoft Technology Licensing, Llc Radiating structure with integrated proximity sensing
US9785174B2 (en) 2014-10-03 2017-10-10 Microsoft Technology Licensing, Llc Predictive transmission power control for back-off
US9871545B2 (en) 2014-12-05 2018-01-16 Microsoft Technology Licensing, Llc Selective specific absorption rate adjustment
WO2016168267A1 (en) * 2015-04-15 2016-10-20 Thomson Licensing Configuring translation of three dimensional movement
US10013038B2 (en) 2016-01-05 2018-07-03 Microsoft Technology Licensing, Llc Dynamic antenna power control for multi-context device
JP6169238B1 (en) * 2016-09-21 2017-07-26 京セラ株式会社 Electronic device, program, and control method
US10461406B2 (en) 2017-01-23 2019-10-29 Microsoft Technology Licensing, Llc Loop antenna with integrated proximity sensing
US10224974B2 (en) 2017-03-31 2019-03-05 Microsoft Technology Licensing, Llc Proximity-independent SAR mitigation
GB2613811A (en) * 2021-12-15 2023-06-21 Sony Interactive Entertainment Inc Interaction modification system and method

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5945988A (en) * 1996-06-06 1999-08-31 Intel Corporation Method and apparatus for automatically determining and dynamically updating user preferences in an entertainment system
US6053814A (en) * 1997-12-04 2000-04-25 Logitech, Inc. System and method for automatically adjusting game controller sensitivity to player inputs
RU2251732C2 (en) * 1999-09-11 2005-05-10 Сони Компьютер Энтертейнмент Инк. Control device
US9682319B2 (en) * 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
JP2007531113A (en) * 2004-03-23 2007-11-01 富士通株式会社 Identification of mobile device tilt and translational components
US7976385B2 (en) * 2004-05-11 2011-07-12 Mattel, Inc. Game controller with sensitivity adjustment
JP2006075556A (en) * 2004-09-09 2006-03-23 Tadashi Ohashi Gyro controller
JP5010822B2 (en) * 2005-09-29 2012-08-29 株式会社ソニー・コンピュータエンタテインメント Information communication system, information processing apparatus, information processing program, storage medium storing information processing program, and display control method
JP2008015679A (en) * 2006-07-04 2008-01-24 Sony Computer Entertainment Inc User interface device and operational sensitivity adjustment method
JP2008011980A (en) * 2006-07-04 2008-01-24 Sony Computer Entertainment Inc User interface device and operational sensitivity adjusting method
JP5051822B2 (en) * 2006-08-02 2012-10-17 任天堂株式会社 Game device with general-purpose remote control function

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6624824B1 (en) * 1996-04-30 2003-09-23 Sun Microsystems, Inc. Tilt-scrolling on the sunpad
US6466198B1 (en) * 1999-11-05 2002-10-15 Innoventions, Inc. View navigation and magnification of a hand-held device with a display
US7289102B2 (en) * 2000-07-17 2007-10-30 Microsoft Corporation Method and apparatus using multiple sensors in a device with a display
US6690358B2 (en) * 2000-11-30 2004-02-10 Alan Edward Kaplan Display control for hand-held devices
US7184025B2 (en) * 2002-05-31 2007-02-27 Microsoft Corporation Altering a display on a viewing device based upon a user controlled orientation of the viewing device
US20060256081A1 (en) * 2002-07-27 2006-11-16 Sony Computer Entertainment America Inc. Scheme for detecting and tracking user manipulation of a game controller body
US20040119684A1 (en) * 2002-12-18 2004-06-24 Xerox Corporation System and method for navigating information
US20050119036A1 (en) * 2003-10-03 2005-06-02 Amro Albanna Input system and method
US7301526B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Dynamic adaptation of gestures for motion controlled handheld devices
US20050216867A1 (en) * 2004-03-23 2005-09-29 Marvit David L Selective engagement of motion detection
US20070066394A1 (en) * 2005-09-15 2007-03-22 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US20070213109A1 (en) * 2006-03-13 2007-09-13 Nintendo Co., Ltd. Game apparatus and storage medium having game program stored thereon
US20070265088A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Storage medium storing game program, game apparatus, and game system
US20080076566A1 (en) * 2006-08-25 2008-03-27 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20080125997A1 (en) * 2006-09-27 2008-05-29 General Electric Company Method and apparatus for correction of multiple EM sensor positions
US20090062005A1 (en) * 2007-08-30 2009-03-05 Industrial Technology Research Institute Method for adjusting sensing range and sensitivity and inertia interactive aparatus and system using thereof

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8384565B2 (en) 2008-07-11 2013-02-26 Nintendo Co., Ltd. Expanding operating device and operating system
US20100007528A1 (en) * 2008-07-11 2010-01-14 Nintendo Co., Ltd. Expanding operating device and operating system
US8057290B2 (en) * 2008-12-15 2011-11-15 Disney Enterprises, Inc. Dance ring video game
US20100151948A1 (en) * 2008-12-15 2010-06-17 Disney Enterprises, Inc. Dance ring video game
WO2011090509A1 (en) * 2010-01-22 2011-07-28 Sony Computer Entertainment America Inc. Capturing views and movements of actors performing within generated scenes
US20110181601A1 (en) * 2010-01-22 2011-07-28 Sony Computer Entertainment America Inc. Capturing views and movements of actors performing within generated scenes
KR101748593B1 (en) * 2010-01-22 2017-06-20 소니 인터랙티브 엔터테인먼트 아메리카 엘엘씨 Capturing views and movements of actors performing within generated scenes
US20110271013A1 (en) * 2010-04-30 2011-11-03 Nintendo Co., Ltd. Input device
US8554960B2 (en) * 2010-04-30 2013-10-08 Nintend Co., Ltd. Input device
EP2383634A3 (en) * 2010-04-30 2014-06-25 Nintendo Co., Ltd. Input device
US20180329501A1 (en) * 2015-10-30 2018-11-15 Samsung Electronics Co., Ltd. Gesture sensing method and electronic device supporting same
US20200073481A1 (en) * 2017-01-12 2020-03-05 Sony Corporation Information processing apparatus, information processing method, and program
US11209908B2 (en) * 2017-01-12 2021-12-28 Sony Corporation Information processing apparatus and information processing method
US11422625B2 (en) * 2019-12-31 2022-08-23 Human Mode, L.L.C. Proxy controller suit with optional dual range kinematics
US20240028136A1 (en) * 2022-07-22 2024-01-25 Asustek Computer Inc. Electronic device and sensitivity adjustment method for sensor

Also Published As

Publication number Publication date
EP2291819A2 (en) 2011-03-09
RU2010153354A (en) 2012-07-10
TW201002400A (en) 2010-01-16
EP2291819A4 (en) 2015-03-04
MX2010013570A (en) 2011-02-24
JP2011526192A (en) 2011-10-06
WO2009158628A2 (en) 2009-12-30
CN102077234A (en) 2011-05-25
RU2504008C2 (en) 2014-01-10
BRPI0915060A2 (en) 2015-10-27
CA2724855A1 (en) 2009-12-30
IL209049A0 (en) 2011-01-31
WO2009158628A3 (en) 2010-05-06
KR20110031925A (en) 2011-03-29

Similar Documents

Publication Publication Date Title
US20090325710A1 (en) Dynamic Selection Of Sensitivity Of Tilt Functionality
US10905950B2 (en) Head-mounted display tracking
JP6422137B2 (en) Perceptual-based predictive tracking for head-mounted displays
US10007349B2 (en) Multiple sensor gesture recognition
CN107111340B (en) Method and system for user interaction in virtual or augmented reality scenes
US9477312B2 (en) Distance based modelling and manipulation methods for augmented reality systems using ultrasonic gloves
US9740284B2 (en) Adjusting content display orientation on a screen based on user orientation
JP5204224B2 (en) Object detection using video input combined with tilt angle information
US8291346B2 (en) 3D remote control system employing absolute and relative position detection
US20120086725A1 (en) System and Method for Compensating for Drift in a Display of a User Interface State
EP2307943B1 (en) Pointer with motion sensing resolved by data merging
US10754418B1 (en) Using body surfaces for placing augmented reality content
CN104956648B (en) Method and apparatus for sensing the orientation of object in space in fixed reference frame
TWI744606B (en) Motion detection system, motion detection method and computer-readable recording medium thereof
CN105723302A (en) Boolean/float controller and gesture recognition system
CN108958465A (en) Walking analysis method, virtual reality interlock method and device
US20210409615A1 (en) Skeletal tracking for real-time virtual effects
JP6603375B2 (en) Information processing system and information processing program
US11061469B2 (en) Head mounted display system and rotation center correcting method thereof
CN117191012A (en) Low-power-consumption outdoor large-scale map AR positioning technical method
WO2023091288A1 (en) Feature similarity scoring of physical environment for augmented reality gameplay
TW202335707A (en) Motion computing system and method for mixed reality
KR20230028661A (en) Virtual extreme golf system

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FILER, ERIC P.;REAS, LOREN DOUGLAS;RUBIO, VASCO;AND OTHERS;REEL/FRAME:021599/0822

Effective date: 20080925

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001

Effective date: 20141014