US8001613B2 - Security using physical objects - Google Patents

Security using physical objects

Info

Publication number: US8001613B2
Application number: US11/426,101
Other versions: US20070300307A1 (en)
Authority: US (United States)
Prior art keywords: security, detected, display, pattern, attributes
Inventor: Duncan
Original Assignee: Microsoft Corp
Current Assignee: Microsoft Technology Licensing LLC (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Application filed by Microsoft Corp
Priority to US11/426,101
Assigned to Microsoft Corporation (assignor: Duncan)
Publication of US20070300307A1
Application granted
Publication of US8001613B2
Assigned to Microsoft Technology Licensing, LLC (assignor: Microsoft Corporation)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/36 User authentication by graphic or iconic representation


Abstract

A password-type security system may be employed using the placement of physical objects as a security pattern that is to be matched before access to secured content is granted. The system may be implemented on a computing system that uses a display that can detect, e.g., via optical circuitry, the visual characteristics of the display surface. The system can visually detect the placement of objects, their orientations, locations, color, printed patterns, etc. The user may define a security pattern as comprising one or more objects placed at locations on the screen, or at a predetermined rotation angle. The outline shape of an object may be treated as a required pattern, such that access to secured content is permitted only if the object having that outline shape is detected on the display surface. Similarly, printed patterns on objects may also be detected and used as part of security patterns.

Description

BACKGROUND
Passwords have become ubiquitous in daily life. Today, it is not uncommon for a person to have to remember dozens of unique words, codes, numbers and phrases to gain access to bank automated teller machines (ATMs), subscription Internet sites, work computers, e-mail programs, cell phone accounts, cable television pay-per-view (and other television parental control features), and a plethora of other secure locations. Many times, these passwords are randomly-generated sequences of letters and numbers that may enhance security, but may also be difficult to remember. Couple that with the equally random account numbers that often go with these services, and it is easy to see why many have resorted to using the same password at different locations, or writing passwords down on a handy piece of paper by the computer. Obviously, such efforts compromise security, undermining the purpose for their existence in the first place.
As technology marches onward, new genres of products offer opportunities for doing things differently. One such technology offers video displays that can see, or optically detect, objects that are placed against or near the display surface. The description below offers password security features that may take advantage of the capabilities of these kinds of displays.
SUMMARY
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features, essential features, or required advantages of the claimed subject matter, nor is it intended to be used in limiting the scope of the appended claims.
Methods are described herein that allow users to define security levels in a computing system that uses the shape and/or layout arrangement of one or more physical objects. Accessing secured content may require the placement of physical objects against a display surface that can detect the objects. The user can request to define any security pattern they wish, and the system may respond by asking the user to place the desired object or objects in the desired pattern on the display. The system may then detect the various visual attributes of the placed object(s), such as their outline shape, position on the display, relative positioning with respect to other objects, rotation orientation, interior design pattern, etc., and may ask the user to select which ones will be used in the security pattern.
The display surface may be a table top configuration, which may be used as a desk, so it may contain other objects not necessarily intended by the user to be security pattern objects. The system, when defining a security pattern, may give the user the option of de-selecting certain objects so that they are ignored in the pattern being created.
When configuring the security pattern, the system may display an attribute menu adjacent to each detected object on the display. The menu can list the various detected attributes, and may give the user the option to check/un-check the individual attributes to allow the user to customize the level of security desired. For example, a user might simply wish to use the object's outline shape for one security pattern, and may wish to use the outline shapes and rotational orientations of a group of objects for another security pattern.
For the attributes that are to be used, the system may also allow the user to define a margin of error for the application of the security pattern. Accordingly, if the security pattern requires that an object having a particular shape be placed at a particular location on the display, the system may be configured to accept placements of the object in slightly different locations (e.g., 5% to the left or right, 10% above, etc.).
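As a concrete illustration of such a margin, the sketch below builds an acceptance region around a required location from per-direction percentage tolerances. The function names, the dictionary of directions, and the choice to measure the percentages against the display dimensions are all assumptions made for illustration; the patent does not fix these details.

```python
def acceptance_region(x, y, margins, display_w, display_h):
    """Acceptance box around a required location (x, y). `margins` gives
    per-direction tolerances as percentages of the display dimensions,
    e.g. {"left": 5, "right": 5, "above": 10}. Measuring percentages
    against display size is an assumed interpretation."""
    return (
        x - display_w * margins.get("left", 0) / 100,   # min_x
        y - display_h * margins.get("above", 0) / 100,  # min_y (screen coords)
        x + display_w * margins.get("right", 0) / 100,  # max_x
        y + display_h * margins.get("below", 0) / 100,  # max_y
    )

def placement_accepted(region, px, py):
    """True if a detected placement (px, py) falls inside the region."""
    min_x, min_y, max_x, max_y = region
    return min_x <= px <= max_x and min_y <= py <= max_y
```

For example, acceptance_region(100, 100, {"left": 5, "right": 5, "above": 10}, 1024, 768) yields a box within which any placement of the object would be accepted.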
Users may find it easier to remember a physical object-based security pattern, and having a physical object associated with the security pattern may allow users to easily recognize how secure (or unsecure) their content is, both of which may give the user greater confidence in the system. A user might not readily understand the difference in security between a five-letter alphanumeric password and a ten-letter one, but the user may easily understand the relative difference between requiring the placement and/or arrangement of five objects versus ten. Additionally, since a physical object cannot be duplicated as easily as an alphanumeric password, a user who temporarily loses the object (e.g., to a child who sneaks the object from a mother's purse) can rest assured that when the object is returned, the security is restored (without having to create a new security pattern). These advantages and others may be realized using some or all of the features described herein.
These and other features will be described in greater detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example computing environment in which features described herein may be implemented.
FIG. 2 illustrates an example optical detection system that may be used as a display to implement features described herein.
FIGS. 3 and 4 illustrate example table embodiments of the display shown in FIG. 2.
FIGS. 5A and 5B illustrate example displays having one or more objects placed on top or against them as a security pattern.
FIG. 6 illustrates an example process employing various features described herein.
FIG. 7 illustrates an example attribute menu that may be displayed for proposed security pattern objects.
FIGS. 8 and 9 illustrate example interior patterns that may be used as part of a security pattern.
DETAILED DESCRIPTION
FIG. 1 illustrates an example of a suitable computing system environment 100 on which the features herein may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the features described herein. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
The features herein are described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that can perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the features may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The features may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to FIG. 1, the exemplary system 100 for implementing features described herein includes a general-purpose computing device in the form of a computer 110 including a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120.
Computer 110 may include a variety of computer readable media. By way of example, and not limitation, computer readable media may include computer storage media and communication media. The system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, may be stored in ROM 131. RAM 132 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
The computer 110 may also include other removable/nonremovable, volatile/nonvolatile computer storage media. By way of example only, FIG. 1 illustrates a hard disk drive 141 that reads from or writes to nonremovable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media. Other removable/nonremovable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 may be connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 may be connected to the system bus 121 by a removable memory interface, such as interface 150.
The drives and their associated computer storage media discussed above and illustrated in FIG. 1 may provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 191 or other type of display device may also be connected to the system bus 121 via an interface, such as a video interface 190. The video interface 190 may be bidirectional, and may receive video input from sensors associated with the monitor 191. For example, the monitor 191 may be touch and/or proximity sensitive, such that contacts to a monitor surface may be used as input data. The input sensors for effecting this could be a capacitive touch sensitive device, an array of resistive contact sensors, an optical sensor or camera, or any other desired sensor to make the monitor 191 touch and/or proximity sensitive. The monitor itself may be optically sensitive, such as the example shown in FIG. 2 and described below. In an alternative arrangement, or in addition, a touch, optical and/or proximity sensitive input system may be separate from monitor 191, and may include a planar surface such as a table top 192 and any applicable sensing systems to make the planar surface touch sensitive, such as camera 193. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.
The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks.
When used in a LAN networking environment, the computer 110 may be connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 may include a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 1 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. Many of the features described herein may be implemented using computer-executable instructions stored on one or more computer-readable media, such as the media described above, for execution on the one or more units that make up processing unit 120.
The computing device shown in FIG. 1 may be incorporated into a system having table display device 200, as shown in FIG. 2. The display device 200 may include a display surface 201, which may be a planar surface such as the table top 192. As described hereinafter, the display surface 201 may also help to serve as a user interface.
The display device 200 may display a computer-generated image on its display surface 201, which allows the device 200 to be used as a display monitor (such as monitor 191) for computing processes, displaying graphical user interfaces, displaying television or other visual images, video games, and the like. The display may be projection-based, and may use a digital light processing (e.g., DLP by Texas Instruments, Inc., Plano, Tex.) technique, or it may be based on other display technologies, such as liquid crystal display (LCD) technology. Where a projection-style display is used, a projector 202 may be used to project light onto the underside of the display surface 201. It may do so directly, or may do so using one or more mirrors. As shown in FIG. 2, the projector 202 in this example projects light for a desired image onto a first reflective surface 203 a, which may in turn reflect light onto a second reflective surface 203 b, which may ultimately reflect that light onto the underside of the display surface 201, causing the surface 201 to emit light corresponding to the desired display.
In addition to being used as an output display for displaying images, the device 200 may also be used as an input-receiving device. As illustrated in FIG. 2, the device 200 may include one or more light emitting devices 204, such as IR light emitting diodes (LEDs), mounted in the device's interior. The light from devices 204 may be projected upwards through the display surface 201, and may reflect off of various objects that are above the display surface 201. For example, one or more objects 205 may be placed in physical contact with the display surface 201. One or more other objects 206 may be placed near the display surface 201, but not in physical contact (e.g., closely hovering). The light emitted from the emitting device(s) 204 may reflect off of these objects, and may be detected by a camera 207, which may be an IR camera if IR light is used. The signals from the camera 207 may then be forwarded to a computing device (e.g., the computer 110 shown in FIG. 1) for processing, which, based on various configurations for various applications, may identify the object and its orientation (e.g. touching or hovering, tilted, partially touching, etc.) based on its shape and the amount/type of light reflected. To assist in identifying the objects 205, 206, the objects may include a reflective pattern, such as a bar code, on their lower surface. To assist in differentiating objects in contact 205 from hovering objects 206, the display surface 201 may include a translucent layer that diffuses emitted light. Based on the amount of light reflected back to the camera 207 through this layer, the associated processing system may determine whether an object is touching the surface 201, and if the object is not touching, a distance between the object and the surface 201. Accordingly, various physical objects (e.g., fingers, elbows, hands, stylus pens, blocks, etc.) may be used as physical control members, providing input to the device 200 (or to an associated computing device).
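As a rough sketch of how the reflected-light measurement just described might drive the touch/hover decision, the Python below classifies a detected blob from a normalized IR intensity value. The threshold and falloff constants are invented for illustration; the patent describes the inference but gives no numbers or algorithm.

```python
# Hypothetical calibration constants; the patent gives no numbers.
TOUCH_INTENSITY = 0.85   # normalized reflected-IR level at surface contact
FALLOFF_PER_MM = 0.04    # assumed attenuation per mm through the diffusing layer

def classify_reflection(intensity: float):
    """Classify a detected blob as touching or hovering from the amount of
    IR light reflected back through the diffusing layer to camera 207.
    Returns ("touch", 0.0) or ("hover", estimated_distance_mm)."""
    if intensity >= TOUCH_INTENSITY:
        return "touch", 0.0
    return "hover", (TOUCH_INTENSITY - intensity) / FALLOFF_PER_MM
```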
The device 200 shown in FIG. 2 is illustrated as using light projection and sensing techniques for the display of data and the reception of input, but other techniques may be used as well. For example, stylus-sensitive displays are currently available for use with Tablet-based laptop computers, and such displays may be used as device 200. Additionally, stylus- and other touch-sensitive displays are available with many personal data assistants (PDAs), and those types of displays may also be used as device 200.
The device 200 is also shown in a substantially horizontal orientation, with the display surface 201 acting as a tabletop. Other orientations may also be used. For example, the device 200 may be oriented to project a display onto any desired surface, such as a vertical wall. Reflective IR light may also be received from any such oriented surface.
FIG. 3 illustrates an example configuration of an implementation of the system shown in FIG. 2, in which device 301 is used as a tabletop display device. FIG. 4 illustrates an overhead view of such a table, around which a number of users 401 may be seated or standing. Each user 401 may wish to interact with the display on the surface of table 301, for example to place and/or touch an object, or to play a party video game. Although the display area of table 301 shown in this example is circular, it may be any desired shape, such as oval, rectangular, square, hexagonal, octagonal, etc.
FIG. 5A illustrates an example of a security pattern that may be used in place of, or together with, alphanumeric passwords to restrict access to a location or content on a computing system. The display 501 may be in a tabletop or horizontal configuration, and may have several objects resting on top (other configurations may also be used, with objects in contact with the display surface). FIG. 5A shows the placement of a star-shaped object 502 on the display 501. The object may be a personal effect of the user, such as a keychain ornament, that the user can easily remember to place on the tabletop display whenever he/she needs to access some particular piece of secured content (e.g., a parental control blocking predetermined program types, such as adult content, from programs viewed on a television or the display 501 itself). The display 501 may detect the outline shape of the object 502, and may require that the object 502 be seen somewhere on the display 501 before granting access to the restricted content (one way outline matching might be implemented is sketched after this paragraph). For added security, the user may configure the display 501 (or the secured content) to require more than just the placement of the object 502 having the appropriate outline. For example, the user may configure the system to require that the object be placed at a predefined location on the display 501 surface, using for example X,Y pixel coordinates. The user may also configure the system to require that the object 502 be placed at a particular angle 503 of rotation from a normal orientation. These options are described further below.
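The patent does not prescribe a particular outline-matching technique. A minimal, dependency-free sketch, assuming the traced outline is available as a list of (x, y) points, compares scale-normalized centroid-distance signatures:

```python
import math

def outline_signature(points, samples=64):
    """Scale-normalized centroid-distance signature of a traced outline,
    given as a list of (x, y) points."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    peak = max(dists) or 1.0
    step = len(dists) / samples
    # Resample to a fixed length and divide by the peak distance so the
    # signature is independent of the object's size on the display.
    return [dists[int(i * step)] / peak for i in range(samples)]

def outlines_match(sig_a, sig_b, tolerance=0.05):
    """True if two signatures differ by less than `tolerance` on average.
    The tolerance value is an assumed tuning constant."""
    mean_diff = sum(abs(a - b) for a, b in zip(sig_a, sig_b)) / len(sig_a)
    return mean_diff < tolerance
```

A real implementation would also need to align the signatures' starting points so that matching is robust to rotation of the object, since rotation is tracked as a separate attribute of the pattern.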
The user may also configure the system to require the placement of more than one object. FIG. 5B illustrates an example in which four objects are placed on the display 501 as a security pattern. In the FIG. 5B example, the security pattern requires that one or more dice 504, a trophy having an oval base 505, and a stapler 506 all be placed on the display 501 before the pattern is satisfied and access is granted to the secured content. As with the keychain 502, these objects may also be required to be placed at the proper locations, orientations, and/or angles before access will be granted to the secured content.
FIG. 6 illustrates a process by which a security pattern may be defined and used. In step 601, the system (e.g., display 501 and/or its underlying computing system) may check to see if a request has been made to register a new security pattern to restrict access to some content (e.g., a television program, a software application, a data file, etc.). The request may be generated in many ways. It may be generated in response to a user's request, for example, to set a parental control limit on a television program source, or to change the security pattern for a previously-secure application or piece of content. Alternatively, the request may be generated automatically, such as when a security-enabled application program is first installed on the display 501's computing system, or when a periodic update is scheduled. The request may also identify the target that is to be locked by the security pattern. The secured content can be any aspect of the system, such as application software, configuration settings, data files, Internet sites, television programs, radio channels, etc.
If a request has been received, the process may move to step 602 and await the user's placement of objects on the display 501. The system may prompt the user with a pop-up display or other message requesting that the user place the proposed security pattern objects on the display. Step 602 may occur prior to, or simultaneously with, the request in step 601.
When the proposed security pattern objects are placed on the display, the system proceeds to step 603 and scans to determine what objects are seen or detected on the display. During this process, the system may trace the outlines of the objects detected on the display, and may measure and record placement and orientation data identifying various aspects (e.g., appearance, outline, location, rotation, etc., described below) of how the objects were arranged, as sketched below.
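A scan of this kind might produce one record per detected object. The data model below is an assumed sketch; the names and fields are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DetectedObject:
    """One object found on the surface: roughly what step 603 might record."""
    outline: List[Tuple[float, float]]        # traced outline, as (x, y) points
    position: Tuple[float, float]             # e.g., X,Y pixel coordinates
    rotation_deg: float                       # offset from the object's normal axis
    interior_pattern: Optional[bytes] = None  # underside capture, if any
    include_in_pattern: bool = True           # user may de-select it in menu 701
```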
The system may display an attribute menu 701 (as illustrated in FIG. 7) for each detected object. The present discussion will digress briefly to discuss the attribute menu in greater detail. A separate attribute menu 701 may be displayed adjacent to each object detected on the display surface in step 602. The attribute menu 701 may display a number of object characteristics that have been detected by the system and may be selected for inclusion in the security pattern being generated, as well as the option to have certain detected objects ignored or excluded from the security pattern being defined. For example, the user may be given an option to de-select a particular object, to indicate that a detected object is not intended to be part of the security pattern. This may be useful, for example, if the user has other miscellaneous objects on the display surface and does not wish to clear off the entire surface to generate the pattern. This may also be useful if inadvertent objects, such as the user's elbow, were placed on the display and detected in step 602.
For objects that the user wishes to include in the security pattern, the user may have an option 703 to indicate that the object's outline shape is an attribute to be used in the security pattern. This may be useful, for example, if the user has a particular object (e.g., the star-shaped keychain ornament 502) that he/she intends to treat as the “key” to the secured content. If this option is selected (e.g., by checking a selection box), then when the security pattern is in use, it will require the presence, somewhere on the display, of the object's outline before granting access to the secured content.
As another option 704, the user may request that the object's interior pattern also be a required attribute of the security pattern. The display may detect visual patterns on the bottom surface of objects resting on the display, and may incorporate those patterns in the security pattern when this option is selected. So, for example, if object 504 is a die, then the pattern shown in FIG. 8 may be the detected interior pattern. If this interior pattern is included in the security pattern, then the secured content will only be unlocked if the object having this pattern (in the FIG. 8 example, the number ‘5’ on the die) is placed on the display surface. The interior pattern may be any printed pattern or surface feature. FIG. 9 shows an example of a bar code that may be affixed to (or carved in) the bottom of stapler 506 and detected. The pattern may be printed using visible ink, or any other form of ink (e.g., invisible to humans) that can be detected by the display system's camera 207.
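As a non-limiting illustration of how such an interior pattern might be verified, the following sketch (in Python) binarizes the stored pattern and the camera capture into equal-sized grids and requires a minimum fraction of matching cells. The grid representation, the function name, and the match threshold are assumptions introduced here for illustration, not details of the disclosure.

def interior_pattern_matches(stored, detected, threshold=0.85):
    """Compare two equal-sized binary grids (lists of 0/1 rows).

    Returns True when the fraction of matching cells meets the
    threshold, allowing for minor capture noise from the camera.
    """
    total = 0
    matched = 0
    for stored_row, detected_row in zip(stored, detected):
        for s, d in zip(stored_row, detected_row):
            total += 1
            if s == d:
                matched += 1
    return total > 0 and matched / total >= threshold

# Example: the five-pip face of a die, captured with one noisy cell.
stored = [[1, 0, 1],
          [0, 1, 0],
          [1, 0, 1]]
detected = [[1, 0, 1],
            [0, 1, 0],
            [1, 0, 0]]
print(interior_pattern_matches(stored, detected))  # True (8 of 9 cells agree)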
As another option 705, the user may request that the current position of the object be used as a required attribute of the security pattern. The location may be expressed in terms of X,Y display (e.g., pixel) coordinates, or any other desired coordinate system. This attribute requires the placement of an object at the specified location, but does not necessarily require the same object that was used in defining the security pattern. For example, if the security pattern only includes object position 705 criteria, without other object-specific criteria such as shape outline 703 and shape interior pattern 704, the placement of any object at the specified location may serve to satisfy the security pattern. This may be useful, for example, if the user does not want to have to remember to bring specific objects to the display to unlock secured content. Instead, the user might simply want to remember that unlocking that content requires placing, for example, any object in the upper-left corner of the display, or placing multiple objects in the shape of a square at some location on the display.
The options may also include a granularity option. The granularity option may specify a margin for error permitted in unlocking the secured content. So, for example, if the object's position 705 is a required part of the security pattern, the user may specify a 5% margin of error that will also satisfy the requirement. In such a case, if the user attempts to unlock content locked by a security pattern that requires placing an object at location 100, 100, but places the object 5% off of the required location (e.g., at coordinates 95, 100; or 95, 95), the system may still treat that as satisfying the requirement. Some programs may automatically include a degree of granularity in some or all of the security pattern's requirements, to allow users some flexibility in use. Some programs may place limitations on the ranges of granularity permitted, such as a guaranteed minimum margin for error to prevent users from requiring too high a degree of precision, or a granularity ceiling to avoid giving so much margin for error that the attribute loses value as a security measure.
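As a hedged sketch of the 5% example above (in Python), the margin is interpreted here as a fraction of each required coordinate; the disclosure leaves the margin's reference frame open, so this choice, like the function name, is only an assumption for illustration.

def position_matches(required, observed, margin=0.05):
    """Check an observed (x, y) placement against a required one,
    allowing each coordinate to be off by the given fractional margin."""
    rx, ry = required
    ox, oy = observed
    return abs(ox - rx) <= margin * rx and abs(oy - ry) <= margin * ry

print(position_matches((100, 100), (95, 100)))  # True: off by exactly 5
print(position_matches((100, 100), (95, 95)))   # True
print(position_matches((100, 100), (90, 100)))  # False: off by 10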
Another option 706 may specify the rotational orientation angle made by the object. Some objects may have a normal, vertical axis, based on their shape, and the rotational angle may indicate how much the object has been rotated away from its normal axis. For example, the star keychain 502 may be defined as upright with one point pointing straight up, and its arrangement in FIGS. 5A and 5B is considered to be rotated by an angle 503-θ. The normal axis may be predetermined as part of shape recognition software used in the system, or the user may define the object's normal axis in a configuration process (e.g., by placing the object on the display and identifying the normal axis by touching two points to define the axis, or by touching a point outside the object through which the normal axis passes if the center of the object is otherwise specified). The rotational offset may also be subject to a granularity option to allow for angular imperfections when the pattern is in use. These are example visual attributes that may be used, and other aspects of an object may be required as well. For example, some implementations may require specific colors, reflectivity/light scattering, patterns of movement or placement in multiple locations and/or orientations over a period of time, radio-frequency identifiers, etc. to be present in an object to satisfy a security pattern requirement.
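A corresponding angular check might look as follows (again an illustrative Python sketch with hypothetical names, not part of the disclosure); the difference is normalized so that angles near 0 and 360 degrees are treated as close together.

def rotation_matches(required_deg, observed_deg, margin_deg=5.0):
    """Check an observed rotation against a required angle, with the
    difference wrapped into [-180, 180) degrees."""
    diff = abs((observed_deg - required_deg + 180.0) % 360.0 - 180.0)
    return diff <= margin_deg

print(rotation_matches(30.0, 33.5))               # True: 3.5 degrees apart
print(rotation_matches(0.0, 358.0))               # True: only 2 degrees apart
print(rotation_matches(0.0, 10.0, margin_deg=5))  # False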
Returning now to the process of FIG. 6, in step 604, the user selects the desired attribute(s) that will be requirements in the security pattern being defined (e.g., by checking selection boxes in the various attribute menus 701), and in step 605, the user may define granularities for the various attributes.
The process may then move to step 606 to determine if a request has been made to unlock content that is secured by a security pattern. The request may be made in any way desired. For example, if the security pattern is registered as restricting a particular video program (e.g., adult video content), the request may be made when a user operates a remote control (e.g., a handheld remote, or using a remote control graphical user interface on display 501) to select the restricted content. The request may also occur automatically, for example, if a user had previously scheduled a program to be recorded before the program was locked by another user, but the program had become locked by the time it was scheduled to air.
If no request is made, the process may return to step 601 to begin anew. If a request is made to access locked content, the process may proceed to step 607, where the user attempting to access the content is prompted to present the required security pattern. The system may continuously scan the display surface and determine whether the required pattern is present, or alternatively, it may simply ask the user to indicate (e.g., by pressing a “Ready” button displayed with the prompt) when he/she is finished arranging objects to present as the security pattern. In some instances, one or more of the required objects may already have been on the display surface, and might only need to be moved to the correct location after the request in order to satisfy the security pattern. For example, if the user is using the display 501 as a desk, he/she may already have objects like coffee mugs, staplers, telephones, computers, pen holders, etc. lying about on the display surface. Some of these may have been designated security objects, and the user might configure a security pattern to simply require moving one of these objects to its proper location after the request to unlock the secured content.
Once the objects are in place, in step 608, the system may scan the display surface to determine whether the required security pattern is present. During this process, the system may simply ignore those objects that do not satisfy any part of the security pattern, looking just for the required attributes. This may allow the user to avoid having to clear off the entire surface of the display 501 in order to present the pattern. Alternatively, the security pattern may be configured to require exactly that: only the objects specified by the pattern may be placed on the display 501 during the pattern verification process. In some situations, applications may define a subset area of the display 501 that must be cleared of extraneous objects and only contain the security pattern objects (i.e., all objects detected in the subset area during verification are compared to the security pattern, and objects that do not satisfy an attribute of the security pattern cause the verification to fail).
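One way to express this verification step is sketched below (Python; the function and callback names are hypothetical). Each required attribute set from the stored pattern must be satisfied by at least one detected object, while detected objects that satisfy no part of the pattern are ignored.

def pattern_satisfied(required_specs, detected_objects, matches):
    """Return True when every required attribute set in the stored
    security pattern is satisfied by some object on the display.

    matches(spec, obj) is an attribute-level comparison callback,
    e.g. combining outline, position, and rotation checks.
    Extraneous detected objects are simply ignored."""
    return all(
        any(matches(spec, obj) for obj in detected_objects)
        for spec in required_specs
    )

The stricter variant described above, in which no extraneous objects may appear (at least within a designated subset area), could instead also require that every detected object satisfy some part of the pattern.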
If the security pattern is present, the system may grant the requested access in step 609. If the security pattern is not present, the system may deny access in step 610. The security pattern process may then return to step 601 to begin anew. Note that the security pattern process may be used in conjunction with traditional alphanumeric passwords, if desired, to provide a higher degree of security.
The security pattern may be stored as a data structure in any of the computer-readable media and storage devices described above. For example, the security pattern may be stored as a file having the contents shown below:
<Security Pattern Name>
<Security Pattern Target>
<Number of Objects>
Object 1:
   <Shape Outline ID>
Object 2:
   <Shape Outline ID>
   <Shape Position>
Object 3:
   <Shape Interior Pattern>
   <Shape Position>
   <Shape Rotation>
The <Security Pattern Name> parameter may be an alphanumeric name given by the user when defining the security pattern. The <Security Pattern Target> may be an identifier of the television program, television channel, Internet site, software application, data file, or other content whose access will require that the user correctly provide the security pattern. The security pattern is not limited to locking just one piece of content, and may instead be used to lock a variety of different pieces of content (e.g., multiple files, applications, programs, etc.).
The <Number of Objects> parameter may identify a number of objects that will need to be presented to satisfy the security pattern. For each object, the security pattern file may identify one or more attributes associated with the object. So, for example, Object 1 might only require that a particular item be placed somewhere on the display, and may list a single <Shape Outline ID> attribute. The <Shape Outline ID> attribute may identify (e.g., by a file pointer, file name, etc.) a source from which the outline of the required shape may be obtained. This may be an image file, such as a *.BMP, *.JPG, *.PDF, etc., and may be generated using a computer-aided design (CAD) or drawing program, such as MICROSOFT VISIO™ (product of Microsoft Corporation, Redmond, Wash.).
Other parameters may be stored as well. For example, Object 2 might also include a <Shape Position> attribute. This attribute may identify one or more pixel coordinates (e.g., 100, 200) on the display that are required to be occupied by the given object, as well as any granularity tolerances (e.g., 5%, 10%, etc.) specified by the user. If multiple objects are included in the security pattern, the position attribute may refer to relative positions among the objects, as opposed to fixed coordinates on the display. For example, the security pattern may indicate that three objects form the vertices of an equilateral triangle, with no requirement of how large or small the triangle must be. So long as the three objects maintain the correct relative position with one another, they may satisfy such an attribute.
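For the equilateral-triangle example, a scale-independent check might compare the three pairwise distances, as in the following illustrative Python sketch (the tolerance value and names are assumptions):

import math

def forms_equilateral_triangle(points, tolerance=0.05):
    """Check whether three (x, y) points form an equilateral triangle
    to within a fractional tolerance, regardless of size or position."""
    d = [math.dist(points[a], points[b]) for a, b in ((0, 1), (1, 2), (0, 2))]
    mean = sum(d) / 3.0
    return all(abs(side - mean) <= tolerance * mean for side in d)

# Any size and placement qualifies, so long as the three sides agree.
print(forms_equilateral_triangle([(0, 0), (4, 0), (2, 3.46)]))  # True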
Other objects, such as Object 3, may include a <Shape Interior Pattern> attribute that may identify the source of a pattern image (akin to the image file identified above) to be matched for the object, and a <Shape Rotation> attribute that may identify a required angle of rotation and associated granularity values.
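Taken together, the stored file might deserialize into a structure along the lines of the following Python sketch; every field name and value here is hypothetical, chosen only to mirror the parameters listed above, and attributes that are not part of a given object's requirements are simply left unset.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ObjectSpec:
    """Attributes required of one object; unused attributes stay None."""
    outline_id: Optional[str] = None            # <Shape Outline ID>: outline image source
    interior_pattern: Optional[str] = None      # <Shape Interior Pattern>: pattern image source
    position: Optional[Tuple[int, int]] = None  # <Shape Position>: pixel coordinates
    position_margin: float = 0.0                # position granularity, as a fraction
    rotation_deg: Optional[float] = None        # <Shape Rotation>: required angle
    rotation_margin_deg: float = 0.0            # angular granularity, in degrees

@dataclass
class SecurityPattern:
    name: str                                   # <Security Pattern Name>
    target: str                                 # <Security Pattern Target>
    objects: List[ObjectSpec] = field(default_factory=list)

# The three-object pattern listed above, expressed in this form.
pattern = SecurityPattern(
    name="den-tv-lock",
    target="channel:203",
    objects=[
        ObjectSpec(outline_id="star_keychain.bmp"),
        ObjectSpec(outline_id="trophy_base.bmp",
                   position=(100, 200), position_margin=0.05),
        ObjectSpec(interior_pattern="die_face_5.bmp",
                   position=(400, 300),
                   rotation_deg=45.0, rotation_margin_deg=5.0),
    ],
)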
Using one or more of the features and approaches described above, users' experiences with securing access to content can be improved. Although the description above provides illustrative examples and sequences of actions, it should be understood that the various examples and sequences may be rearranged, divided, combined and subcombined as desired. For example, steps and features described may be omitted, or additional steps and features may be added. Accordingly, although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (14)

1. A method for controlling access to restricted content in a computing system, comprising the steps of:
registering a security pattern on the computing system, wherein registering the security pattern includes:
detecting a security object proximal the display of the computing system,
detecting physical attributes of the security object including a physical shape of the security object, a physical pattern on the security object, a position of the security object in relation to the display, and an angular orientation of the security object in relation to the display,
generating a list of the physical attributes of the security object in relation to the display,
upon selection, displaying an attribute menu on the display, wherein the attribute menu displays selectable options that comprise the generated list of physical attributes for the security object that is proximal to the display of the computing system, wherein the selectable options are used to determine attributes of the security object to include in the security pattern;
receiving a user selection of one or more of the selectable options from the displayed attribute menu, wherein receiving user selection of the position of the security object includes receiving user specification of a position-based percentage margin of error and receiving user selection of the angular orientation of the security object includes receiving user specification of an angle-based percentage margin of error,
generating the security pattern for granting access to the secured content based on the user selection from the attribute menu, and
registering and storing the security pattern in association with security credentials for a user;
after registering and storing the security pattern in association with security credentials for the user:
generating a prompt that requests security credentials,
detecting an object proximal the display of the computing system,
detecting physical attributes of the detected object in relation to the display,
determining whether any of the detected physical attributes of the detected object matches the security pattern of the security object,
when the detected physical attributes of the detected object match the stored security pattern of the security object, providing the security credentials and causing the computing system to provide access to the restricted content, and
when the detected physical attributes of the detected object do not match the stored security pattern of the security object, maintaining the restriction on the restricted content of the computing system.
2. The method of claim 1, further comprising a matching threshold, wherein determining whether any of the detected physical attributes of the detected object matches the stored security pattern of the security object includes matching according to the threshold.
3. The method of claim 1, further comprising, indicating that the detected object is the security object when any of the detected physical attributes of the detected object matches the stored security pattern of the security object.
4. The method of claim 1, further comprising, indicating that the detected object is not the security object when any of the detected physical attributes of the detected object do not match the stored security pattern of the security object.
5. The method of claim 1, wherein at least one physical attribute of the physical attributes of the security object is a position of the security object in relation to the display, wherein the security object is different than the detected object.
6. A memory having computer executable instructions for controlling access to restricted content in a computing system, the instructions comprising:
registering a security pattern, wherein registering the security pattern includes:
detecting a security object proximal the display,
detecting physical attributes of the security object including a physical shape of the security object, a physical pattern on the security object, a position of the security object in relation to the display, and an angular orientation of the security object in relation to the display,
generating a list of the attributes of the security object in relation to the display,
displaying an attribute menu on the display, wherein the attribute menu displays selectable options that comprise the generated list of attributes of the security object that is proximal to the display of the computing system, wherein the selectable options are used to determine attributes to include in the security pattern;
receiving a user selection of one or more of the selectable options from the displayed attribute menu;
generating the security pattern for granting access to the secured content based on the user selection from the attribute menu, and
registering and storing the security pattern in association with security credentials for a user;
detecting an object proximal the display;
detecting attributes of the detected object in relation to the display;
determining whether any of the detected attributes of the detected object matches the stored security pattern of the security object;
when the detected attributes of the detected object match the stored security pattern of the security object, providing security credentials for a user to access the restricted content; and
when the detected attributes of the detected object do not match the stored security pattern of the security object, maintaining the restriction on the restricted content.
7. The memory of claim 6, further comprising a matching threshold, wherein determining whether any of the detected attributes of the detected object matches the stored security pattern of the security object includes matching according to the threshold.
8. The memory of claim 6, further comprising, indicating that the detected object is the security object when any of the detected attributes of the detected object matches the stored security pattern of the security object.
9. The memory of claim 6, further comprising, indicating that the detected object is not the security object when any of the detected attributes of the detected object do not match the stored security pattern of the security object.
10. The memory of claim 6, wherein at least one attribute of the plurality of attributes of the security object is a position of the security object in relation to the display, wherein the security object is different than the detected object.
11. A system for controlling access to secured content in a computing system, the system comprising:
a processor; and
a computer readable storage medium having computer executable instructions stored thereon, wherein the computer executable instructions are executed by the processor to perform actions, comprising:
registering a security pattern, wherein registering the security pattern includes:
detecting a security object proximal the display,
detecting physical attributes of the security object including a physical shape of the security object, a physical pattern on the security object, a position of the security object in relation to the display, and an angular orientation of the security object in relation to the display,
generating a list of the attributes of the security object in relation to the display,
displaying an attribute menu on the display, wherein the attribute menu displays selectable options that comprise the generated list of physical attributes for the security object that is proximal to the display, wherein the selectable options are used to determine attributes to include in the security pattern;
receiving a user selection of one or more of the selectable options from the displayed attribute menu,
generating the security pattern for granting access to the secured content based on the user selection from the attribute menu, and
registering and storing the security pattern in association with security credentials for a user;
detecting an object proximal the display;
detecting attributes of the detected object in relation to the display;
determining whether any of the detected attributes of the detected object matches the stored security pattern of the security object;
when the detected attributes of the detected object match the stored security pattern of the security object, providing security credentials for a user to access the restricted content; and
when the detected attributes of the detected object do not match the stored security pattern of the security object, maintaining the restriction on the restricted content.
12. The system of claim 11, further comprising a matching threshold, wherein determining whether any of the detected attributes of the detected object matches the stored security pattern of the security object includes matching according to the threshold.
13. The system of claim 11, further comprising, indicating that the detected object is the security object when any of the detected attributes of the detected object matches the stored security pattern of the security object.
14. The system of claim 11, further comprising, indicating that the detected object is not the security object when any of the detected attributes of the detected object do not match the stored security pattern of the security object.
US11/426,101 2006-06-23 2006-06-23 Security using physical objects Expired - Fee Related US8001613B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/426,101 US8001613B2 (en) 2006-06-23 2006-06-23 Security using physical objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/426,101 US8001613B2 (en) 2006-06-23 2006-06-23 Security using physical objects

Publications (2)

Publication Number Publication Date
US20070300307A1 US20070300307A1 (en) 2007-12-27
US8001613B2 true US8001613B2 (en) 2011-08-16

Family

ID=38874947

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/426,101 Expired - Fee Related US8001613B2 (en) 2006-06-23 2006-06-23 Security using physical objects

Country Status (1)

Country Link
US (1) US8001613B2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110025651A1 (en) * 2008-04-01 2011-02-03 Koninklijke Philips Electronics N.V. Pointing device for use on an interactive surface
US20110095992A1 (en) * 2009-10-26 2011-04-28 Aten International Co., Ltd. Tools with multiple contact points for use on touch panel
US20110307952A1 (en) * 2010-06-11 2011-12-15 Hon Hai Precision Industry Co., Ltd. Electronic device with password generating function and method thereof
US20120137259A1 (en) * 2010-03-26 2012-05-31 Robert Campbell Associated file
US8590020B1 (en) * 2007-01-19 2013-11-19 Veronika Orlovskaya Authentication system and method using arrangements of objects
US20150205345A1 (en) * 2014-01-21 2015-07-23 Seiko Epson Corporation Position detection system and control method of position detection system
US10102366B2 (en) 2012-04-25 2018-10-16 Arcanum Technology Llc Fraud resistant passcode entry system
US10579786B2 (en) * 2014-04-02 2020-03-03 Sony Corporation Information processing system
US11048529B2 (en) 2017-11-23 2021-06-29 Research & Business Foundation Sungkyunkwan University Method for user based application grouping under multi-user environment and table top display apparatus for performing the same

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7612786B2 (en) * 2006-02-10 2009-11-03 Microsoft Corporation Variable orientation input mode
US8930834B2 (en) 2006-03-20 2015-01-06 Microsoft Corporation Variable orientation user interface
US8139059B2 (en) 2006-03-31 2012-03-20 Microsoft Corporation Object illumination in a virtual environment
US20070284429A1 (en) * 2006-06-13 2007-12-13 Microsoft Corporation Computer component recognition and setup
US7552402B2 (en) * 2006-06-22 2009-06-23 Microsoft Corporation Interface orientation using shadows
JP4933304B2 (en) 2006-10-16 2012-05-16 キヤノン株式会社 Image processing apparatus, control method thereof, and program
JP5482023B2 (en) * 2009-08-27 2014-04-23 ソニー株式会社 Information processing apparatus, information processing method, and program
US8935767B2 (en) 2010-05-14 2015-01-13 Microsoft Corporation Overlay human interactive proof system and techniques
US8621583B2 (en) * 2010-05-14 2013-12-31 Microsoft Corporation Sensor-based authentication to a computer network-based service
US11354958B2 (en) 2010-06-16 2022-06-07 Delphian Systems, LLC Wireless device enabled locking system having different modalities
EP2583430B1 (en) * 2010-06-16 2019-09-25 Delphian Systems, LLC Wireless device enabled locking system
US9141779B2 (en) 2011-05-19 2015-09-22 Microsoft Technology Licensing, Llc Usable security of online password management with sensor-based authentication
US20140210703A1 (en) * 2013-01-31 2014-07-31 Samsung Electronics Co. Ltd. Method of unlocking and subsequent application launch in portable electronic device via orientation sensing
US10529156B2 (en) 2013-05-20 2020-01-07 Delphian Systems, LLC Access control via selective direct and indirect wireless communications
US20170123622A1 (en) * 2015-10-28 2017-05-04 Microsoft Technology Licensing, Llc Computing device having user-input accessory
US20180081484A1 (en) * 2016-09-20 2018-03-22 Sony Interactive Entertainment Inc. Input method for modeling physical objects in vr/digital
CN206251154U (en) * 2016-12-09 2017-06-13 李权恩 Screen protective shield
CN109670291B (en) * 2017-10-17 2022-08-09 腾讯科技(深圳)有限公司 Verification code implementation method and device and storage medium
KR102511777B1 (en) * 2020-04-21 2023-03-20 로브록스 코포레이션 Systems and methods for accessible computer-user interaction
US11741517B1 (en) 2020-10-26 2023-08-29 Wells Fargo Bank, N.A. Smart table system for document management
US11457730B1 (en) 2020-10-26 2022-10-04 Wells Fargo Bank, N.A. Tactile input device for a touch screen
US11572733B1 (en) 2020-10-26 2023-02-07 Wells Fargo Bank, N.A. Smart table with built-in lockers
US11740853B1 (en) 2020-10-26 2023-08-29 Wells Fargo Bank, N.A. Smart table system utilizing extended reality
US11429957B1 (en) 2020-10-26 2022-08-30 Wells Fargo Bank, N.A. Smart table assisted financial health
US11727483B1 (en) 2020-10-26 2023-08-15 Wells Fargo Bank, N.A. Smart table assisted financial health
US11397956B1 (en) 2020-10-26 2022-07-26 Wells Fargo Bank, N.A. Two way screen mirroring using a smart table
US20220171845A1 (en) * 2020-11-30 2022-06-02 Rovi Guides, Inc. Enhancing intelligence in parental control
US11543931B2 (en) * 2021-01-27 2023-01-03 Ford Global Technologies, Llc Systems and methods for interacting with a tabletop model using a mobile device
US11747936B2 (en) * 2021-07-13 2023-09-05 Novatek Microelectronics Corp. Transmission system, processor, and transmission method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US645364A (en) * 1897-04-20 1900-03-13 John C Riebe Mechanical movement.
WO1999027466A2 (en) * 1997-11-26 1999-06-03 The Government Of The United States Of America, As Represented By The Secretary, Department Of Health And Human Services, The National Institutes Of Health System and method for intelligent quality control of a process
DE19821165A1 (en) * 1998-05-12 1999-11-18 Volkswagen Ag Damper for transferring forces and moments esp. in automobiles
US6795452B2 (en) * 2002-05-31 2004-09-21 Sandbridge Technologies, Inc. Method of tracking time intervals for a communication signal
JP4284595B2 (en) * 2003-05-13 2009-06-24 株式会社セガ Control program for image display device
US20050017709A1 (en) * 2003-07-25 2005-01-27 Honeywell International Inc. Magnetoresistive turbocharger compressor wheel speed sensor

Patent Citations (116)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4817176A (en) * 1986-02-14 1989-03-28 William F. McWhortor Method and apparatus for pattern recognition
US5230063A (en) 1989-03-15 1993-07-20 Sun Microsystems, Inc. Method and apparatus for selecting button function and retaining selected optics on a display
US5252951A (en) 1989-04-28 1993-10-12 International Business Machines Corporation Graphical user interface with gesture recognition in a multiapplication environment
US5434964A (en) 1990-01-25 1995-07-18 Radius Inc. Movement and redimensioning of computer display windows
US7098891B1 (en) 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US5345549A (en) * 1992-10-30 1994-09-06 International Business Machines Corporation Multimedia based security systems
US5463725A (en) 1992-12-31 1995-10-31 International Business Machines Corp. Data processing system graphical user interface which emulates printed material
US6240207B1 (en) 1993-08-11 2001-05-29 Sony Corporation Handwriting input display apparatus having improved speed in changing display of entered handwriting
US5423554A (en) 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5714698A (en) 1994-02-03 1998-02-03 Canon Kabushiki Kaisha Gesture input method and apparatus
US5943164A (en) 1994-11-14 1999-08-24 Texas Instruments Incorporated Curved 3-D object description from single aerial images using shadows
US6445364B2 (en) 1995-11-28 2002-09-03 Vega Vista, Inc. Portable game display and method for controlling same
US5665951A (en) 1996-02-08 1997-09-09 Newman; Gary H. Customer indicia storage and utilization system
US5818450A (en) 1996-03-07 1998-10-06 Toshiba Kikai Kabushiki Kaisha Method of displaying data setting menu on touch input display provided with touch-sensitive panel and apparatus for carrying out the same method
US5804803A (en) 1996-04-02 1998-09-08 International Business Machines Corporation Mechanism for retrieving information using data encoded on an object
US5883626A (en) 1997-03-31 1999-03-16 International Business Machines Corporation Docking and floating menu/tool bar
US5910653A (en) 1997-04-09 1999-06-08 Telxon Corporation Shelf tag with ambient light detector
US6686931B1 (en) * 1997-06-13 2004-02-03 Motorola, Inc. Graphical password methodology for a microprocessor device accepting non-alphanumeric user input
US20010012001A1 (en) 1997-07-07 2001-08-09 Junichi Rekimoto Information input apparatus
US6414672B2 (en) 1997-07-07 2002-07-02 Sony Corporation Information input apparatus
US6247128B1 (en) 1997-07-22 2001-06-12 Compaq Computer Corporation Computer manufacturing with smart configuration methods
US6577330B1 (en) 1997-08-12 2003-06-10 Matsushita Electric Industrial Co., Ltd. Window display device with a three-dimensional orientation of windows
US6181343B1 (en) 1997-12-23 2001-01-30 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6667741B1 (en) 1997-12-24 2003-12-23 Kabushiki Kaisha Sega Enterprises Image generating device and image generating method
US6512507B1 (en) 1998-03-31 2003-01-28 Seiko Epson Corporation Pointing position detection device, presentation system, and method, and computer-readable medium
US6159100A (en) 1998-04-23 2000-12-12 Smith; Michael D. Virtual reality game
US6735625B1 (en) 1998-05-29 2004-05-11 Cisco Technology, Inc. System and method for automatically determining whether a product is compatible with a physical device in a network
US6950534B2 (en) 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US6768419B2 (en) 1998-08-14 2004-07-27 3M Innovative Properties Company Applications for radio frequency identification systems
US6745234B1 (en) 1998-09-11 2004-06-01 Digital:Convergence Corporation Method and apparatus for accessing a remote location by scanning an optical code
US6792452B1 (en) 1998-09-11 2004-09-14 L.V. Partners, L.P. Method for configuring a piece of equipment with the use of an associated machine resolvable code
US6452593B1 (en) 1999-02-19 2002-09-17 International Business Machines Corporation Method and system for rendering a virtual three-dimensional graphical display
US6448964B1 (en) 1999-03-15 2002-09-10 Computer Associates Think, Inc. Graphic object manipulating tool
US6333735B1 (en) 1999-03-16 2001-12-25 International Business Machines Corporation Method and apparatus for mouse positioning device based on infrared light sources and detectors
US6545663B1 (en) 1999-04-19 2003-04-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and input device for controlling the position of an object to be graphically displayed in virtual reality
US6593945B1 (en) 1999-05-21 2003-07-15 Xsides Corporation Parallel graphical user interface
US6910076B2 (en) 1999-06-23 2005-06-21 Intel Corporation Network-based detection and display of product replacement information
US6662365B1 (en) 1999-08-17 2003-12-09 Gateway, Inc. Unified parental locks
US6630943B1 (en) 1999-09-21 2003-10-07 Xsides Corporation Method and system for controlling a complementary user interface on a display surface
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US20010054082A1 (en) 2000-03-15 2001-12-20 Rudolph Richard F. Controlled remote product internet access and distribution
US20050193120A1 (en) 2000-03-16 2005-09-01 Sony Computer Entertainment America Inc. Data transmission protocol and visual display for a networked computer system
US6767287B1 (en) 2000-03-16 2004-07-27 Sony Computer Entertainment America Inc. Computer system and method for implementing a virtual reality environment for a multi-player game
US6672961B1 (en) 2000-03-16 2004-01-06 Sony Computer Entertainment America Inc. Computer system and method of displaying images
US6624833B1 (en) 2000-04-17 2003-09-23 Lucent Technologies Inc. Gesture-based input interface system with shadow detection
US20050153128A1 (en) 2000-06-30 2005-07-14 Selinfreund Richard H. Product packaging including digital data
US6720860B1 (en) * 2000-06-30 2004-04-13 International Business Machines Corporation Password protection using spatial and temporal variation in a high-resolution touch sensitive display
US6791530B2 (en) 2000-08-29 2004-09-14 Mitsubishi Electric Research Laboratories, Inc. Circular graphical user interfaces
US20040046784A1 (en) 2000-08-29 2004-03-11 Chia Shen Multi-user collaborative graphical user interfaces
US7327376B2 (en) 2000-08-29 2008-02-05 Mitsubishi Electric Research Laboratories, Inc. Multi-user collaborative graphical user interfaces
US6990660B2 (en) 2000-09-22 2006-01-24 Patchlink Corporation Non-invasive automatic offsite patch fingerprinting and updating system and method
US6568596B1 (en) 2000-10-02 2003-05-27 Symbol Technologies, Inc. XML-based barcode scanner
US20020154214A1 (en) 2000-11-02 2002-10-24 Laurent Scallie Virtual reality game system using pseudo 3D display driver
US20040051733A1 (en) 2000-12-28 2004-03-18 David Katzir Method and system for parental internet control
US20020109737A1 (en) 2001-02-15 2002-08-15 Denny Jaeger Arrow logic system for creating and operating control systems
US20040141008A1 (en) 2001-03-07 2004-07-22 Alexander Jarczyk Positioning of areas displayed on a user interface
US20020151337A1 (en) 2001-03-29 2002-10-17 Konami Corporation Video game device, video game method, video game program, and video game system
US20040127272A1 (en) 2001-04-23 2004-07-01 Chan-Jong Park System and method for virtual game
US20020180811A1 (en) * 2001-05-31 2002-12-05 Chu Sing Yun Systems, methods, and articles of manufacture for providing a user interface with selection and scrolling
US6965842B2 (en) 2001-06-01 2005-11-15 Sony Corporation User input apparatus
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US20050134578A1 (en) 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
US20030025676A1 (en) 2001-08-02 2003-02-06 Koninklijke Philips Electronics N.V. Sensor-based menu for a touch screen panel
US20030063132A1 (en) 2001-08-16 2003-04-03 Frank Sauer User interface for augmented and virtual reality systems
US7036090B1 (en) 2001-09-24 2006-04-25 Digeo, Inc. Concentric polygonal menus for a graphical user interface
US7148876B2 (en) 2001-10-10 2006-12-12 Wacom Co., Ltd. Input system, program, and recording medium
US20030119576A1 (en) 2001-12-20 2003-06-26 Mcclintic Monica A. Gaming devices and methods incorporating interactive physical skill bonus games and virtual reality games in a shared bonus event
US20050166264A1 (en) 2002-01-08 2005-07-28 Kazuhiro Yamada Content delivery method and content delivery system
US20040005920A1 (en) 2002-02-05 2004-01-08 Mindplay Llc Method, apparatus, and article for reading identifying information from, for example, stacks of chips
US7104891B2 (en) 2002-05-16 2006-09-12 Nintendo Co., Ltd. Game machine and game program for displaying a first object casting a shadow formed by light from a light source on a second object on a virtual game space
US20050122308A1 (en) 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
US20030234773A1 (en) 2002-06-24 2003-12-25 Fujitsu Limited Touch panel device
US20040032409A1 (en) 2002-08-14 2004-02-19 Martin Girard Generating image data
US20040090432A1 (en) 2002-11-01 2004-05-13 Fujitsu Limited, Touch panel device and contact position detection method
US20040119746A1 (en) * 2002-12-23 2004-06-24 Authenture, Inc. System and method for user authentication interface
US20040212617A1 (en) 2003-01-08 2004-10-28 George Fitzmaurice User interface having a placement and layout suitable for pen-based computers
US6847856B1 (en) 2003-08-29 2005-01-25 Lucent Technologies Inc. Method for determining juxtaposition of physical components with use of RFID tags
US20050054392A1 (en) 2003-09-04 2005-03-10 Too Yew Teng Portable digital device orientation
US20050069186A1 (en) 2003-09-30 2005-03-31 Konica Minolta Meical & Graphic, Inc. Medical image processing apparatus
US20050253872A1 (en) 2003-10-09 2005-11-17 Goss Michael E Method and system for culling view dependent visual data streams for a virtual environment
US20050183035A1 (en) 2003-11-20 2005-08-18 Ringel Meredith J. Conflict resolution for graphic multi-user interface
US20050110781A1 (en) 2003-11-25 2005-05-26 Geaghan Bernard O. Light emitting stylus and user input device using same
US7085590B2 (en) 2003-12-31 2006-08-01 Sony Ericsson Mobile Communications Ab Mobile terminal with ergonomic imaging functions
US20050146508A1 (en) 2004-01-06 2005-07-07 International Business Machines Corporation System and method for improved user input on personal computing devices
US20050162402A1 (en) 2004-01-27 2005-07-28 Watanachote Susornpol J. Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
US20050177054A1 (en) 2004-02-10 2005-08-11 Dingrong Yi Device and process for manipulating real and virtual objects in three-dimensional space
US7483015B2 (en) 2004-02-17 2009-01-27 Aruze Corp. Image display system
US20050200291A1 (en) 2004-02-24 2005-09-15 Naugler W. E.Jr. Method and device for reading display pixel emission and ambient luminance levels
US7397464B1 (en) * 2004-04-30 2008-07-08 Microsoft Corporation Associating application states with a physical object
US20050248729A1 (en) 2004-05-04 2005-11-10 Microsoft Corporation Selectable projector and imaging modes of display table
US20050251800A1 (en) * 2004-05-05 2005-11-10 Microsoft Corporation Invoking applications with virtual objects on an interactive display
WO2005122557A1 (en) 2004-06-04 2005-12-22 Thomson Licensing Method and apparatus for controlling an apparatus having a parental control function
US20050277071A1 (en) 2004-06-14 2005-12-15 Microsoft Corporation Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface
US20050275622A1 (en) * 2004-06-14 2005-12-15 Patel Himesh G Computer-implemented system and method for defining graphics primitives
US20050280631A1 (en) 2004-06-17 2005-12-22 Microsoft Corporation Mediacube
US20060015501A1 (en) * 2004-07-19 2006-01-19 International Business Machines Corporation System, method and program product to determine a time interval at which to check conditions to permit access to a file
US20060017709A1 (en) 2004-07-22 2006-01-26 Pioneer Corporation Touch panel apparatus, method of detecting touch area, and computer product
US20060026535A1 (en) 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20060161871A1 (en) 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060075250A1 (en) 2004-09-24 2006-04-06 Chung-Wen Liao Touch panel lock and unlock function and hand-held device
US20060077211A1 (en) 2004-09-29 2006-04-13 Mengyao Zhou Embedded device with image rotation
US20080192005A1 (en) 2004-10-20 2008-08-14 Jocelyn Elgoyhen Automated Gesture Recognition
US20060090078A1 (en) * 2004-10-21 2006-04-27 Blythe Michael M Initiation of an application
US20060119541A1 (en) * 2004-12-02 2006-06-08 Blythe Michael M Display system
US20060156249A1 (en) 2005-01-12 2006-07-13 Blythe Michael M Rotate a user interface
US20060244719A1 (en) 2005-04-29 2006-11-02 Microsoft Corporation Using a light pointer for input on an interactive display surface
US20060244734A1 (en) 2005-05-02 2006-11-02 Douglas Hill Large scale touch system and methods for interacting with same
US20070063981A1 (en) 2005-09-16 2007-03-22 Galyean Tinsley A Iii System and method for providing an interactive interface
US20070188518A1 (en) 2006-02-10 2007-08-16 Microsoft Corporation Variable orientation input mode
US7612786B2 (en) 2006-02-10 2009-11-03 Microsoft Corporation Variable orientation input mode
US20070220444A1 (en) 2006-03-20 2007-09-20 Microsoft Corporation Variable orientation user interface
US20070236485A1 (en) 2006-03-31 2007-10-11 Microsoft Corporation Object Illumination in a Virtual Environment
US20070284429A1 (en) 2006-06-13 2007-12-13 Microsoft Corporation Computer component recognition and setup
US20070300182A1 (en) 2006-06-22 2007-12-27 Microsoft Corporation Interface orientation using shadows
US20080040692A1 (en) 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US7469381B2 (en) 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display

Non-Patent Citations (29)

* Cited by examiner, † Cited by third party
Title
Elzabadani et al., "Self-Sensing Spaces: Smart Plugs for Smart Environments", http://www.icta.ufl.edu/projects/publications/2005-ICOST-Selfsensingspaces.pdf, received Apr. 7, 2006, 8 pp.
Hiroshi Sasaki et al., "Hands-Free User Interface for Seamless Collaborative Works in Shared MR Space", 3rd CREST/ISWC Workshop on Advanced Computing and Communicating Techniques for Wearable Information Playing, pp. 84-89, Oct. 31, 2004.
http://www.softsland.com/Natural-Login-Pro.html, Apr. 13, 2006, 3 pp.
Krishna et al., "23.3: Tactile Sensor Based on Piezoelectric Resonance", 2002 IEEE, pp. 1643-1647.
Logitech, "SecureConnect: A Major Leap in the Cordless Desktop Experience", http://www.logitech.com/pub/pdf/bluetooth/secure-connect-whitepaper.pdf, received Apr. 7, 2006, 5 pp.
Microsoft® Paint Version 5.1 Screenshots, Microsoft Corporation, 2007, 2 pp.
Nikitin et al., "Real-Time Simulation of Elastic Objects in Virtual Environments Using Finite Element Method and Precomputed Green's Functions", Eighth Eurographics Workshop on Virtual Environments, 2002, 6 pp.
Shen et al., "DiamondSpin: An Extensible Toolkit for Around-the-Table Interaction", http://hci.stanford.edu/publications/2004/diamondspin/diamondspin.pdf, Apr. 2004, 8 pp.
Stoakley et al., "Virtual Reality on a WIM: Interactive Worlds in Miniature", Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM Press/Addison-Wesley Publishing Co., http://delivery.acm.org/10.1145/230000/223938/p265-stoakley.html?key1=223938&key2=5808034411&coll=GUIDE&dl=GUIDE&CFID=73042672&CFTOKEN=344092262, 1995, 14 pp.
Symantec, "Symantec Discovery: Track hardware/software assets and monitor license compliance throughout a multiplatform IT infrastructure", http://eval.veritas.com/mktginfo/enterprise/fact-sheets/ent-factsheet-discovery-12-2005.en-us.pdf, Dec. 2005, 5 pp.
TouchTable™, Northrop Grumman, http://www.ms.northropgrumman.com/touchtable.index.html, 2006, 2 pp.
TouchTable™, Northrop Grumman, www.northropgrumman.com, 2005, 2 pp.
U.S. Official Action mailed Apr. 14, 2009 in U.S. Appl. No. 11/278,264.
U.S. Official Action mailed Apr. 20, 2011 in U.S. Appl. No. 11/427,684.
U.S. Official Action mailed Dec. 2, 2008 in U.S. Appl. No. 11/278,264.
U.S. Official Action mailed Jan. 12, 2009 in U.S. Appl. No. 11/378,267.
U.S. Official Action mailed Jan. 23, 2009 in U.S. Appl. No. 11/423,883.
U.S. Official Action mailed Jul. 10, 2008 in U.S. Appl. No. 11/423,883.
U.S. Official Action mailed Jul. 9, 2008 in U.S. Appl. No. 11/378,267.
U.S. Official Action mailed Mar. 3, 2008 in U.S. Appl. No. 11/278,264.
U.S. Official Action mailed Mar. 5, 2010 in U.S. Appl. No. 11/378,267.
U.S. Official Action mailed May 12, 2010 in U.S. Appl. No. 11/278,264.
U.S. Official Action mailed May 30, 2008 in U.S. Appl. No. 11/425,843.
U.S. Official Action mailed Nov. 12, 2009 in U.S. Appl. No. 11/278,264.
U.S. Official Action mailed Oct. 27, 2010 in U.S. Appl. No. 11/278,264.
U.S. Official Action mailed Oct. 5, 2009 in U.S. Appl. No. 11/378,267.
U.S. Official Action mailed Oct. 6, 2010 in U.S. Appl. No. 11/427,684.
U.S. Official Action mailed Oct. 7, 2008 in U.S. Appl. No. 11/350,853.
Wilson, "PlayAnywhere: A Compact Interactive Tabletop Projection-Vision System", http://research.microsoft.com/~awilson/papers/Wilson%20PlayAnywhere%20UIST%202005.pdf, Oct. 2005, 10 pp.

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8943563B1 (en) 2007-01-19 2015-01-27 Veronika Orlovskaya Authentication system and method using arrangements of objects
US8590020B1 (en) * 2007-01-19 2013-11-19 Veronika Orlovskaya Authentication system and method using arrangements of objects
US8816961B2 (en) * 2008-04-01 2014-08-26 Koninklijke Philips N.V. Pointing device for use on an interactive surface
US20110025651A1 (en) * 2008-04-01 2011-02-03 Koninklijke Philips Electronics N.V. Pointing device for use on an interactive surface
US20110095992A1 (en) * 2009-10-26 2011-04-28 Aten International Co., Ltd. Tools with multiple contact points for use on touch panel
US20120137259A1 (en) * 2010-03-26 2012-05-31 Robert Campbell Associated file
US9213410B2 (en) * 2010-03-26 2015-12-15 Hewlett-Packard Development Company L.P. Associated file
US20110307952A1 (en) * 2010-06-11 2011-12-15 Hon Hai Precision Industry Co., Ltd. Electronic device with password generating function and method thereof
US10102366B2 (en) 2012-04-25 2018-10-16 Arcanum Technology Llc Fraud resistant passcode entry system
US10572648B2 (en) 2012-04-25 2020-02-25 Arcanum Technology Llc Fraud resistant passcode entry system
US20150205345A1 (en) * 2014-01-21 2015-07-23 Seiko Epson Corporation Position detection system and control method of position detection system
US9639165B2 (en) * 2014-01-21 2017-05-02 Seiko Epson Corporation Position detection system and control method of position detection system
US10114475B2 (en) 2014-01-21 2018-10-30 Seiko Epson Corporation Position detection system and control method of position detection system
US10579786B2 (en) * 2014-04-02 2020-03-03 Sony Corporation Information processing system
US11048529B2 (en) 2017-11-23 2021-06-29 Research & Business Foundation Sungkyunkwan University Method for user based application grouping under multi-user environment and table top display apparatus for performing the same

Also Published As

Publication number Publication date
US20070300307A1 (en) 2007-12-27

Similar Documents

Publication Title
US8001613B2 (en) Security using physical objects
US9171142B2 (en) Arrangements for identifying users in a multi-touch surface environment
RU2589397C2 (en) Authentication graphic gestures
US7552402B2 (en) Interface orientation using shadows
US7574739B2 (en) Password authenticating apparatus, method, and program
US20090106667A1 (en) Dividing a surface of a surface-based computing device into private, user-specific areas
US9736137B2 (en) System and method for managing multiuser tools
US20170109516A1 (en) System and method for authentication with a computer stylus
US8930834B2 (en) Variable orientation user interface
EP1815379B1 (en) Distinctive user identification and authentication for multiple user access to display devices
US7953983B2 (en) Image or pictographic based computer login systems and methods
US7124300B1 (en) Handheld computer system configured to authenticate a user and power-up in response to a single action by the user
AU2006307996B2 (en) Method and system for secure password/PIN input via mouse scroll wheel
US20050183035A1 (en) Conflict resolution for graphic multi-user interface
US20070188478A1 (en) Uniquely identifiable inking instruments
US20070188445A1 (en) Uniquely identifiable inking instruments
JPH11149454A (en) Authenticating device, user authenticating method, card for authenticating user and recording medium
US20050240871A1 (en) Identification of object on interactive display surface by identifying coded pattern
US9557914B2 (en) Electronic device, unlocking method, and non-transitory storage medium
US20050154897A1 (en) Protected access to a secured entity through a randomly selected password requested through an interactive computer controlled display terminal
US20170031588A1 (en) Universal keyboard
Ritter et al. Miba: Multitouch image-based authentication on smartphones
US20170199994A1 (en) Imaging devices and methods for authenticating a user
US11089171B2 (en) Recording medium storing control program and electronic device for controlling use of function
CN107437015B (en) System and method for orientation sensing of objects on electronic devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:., DUNCAN;REEL/FRAME:017855/0564

Effective date: 20060622

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034542/0001

Effective date: 20141014

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20190816