US20090183071A1 - Perspective based tagging and visualization of avatars in a virtual world - Google Patents

Perspective based tagging and visualization of avatars in a virtual world

Info

Publication number
US20090183071A1
US20090183071A1 (application US 11/972,064)
Authority
US
United States
Prior art keywords
user
avatar
information
tagged
virtual world
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/972,064
Other versions
US8495505B2
Inventor
Andrew Bryan Smith
Brian Ronald Bokor
Daniel Edward House
William Bruce Nicol II
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp
Priority to US11/972,064 (granted as US8495505B2)
Assigned to International Business Machines Corporation (assignment of assignors' interest). Assignors: Smith, Andrew Bryan; Bokor, Brian Ronald; House, Daniel Edward; Nicol, William Bruce, II
Publication of US20090183071A1
Priority to US13/900,111 (granted as US9195363B2)
Application granted
Publication of US8495505B2
Priority to US14/884,443 (granted as US9927868B2)
Legal status: Expired - Fee Related
Adjusted expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Definitions

  • one or more GUIs may be presented to the user to modify tagged information and/or tagged information settings for the tagged avatar. Additionally, one or more GUIs may allow the user to add tagged information or define tagged information settings. It should be noted that all of the above GUIs may be combined into one GUI to perform one or more of the above actions.
  • FIG. 4 is an illustration of an example of presentation 400 of tagged information to the user (not shown) when a tagged avatar 402 is or comes within the predetermined proximity range 404 of the user's avatar 406 in accordance with an embodiment of the present invention.
  • the presentation 400 of tagged information may include a visual display 408 or an audible presentation 410 of tagged information.
  • FIG. 4 may also illustrate that the presentation 400 of tagged information (visual display 408 or audible presentation 410) may only occur for the user that previously tagged the information to the tagged avatar 402.
  • the tagged information is typically not delivered or presented to the user of the tagged avatar 402, who may be unaware that his avatar is tagged with information.
  • FIG. 4 also illustrates an example of managing tagged information in accordance with an embodiment of the present invention.
  • a user can configure settings or properties associated with tagged information to add additional details or modify settings and delivery or presentation. For example, as interaction levels between avatars increase and more avatars enter the proximity range 404 of the user's avatar 406, the user may turn off delivery of the tagged information for selected other avatars, such as secondary avatar 412, either because the tagged information is no longer needed or for other reasons.
  • Management of tagged information may be accomplished by the user selecting to manage tagged avatars in a drop down menu, right-clicking a computer pointing device on the other avatar whose tagged information is to be managed or by any other means for selecting or activating features in a computer application, on the Internet or in a virtual world.
  • a GUI or similar mechanism may be presented to permit the user to edit the information, add additional details, modify settings, change delivery properties and the like.
  • the tagged information may also be shared with other users or other users' avatars. As previously discussed with respect to block 121 in FIG. 1, the tagged information may be stored locally in an inventory associated with the user, such as in a file system or memory on the user's computer or on a server running the virtual world. The tagged information and settings may then be shared or given to other users or avatars of other users in the virtual world. This sharing feature may enable teams of users to share information and operate more effectively in the virtual world or environment.
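One way this sharing feature might be realized is to serialize a user's tag entries into a simple interchange format that another user can import into his or her own inventory. The sketch below is a minimal illustration in Java (one of the languages the description names); the line format and method names are assumptions, not part of the patent.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the sharing feature: a user's tag entries are
// serialized to a simple "avatarId|text" line format that another user can
// merge into his or her own inventory. The format is illustrative only.
public class TagSharing {

    /** Export each avatarId -> tagged text pair as one "id|text" line. */
    static String export(Map<String, String> tags) {
        StringBuilder out = new StringBuilder();
        tags.forEach((id, text) -> out.append(id).append('|').append(text).append('\n'));
        return out.toString();
    }

    /** Merge shared entries into the receiving user's inventory. */
    static void importInto(Map<String, String> mine, String shared) {
        for (String line : shared.split("\n")) {
            String[] parts = line.split("\\|", 2);
            if (parts.length < 2) continue;            // skip blank or malformed lines
            mine.putIfAbsent(parts[0], parts[1]);      // keep the receiver's own tags
        }
    }

    public static void main(String[] args) {
        Map<String, String> alice = new LinkedHashMap<>();
        alice.put("avatar-42", "Potential customer; follow up on pricing");
        Map<String, String> bob = new LinkedHashMap<>();
        importInto(bob, export(alice));
        System.out.println(bob);   // bob's inventory now contains alice's tag
    }
}
```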
  • a user's avatar may attend a virtual conference and identify a potential follow-up partnership with another avatar.
  • the user may tag the other avatar with the conference name, company name, and possible partnership information.
  • the tagged information may be presented to the user and the user may immediately recognize the other avatar.
  • a user may tag members of a virtual book club. When the user next encounters another book club avatar, the tagged information is presented, and the user can immediately respond appropriately because the user understands that the other avatar is a member of the book club.
  • a user may tag a potential customer avatar in a virtual space with details pertinent to sales. On a follow-up meeting between the user's avatar and the other avatar, the user may receive a notification of the tagged information via predefined methods (visual/audible) enabling the user to provide better customer relations and feedback.
  • FIG. 5 is a block schematic diagram of an example of a system 500 for perspective based tagging and visualization of avatars in a virtual world in accordance with an embodiment of the present invention.
  • the system 500 may include a module 502 for perspective based tagging and visualization of avatars in a virtual world (hereinafter “custom tagging module 502”).
  • the method 100 may be embodied in or performed by the custom tagging module 502 .
  • the custom tagging module 502 may be part of a virtual world simulation system 504 or program, such as Second Life™ or a similar virtual world system.
  • the virtual world simulation system 504 may be operable on a server 506 which may be accessed by a plurality of users 508 or participants via a network 510 using an Internet browser on a computer system 512 , personal computer or similar device.
  • the network 510 may be the Internet, a private network or other network.
  • Each computer system 512 may be similar to the exemplary computer system 514 and associated components illustrated in FIG. 5 for purposes of explaining the embodiment of the present invention illustrated in FIG. 5 .
  • the custom tagging module 502 may be a self contained system with embedded logic, decision making, state based operations and other functions that may operate in conjunction with a virtual world simulation, such as Second Life™.
  • the self contained system may allow businesses, individuals, services, locations, and the like in the virtual world to interact.
  • the custom tagging module 502 may be stored on a file system 516 or memory of the computer system 514 .
  • the custom tagging module 502 may be accessed from the file system 516 and run on a processor 518 associated with the computer system 514.
  • the custom tagging module 502 may be operable on the server 506 rather than each user's computer system 512 and 514 and may be accessed by the plurality of users 508 or participants via the network 510 using an Internet browser on the computer system 512 or 514 or by similar means.
  • the custom tagging module 502 may include a tagged information management module 520 .
  • the tagged information management module 520 manages the information tagged to other avatars by the user of the computer system 514 .
  • the tagged information management module 520 allows presentation of the tagged information for a tagged avatar when the tagged avatar comes within the user's avatar's proximity range.
  • the tagged information management module 520 may retrieve tagged information and associated settings from a user's inventory 522 and present the tagged information on a display 524, if visual, or via a speaker 526 or speakers, if auditory, in accordance with the tagged information presentation settings. Also, the tagged information management module 520 collects the tagged information after the tagged information has been entered by the user and places the tagged information into the user's inventory 522.
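A minimal sketch of this retrieve-and-present flow might look as follows. The interface and method names are illustrative assumptions; the patent does not define a programming API for the tagged information management module 520.

```java
import java.util.Map;

// Hypothetical sketch of the retrieve-and-present flow of the tagged
// information management module: look the tag up in the user's inventory and
// route it to a visual and/or audible presenter. All names are illustrative.
public class TaggedInfoManager {

    interface Presenter { void present(String text); }

    private final Map<String, String> inventory;   // tagged text keyed by avatar id
    private final Presenter display;               // e.g. an on-screen overlay
    private final Presenter speaker;               // e.g. a text-to-speech reader

    TaggedInfoManager(Map<String, String> inventory, Presenter display, Presenter speaker) {
        this.inventory = inventory;
        this.display = display;
        this.speaker = speaker;
    }

    /** Called when a tagged avatar comes within the user's proximity range. */
    void onTaggedAvatarInRange(String avatarId, boolean visual, boolean audible) {
        String text = inventory.get(avatarId);
        if (text == null) return;                  // this avatar was never tagged
        if (visual)  display.present(text);
        if (audible) speaker.present(text);
    }

    public static void main(String[] args) {
        TaggedInfoManager manager = new TaggedInfoManager(
                Map.of("avatar-7", "Co-worker from the design team"),
                text -> System.out.println("[display] " + text),
                text -> System.out.println("[speech]  " + text));
        manager.onTaggedAvatarInRange("avatar-7", true, false);   // visual only
    }
}
```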
  • the custom tagging module 502 may present one or more predetermined graphical user interfaces 528 , similar to the GUIs described with respect to the method 100 of FIG. 1 , to permit perspective based tagging of information to other user's avatars.
  • the tagged information is from the user's perspective relative to the other user's avatar, hence perspective based tagging.
  • the perspective based tagged information may assist the user in identifying the other users' avatars as well as a relevance of the other users' avatars to the user's own avatar in the virtual world.
  • the GUIs 528 may include a GUI to allow the user to enter tagged information and/or tagged information presentation settings. These GUIs 528 may be predetermined and presented in response to the user indicating the user would like to add/or modify information or information presentation settings for another avatar. The predetermined GUIs 528 may be generated by the custom tagging module 502 as described herein and may be presented on the display 524 of the computer system 514 .
  • the custom tagging module 502 may also include a proximity range detection module 530 .
  • the proximity range detection module 530 may determine when an avatar comes within the predetermined proximity range of the user's avatar. When an avatar comes within the predetermined proximity range, the proximity range detection module 530 sends an alert to the custom tagging module 502 to present any tagged information or the tag definition 308 as illustrated in FIG. 3 .
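The alerting behavior of the proximity range detection module 530 resembles a listener pattern: the module watches distance updates and notifies registered listeners, such as the custom tagging module 502, when an avatar crosses into range. The sketch below is an assumption about structure, not the patent's implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical listener-style sketch of the proximity range detection module:
// it watches distance updates and alerts registered listeners (such as the
// custom tagging module) when an avatar crosses into the predetermined range.
public class ProximityDetector {

    interface RangeListener { void avatarEnteredRange(String avatarId); }

    private final double rangeD;                   // predetermined distance D
    private final List<RangeListener> listeners = new ArrayList<>();

    ProximityDetector(double rangeD) { this.rangeD = rangeD; }

    void addListener(RangeListener listener) { listeners.add(listener); }

    /** Invoked by the world engine whenever an avatar's distance to the user changes. */
    void onDistanceUpdate(String avatarId, double previous, double current) {
        if (previous > rangeD && current <= rangeD) {       // just crossed into range
            for (RangeListener listener : listeners) {
                listener.avatarEnteredRange(avatarId);
            }
        }
    }

    public static void main(String[] args) {
        ProximityDetector detector = new ProximityDetector(10.0);
        detector.addListener(id -> System.out.println(id + " entered the range"));
        detector.onDistanceUpdate("avatar-7", 12.0, 8.0);   // fires the alert
    }
}
```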
  • the custom tagging module 502 may also include inventory 522 .
  • the inventory 522 may include tagged information and associated settings for presentation of the tagged information.
  • the inventory may be stored locally on the user's computer or the user's computer readable storage medium.
  • the inventory 522 may also be stored remotely on the network 510 in a database (not shown).
  • the custom tagging module 502 may determine if tagged information exists for an avatar that has entered a user's proximity range by checking the user's inventory 522 for information associated with the avatar.
  • the custom tagging module 502 may further include an options feature 532 .
  • the options feature 532 may include any future enhancements, configurations, and extensions to the existing system or any additions relating to the custom tagging module 502 .
  • a notification system could be added to the custom tagging module 502 which could be configured as an option to provide instant updates or provide notification by some other mechanism.
  • the user computer system may also include one or more input devices, output devices or combination input and output device, collectively I/O devices 534 in FIG. 5 .
  • the I/O devices 534 may include a keyboard, computer pointing device or similar means to control operation of the custom tagging module 502 and to enter information into the various GUIs as described herein.
  • the I/O devices 534 may also include disk drives or devices for reading computer media including computer-readable or computer-operable instructions.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Abstract

A method for perspective based tagging and visualization of avatars in a virtual world may include determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world. The method may also include allowing the user to tag the other avatar with information in response to the other avatar being within the predetermined proximity range of the user's avatar.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to simulations, virtual world simulations of the real-world or real-life or a virtual world and the like, and more particularly to a system and method for perspective based tagging and visualization of avatars in a virtual world.
  • Computer based simulations are becoming more ubiquitous. Simulations may be used for training purposes, for entertainment or for other purposes. Computer simulations such as Second Life™ or similar simulations present a virtual world which allows users or players to be represented by characters known as avatars. Second Life is a trademark of Linden Research, Inc. in the United States, other countries or both. Second Life is an Internet-based virtual world launched in 2003 by Linden Research, Inc. A downloadable client program called the Second Life Viewer enables users, called “Residents”, to interact with others in the virtual world through motional avatars. The virtual world basically simulates the real world or environment. The users or residents, via their avatars, can explore the virtual world, meet other users or residents, socialize, participate in individual and group activities, and create and trade items (virtual property) and services with one another.
  • As peer and social networks expand into virtual world realms, it is becoming increasingly difficult for a user to keep track of avatars that the user has met. Additionally, many avatars in the virtual world look alike, and an avatar's appearance may frequently change. Accordingly, it is difficult for users to remember the identity of an avatar or what interactions, if any, the user has had with a particular avatar based on the visualization of the avatar or the avatar's name. Currently, a user may view an avatar's profile, but this is tedious and depends on the avatar's user entering adequate information into the profile. Further, there is no way to link the identity of a particular avatar (e.g. the name, visual representation, avatar identity, etc.) with the relevance of that avatar to the user's avatar (e.g. co-worker, customer, friend, new acquaintance, interactions the user has had with the avatar, etc.). There is also no way to automatically present the user with customized information about each specific avatar when the avatar comes into contact with the user's avatar.
  • BRIEF SUMMARY OF THE INVENTION
  • In accordance with an aspect of the present invention, a method for perspective based tagging and visualization of avatars in a virtual world may include determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world. The method may also include allowing the user to tag the other avatar with information in response to the other avatar being within the predetermined proximity range of the user's avatar.
  • In accordance with another aspect of the present invention, a method for perspective based tagging and visualization of avatars in a virtual world may include determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world and determining if the user has previously tagged the other avatar with tagged information. The method may also include presenting the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user having previously tagged the other avatar with the tagged information.
  • In accordance with another aspect of the present invention, a method for perspective based tagging and visualization of avatars in a virtual world may include determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world and determining if the user has previously tagged the other avatar with tagged information. The tagged information may include information that is pertinent from a perspective of the user. The method may further include presenting the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user having previously tagged the other avatar with the tagged information. The method may further include permitting the user to perform at least one of the following: tag the other avatar with information that is pertinent to the perspective of the user in the virtual world in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user not having previously tagged the other avatar with information; and edit the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user having previously tagged the other avatar with information.
  • Other aspects and features of the present invention, as defined solely by the claims, will become apparent to those ordinarily skilled in the art upon review of the following non-limiting detailed description of the invention in conjunction with the accompanying figures.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 is a flow chart of an example of a method for perspective based tagging and visualization of avatars in a virtual world in accordance with an embodiment of the present invention.
  • FIG. 2A is an illustration of an example of a portion of a virtual world illustrating a predetermined proximity range of a user's avatar in accordance with an embodiment of the present invention.
  • FIG. 2B is an illustration of an example of another avatar coming within the predetermined proximity range of the user's avatar in the virtual world portion of FIG. 2A to permit perspective based tagging and visualization of the other avatar in accordance with an embodiment of the present invention.
  • FIG. 3 is an illustration of an example of tagging the other avatar within the predetermined proximity range of the user's avatar in accordance with an embodiment of the present invention.
  • FIG. 4 is an illustration of an example of presentation of tagged information to the user when a tagged avatar is or comes within the predetermined proximity range of the user's avatar in accordance with an embodiment of the present invention.
  • FIG. 5 is a block schematic diagram of an example of a system for perspective based tagging and visualization of avatars in a virtual world in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The following detailed description of embodiments refers to the accompanying drawings, which illustrate specific embodiments of the invention. Other embodiments having different structures and operations do not depart from the scope of the present invention.
  • As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
  • Any suitable computer usable or computer readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), or other tangible optical or magnetic storage device; or transmission media such as those supporting the Internet or an intranet. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, radio frequency (RF) or other means.
  • Computer program code for carrying out operations of the present invention may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the present invention may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages, or in functional programming languages, such as Haskell, Standard Meta Language (SML) or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • The present invention is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • FIG. 1 is a flow chart of an example of a method 100 for perspective based tagging and visualization of avatars in a virtual world in accordance with an embodiment of the present invention. In block 110, another avatar is determined to be within the predetermined proximity range of the user's avatar. The other avatar may move into the predetermined proximity range of the user's avatar in the virtual world through actions of the other user controlling the other avatar, and/or the user may move his avatar such that the predetermined proximity range of the user's avatar encompasses the other avatar. Referring also to FIG. 2A, FIG. 2A is an illustration of an example of a portion of a virtual world 200 illustrating a predetermined proximity range 202 in accordance with an embodiment of the present invention. The portion of the virtual world 200 may be presented to the user on a display of a computer device as described herein. The perimeter of the predetermined proximity range 202 is illustrated by a dashed or broken line in FIGS. 2A and 2B and defines an area 204 surrounding the user's avatar 206 in FIGS. 2A and 2B. The predetermined proximity range 202 may be a two-dimensional or three-dimensional range extending a predetermined distance D from the user's avatar in any direction. The predetermined proximity range 202 may be substantially circular as illustrated in FIGS. 2A and 2B or may be a shape selected by the user. FIG. 2A illustrates an avatar 208 of another user that may be approaching the user's avatar 206 from outside of the predetermined proximity range 202.
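As a concrete illustration of the range test, the sketch below checks whether another avatar lies within the predetermined distance D of the user's avatar in three dimensions. This is a minimal Java sketch (records require Java 16 or later) with hypothetical class and method names; the patent does not prescribe any particular implementation.

```java
// Hypothetical sketch of the range test: another avatar is "within range"
// when its distance from the user's avatar is at most the predetermined
// distance D. Class and method names are illustrative only.
public class ProximityRange {

    /** Simple three-dimensional position in the virtual world. */
    record Position(double x, double y, double z) {
        double distanceTo(Position other) {
            double dx = x - other.x, dy = y - other.y, dz = z - other.z;
            return Math.sqrt(dx * dx + dy * dy + dz * dz);
        }
    }

    private final double rangeD;   // predetermined distance D from the user's avatar

    ProximityRange(double rangeD) { this.rangeD = rangeD; }

    /** True when the other avatar lies within distance D in any direction. */
    boolean contains(Position usersAvatar, Position otherAvatar) {
        return usersAvatar.distanceTo(otherAvatar) <= rangeD;
    }

    public static void main(String[] args) {
        ProximityRange range = new ProximityRange(10.0);
        Position user = new Position(0, 0, 0);
        Position other = new Position(3, 4, 0);             // distance 5, inside
        System.out.println(range.contains(user, other));    // prints: true
    }
}
```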
  • Referring also to FIG. 2B, FIG. 2B illustrates the other avatar 208 coming within the predetermined proximity range 202 of the user's avatar 206 either by movement of the other avatar 208, the user's avatar 206 or both. If the other avatar 208 has been previously tagged with information by the user, the tagged information may be presented to the user visually, audibly or both, as described herein, in response to the other avatar coming within the predetermined proximity range. The user may also edit the tagged information. If the other avatar 208 has not been previously tagged with information, the user may be allowed to tag the other avatar 208 with information in response to the other avatar being within the predetermined proximity range of the user's avatar 206, as further described herein.
  • Referring back to FIG. 1, when another avatar is determined to be within the predetermined proximity range 202 of the user's avatar 206 as illustrated in FIG. 2B, the method 100 may advance to block 112. In block 112, a determination may be made whether the user has previously tagged the avatar 208 within the proximity range 202 with custom information. If not, the method 100 may advance to block 114.
  • In block 114, a determination may be made whether the user desires to tag the other avatar 208 that is within the proximity range 202 with information. For example, the user may be presented a graphical user interface to ask the user whether or not the user desires to tag information. Alternatively, the user may indicate his desire to tag information for this avatar by any activation means, such as pressing one or more keyboard keys, clicking a mouse or other computer pointing device, selecting an option on an onscreen menu or the like. If the user does not desire to tag information, then the method may terminate, as shown in block 128.
  • In block 116, in accordance with an embodiment of the invention, at any time the user wishes to tag information to the other avatar 208, the user may select the other avatar 208 using a computer pointing device or the like. Selecting a particular other avatar may be necessary when there are multiple other avatars within the predetermined proximity range 202 of the user's avatar. In this case, the user must select which avatar is to be tagged with the information. The user may select the other avatar 208 by giving some indication that the other avatar 208 is the avatar the user wishes to tag with information. For example, the user may select the other avatar by choosing the other avatar with the user's mouse, by typing the other avatar's name into a graphical user interface or by other means.
  • In block 117, the user may be presented a graphical user interface (GUI) to enter tagged information about the other avatar 208. The user may also be presented a GUI to define various display or presentation options for presenting the tagged information. These GUIs may be presented to the user in separate GUIs or may be combined and presented to the user in one GUI. As described herein, the information may be presented in a visual form, audible form, or both. The tagged information may be presented only to the user that entered the tagged information.
  • In block 118, the user is allowed to enter text and other information about the other avatar 208. The user may enter this information by any means, such as through the GUI as described above. The tagged information may be information that is pertinent from a perspective of the user. For example, the tagged information may include information to identify the other avatar 208 to the user and to present a relevance of the other avatar 208 to the user. The information about the other avatar 208 may include any information, such as the avatar's occupation, the avatar's name, how the user knows the avatar, information obtained by the user based on interactions the user's avatar 206 has had with the other avatar 208, user thoughts about the avatar or any information to define a relevance of the other avatar 208 to the user or the user's avatar 206.
  • In block 119, the user may be allowed to define settings or properties for the presentation of the tagged information to the user. For example, the user may define a format for presenting the tagged information, such as how the information will be visually or audibly presented to the user, a frequency for presenting the tagged information, a duration for presenting the information to the user, and any other properties for presenting the tagged information. Also, the user may define when to present the tagged information to the user. For example, the user may set the tagged information to be delivered to the user every time a tagged avatar enters the proximity range 202 of the user's avatar 206. The user may set the tagged information to be delivered only once per a certain time period (e.g. once per day, week, month, etc.) that the tagged avatar enters the proximity range 202 of the user's avatar 206. Further, the user may desire to set the tagged information to be delivered at the next meeting between the user's avatar 206 and the tagged avatar. The user may also define the display settings to not present the tagged information to the user at all or only on request of the user.
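The timing options of block 119 (every encounter, once per time period, on request only, or never) can be modeled as a small presentation policy, as in the following sketch. The enum and field names are illustrative assumptions, not the patent's terms.

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical sketch of the presentation timing options of block 119:
// every encounter, once per time period, on request only, or never.
public class PresentationSettings {

    enum Mode { EVERY_ENCOUNTER, ONCE_PER_PERIOD, ON_REQUEST_ONLY, NEVER }

    private final Mode mode;
    private final Duration period;      // consulted only for ONCE_PER_PERIOD
    private Instant lastPresented;      // null until the tag is first presented

    PresentationSettings(Mode mode, Duration period) {
        this.mode = mode;
        this.period = period;
    }

    /** Decides whether to show the tag now that the tagged avatar is in range. */
    boolean shouldPresent(Instant now, boolean userRequested) {
        switch (mode) {
            case EVERY_ENCOUNTER: return true;
            case ON_REQUEST_ONLY: return userRequested;
            case NEVER:           return false;
            case ONCE_PER_PERIOD:
                if (lastPresented == null
                        || Duration.between(lastPresented, now).compareTo(period) >= 0) {
                    lastPresented = now;
                    return true;
                }
                return false;
        }
        return false;   // unreachable; keeps the compiler satisfied
    }

    public static void main(String[] args) {
        PresentationSettings s =
                new PresentationSettings(Mode.ONCE_PER_PERIOD, Duration.ofDays(1));
        Instant now = Instant.now();
        System.out.println(s.shouldPresent(now, false));   // true: first time today
        System.out.println(s.shouldPresent(now, false));   // false: already shown
    }
}
```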
  • In block 120, the user may define visual and auditory options or properties for the information tagged to the other avatar 208. Visual options or features the user may set could include a number of symbols, color schemes, or any other visual options which may be desired by the user. Optional text and a legend may be included with the visual options. For example, the user may desire to mark all the user's avatar's co-workers in the virtual world with a certain symbol or optional text so that when any of those avatars come into the proximity range 202 of the user's avatar 206, the user immediately recognizes the other avatar as a co-worker. The audible options or settings allow information tagged to another avatar to be presented to the user via audible means. For example, any text entered by the user and tagged to another avatar may be read to the user via any text-to-voice reader technology in response to the other avatar coming within the predetermined proximity range 202 of the user's avatar 206. Another example includes allowing the user to tag preset sound recordings, such as audible alerts or warnings, any pre-recorded sounds or statements by the user or other available pre-recorded sounds.
  • Referring also to FIG. 3, FIG. 3 is an illustration of an example of tagging another avatar 302 within the predetermined proximity range 304 of the user's avatar 306 in accordance with an embodiment of the present invention. The tagged information and associated settings or properties may be referred to as a tag definition 308 associated with the tagged avatar 302. As described above, the user is allowed to create or define the tag definition 308. The tag definition 308 may include display properties 310 assigned by the user, text 312 assigned or entered by the user to be associated with the tagged avatar 302, visual and/or audible settings 314 assigned by the user, as well as any other information to be tagged to the other or tagged avatar 302 and properties relative to presentation of the tagged information. The tag definition 308 may typically be presented only to the user that entered or created it. The user of the tagged avatar 302 may be unaware of the tagged information or tag definition 308. The tag definition 308 may be stored on a computing device associated with the user, similar to that described with reference to FIG. 5.
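The data grouped in the tag definition 308 might be collected into a single record, as in the hypothetical sketch below (the field names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class TagDefinition:
    """Mirrors tag definition 308: user-entered text plus presentation data."""
    tagged_avatar_id: str     # the avatar 302 the tag is attached to
    owner_id: str             # the tagging user; only this user sees the tag
    text: str = ""            # text 312 entered by the user
    display_properties: dict = field(default_factory=dict)       # properties 310
    visual_audible_settings: dict = field(default_factory=dict)  # settings 314
```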
  • FIG. 3 may also illustrate an example of delivery or presentation of the tagged information corresponding to the tag definition 308 in accordance with an embodiment of the present invention. Accordingly, when the tagged avatar 302 enters the predetermined proximity range 304, the tag definition 308 associated with the previously tagged avatar 302 may be presented to the user. The user may also be permitted to edit the information and any associated settings or properties in the tag definition 308.
  • Referring back to FIG. 1, in block 121, any tagged information or tagged information settings or properties may be stored in a database or inventory associated with the user. Each database or inventory may be specific to each user and may be stored locally on the user's computer readable storage medium or remotely on a server.
  • Referring back to block 112, if the other avatar 208 (FIG. 2B) is within the proximity range 202 of the user's avatar 206 and the user has previously tagged information to the other avatar 208, the method 100 may advance to block 122. In block 122, a determination may be made whether the user's previously entered settings allow presentation of the tagged information to the user. Recall that the user may enter properties to limit presentation of the tagged information under certain circumstances. For example, the tagged information may be presented only once per a certain time period, only on request of the user, or based on some other parameter. If the user settings allow presentation of the tagged information in block 122, the method 100 may advance to blocks 123 and 124. If not, the method may advance to block 126.
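A minimal sketch of the block 122 check, assuming (for the example only) that the delivery rules are encoded as strings and that the time of the last presentation is tracked per tag:

```python
import time
from typing import Optional

def may_present(delivery: str, last_shown: Optional[float],
                period_seconds: float, user_requested: bool) -> bool:
    """Block 122: do the user's previously entered settings allow delivery now?"""
    if delivery == "never":
        return False
    if delivery == "on_request":
        return user_requested
    if delivery == "once_per_period":
        # Show only if never shown before, or the period has elapsed.
        return last_shown is None or time.time() - last_shown >= period_seconds
    return True  # "every_encounter" (the default) always allows delivery
```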
  • In block 123, the tagged information and any tagged information settings relating to the tagged avatar may be retrieved from the user's inventory, defined above with regard to block 121.
  • In block 124, the tagged information of the tagged avatar may be presented to the user in accordance with the tagged information settings that the user previously defined. If no tagged information settings exist, or certain settings have not been defined or retrieved, the tagged information may be displayed to the user according to default display settings. As previously described, the tagged information may be visual, auditory, or both.
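One way the default-settings fallback of block 124 could work is a simple merge, sketched here with hypothetical default values:

```python
from typing import Optional

DEFAULT_SETTINGS = {"visual": True, "audible": False, "duration_seconds": 10}

def effective_settings(user_settings: Optional[dict]) -> dict:
    """Fill in any undefined settings from the defaults; user values win."""
    merged = dict(DEFAULT_SETTINGS)
    merged.update(user_settings or {})
    return merged
```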
  • In block 126, a determination may be made whether the user desires to modify and/or add any tagged information and/or settings for the tagged avatar. If the user does not desire to edit the tagged information or settings, the method 100 may end at termination 128. However, if the user desires to perform one or more of the above actions, the method 100 may advance to block 130. The user may indicate the desire to edit the tagged information and settings by any means, such as entering such information in a GUI, selecting the tagged avatar, clicking the mouse, clicking an on-screen pull-down menu, pressing one or more keys on the keyboard, or taking another action.
  • In block 130, one or more GUIs may be presented to the user to modify tagged information and/or tagged information settings for the tagged avatar. Additionally, one or more GUIs may allow the user to add tagged information or define tagged information settings. It should be noted that all of the above GUIs may be combined into one GUI to perform one or more of the above actions.
  • FIG. 4 is an illustration of an example of presentation 400 of tagged information to the user (not shown) when a tagged avatar 402 is or comes within the predetermined proximity range 404 of the user's avatar 406 in accordance with an embodiment of the present invention. As previously discussed with regard to block 120, the presentation 400 of tagged information may include a visual display 408 or an audible presentation 410 of tagged information.
  • As previously discussed, FIG. 4 may also illustrate that the presentation 400 of tagged information (visual display 408 or audible presentation 410) may occur only for the user that previously tagged the information to the tagged avatar 402. The tagged information is typically not delivered or presented to the user of the tagged avatar 402, and the user of the tagged avatar 402 may be unaware that his avatar is tagged with information.
  • FIG. 4 also illustrates an example of managing tagged information in accordance with an embodiment of the present invention. As previously discussed, a user can configure settings or properties associated with tagged information to add additional details or modify settings and delivery or presentation. For example, as interaction levels between avatars increase and more avatars enter the proximity range 404 of the user's avatar 406, the user may turn off delivery of the tagged information for selected other avatars, such as secondary avatar 412, either because the tagged information is no longer needed or for other reasons. Management of tagged information may be accomplished by the user selecting to manage tagged avatars in a drop-down menu, right-clicking a computer pointing device on the other avatar whose tagged information is to be managed, or by any other means for selecting or activating features in a computer application, on the Internet, or in a virtual world. A GUI or similar mechanism may be presented to permit the user to edit the information, add additional details, modify settings, change delivery properties, and the like.
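Turning off delivery for a selected avatar, as in the secondary avatar 412 example, could be as simple as the hypothetical helper below (assuming, as in the other sketches, an inventory dict keyed by avatar identifier):

```python
def mute_tag(inventory: dict, avatar_id: str) -> None:
    """Stop delivering a tag without deleting it (e.g., for avatar 412)."""
    if avatar_id in inventory:
        inventory[avatar_id].setdefault("settings", {})["delivery"] = "never"
```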
  • The tagged information may also be shared with other users or other users' avatars. As previously discussed with respect to block 121 in FIG. 1, the tagged information may be stored locally in an inventory associated with the user, such as in a file system or memory on the user's computer or on the server running the virtual world. The tagged information and settings may then be shared or given to other users or avatars of other users in the virtual world. This sharing feature may enable teams of users to share information and operate more effectively in the virtual world or environment.
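A sketch of how sharing might work if the inventory were serialized for transfer; JSON and the merge policy (the local user's own tags win on conflict) are assumptions made for the example:

```python
import json

def export_tags(inventory: dict) -> str:
    """Serialize a user's tag inventory (block 121) to share with a teammate."""
    return json.dumps(inventory)

def import_tags(own_inventory: dict, shared: str) -> dict:
    """Merge a teammate's shared tags; the local user's own tags take priority."""
    incoming = json.loads(shared)
    return {**incoming, **own_inventory}
```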
  • The following are exemplary embodiments of the present invention. First, a user's avatar may attend a virtual conference and identify a potential follow-up partnership with another avatar. The user may tag the other avatar with the conference name, company name, and possible partnership information. When the user's avatar next encounters the other avatar, as previously described, the tagged information may be presented to the user and the user may immediately recognize the other avatar. Second, a user may tag members of a virtual book club. When the user next encounters another book club avatar, the user can immediately respond appropriately because the tagged information was presented and the user thus understands that the other avatar is a member of the book club. Third, a user may tag a potential customer avatar in a virtual space with details pertinent to sales. At a follow-up meeting between the user's avatar and the other avatar, the user may receive a notification of the tagged information via predefined methods (visual and/or audible), enabling the user to provide better customer relations and feedback.
  • FIG. 5 is a block schematic diagram of an example of a system 500 for perspective based tagging and visualization of avatars in a virtual world in accordance with an embodiment of the present invention. The system 500 may include a module 502 for perspective based tagging and visualization of avatars in a virtual world (hereinafter “custom tagging module 502”). The method 100 may be embodied in or performed by the custom tagging module 502.
  • The custom tagging module 502 may be part of a virtual world simulation system 504 or program, such as Second Life™ or similar virtual world system. The virtual world simulation system 504 may be operable on a server 506 which may be accessed by a plurality of users 508 or participants via a network 510 using an Internet browser on a computer system 512, personal computer or similar device. The network 510 may be the Internet, a private network or other network. Each computer system 512 may be similar to the exemplary computer system 514 and associated components illustrated in FIG. 5 for purposes of explaining the embodiment of the present invention illustrated in FIG. 5.
  • The custom tagging module 502 may be a self-contained system with embedded logic, decision making, state-based operations, and other functions that may operate in conjunction with a virtual world simulation, such as Second Life™. The self-contained system may allow businesses, individuals, services, locations, and the like in the virtual world to interact. The custom tagging module 502 may be stored on a file system 516 or memory of the computer system 514. The custom tagging module 502 may be accessed from the file system 516 and run on a processor 518 associated with the computer system 514.
  • In accordance with another embodiment of the present invention, the custom tagging module 502 may be operable on the server 506 rather than each user's computer system 512 and 514 and may be accessed by the plurality of users 508 or participants via the network 510 using an Internet browser on the computer system 512 or 514 or by similar means.
  • The custom tagging module 502 may include a tagged information management module 520. The tagged information management module 520 manages the information tagged to other avatars by the user of the computer system 514. For example, the tagged information management module 520 allows presentation of the tagged information for a tagged avatar when the tagged avatar comes within the user's avatar's proximity range.
  • The tagged information management module 520 may retrieve tagged information and associated settings from a user's inventory 522 and present the tagged information on a display 524, if visual, or via a speaker 526 or speakers, if auditory, in accordance with the tagged information presentation settings. Also, the tagged information management module 520 collects the tagged information after it has been entered by the user and places it into the user's inventory 522.
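Putting these pieces together, the module 520 flow might look like the following sketch (a hypothetical composition; the print calls stand in for the display 524 and speaker 526):

```python
def handle_proximity_event(inventory: dict, avatar_id: str) -> None:
    """Fetch a tag and its settings from inventory 522, then present it."""
    tag = inventory.get(avatar_id)
    if tag is None:
        return  # untagged avatar: nothing to present
    settings = tag.get("settings", {})
    if settings.get("visual", True):
        print(f"[DISPLAY 524] {tag.get('text', '')}")
    if settings.get("audible", False):
        print(f"[SPEAKER 526] {tag.get('text', '')}")  # stand-in for text-to-voice
```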
  • The custom tagging module 502 may present one or more predetermined graphical user interfaces 528, similar to the GUIs described with respect to the method 100 of FIG. 1, to permit perspective based tagging of information to other users' avatars. As previously discussed, the tagged information is from the user's perspective relative to the other user's avatar, hence perspective based tagging. The perspective based tagged information may assist the user in identifying the other users' avatars as well as a relevance of the other users' avatars to the user's own avatar in the virtual world.
  • The GUIs 528 may include a GUI to allow the user to enter tagged information and/or tagged information presentation settings. These GUIs 528 may be predetermined and presented in response to the user indicating that the user would like to add and/or modify information or information presentation settings for another avatar. The predetermined GUIs 528 may be generated by the custom tagging module 502 as described herein and may be presented on the display 524 of the computer system 514.
  • The custom tagging module 502 may also include a proximity range detection module 530. The proximity range detection module 530 may determine when an avatar comes within the predetermined proximity range of the user's avatar. When an avatar comes within the predetermined proximity range, the proximity range detection module 530 sends an alert to the custom tagging module 502 to present any tagged information or the tag definition 308 as illustrated in FIG. 3.
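The proximity test itself could be a Euclidean distance check, sketched below for 3-D coordinates (the patent does not specify the geometry; this is an assumption for the example):

```python
import math

def within_range(user_pos, other_pos, proximity_range: float) -> bool:
    """Proximity range detection module 530: is the other avatar in range?"""
    return math.dist(user_pos, other_pos) <= proximity_range

# Example: a predetermined range of 5.0 units around the user's avatar.
assert within_range((0, 0, 0), (3, 4, 0), 5.0)       # distance 5.0: in range
assert not within_range((0, 0, 0), (3, 4, 1), 5.0)   # distance ~5.1: out of range
```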
  • The custom tagging module 502 may also include the inventory 522. The inventory 522 may include tagged information and associated settings for presentation of the tagged information. The inventory 522 may be stored locally on the user's computer or the user's computer readable storage medium, or remotely on the network 510 in a database (not shown). The custom tagging module 502 may determine if tagged information exists for an avatar that has entered a user's proximity range by checking the user's inventory 522 for information associated with the avatar.
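Checking the inventory for an entering avatar could then be a plain key lookup, assuming (as in the earlier sketches) a dict keyed by avatar identifier:

```python
def lookup_tag(inventory: dict, avatar_id: str):
    """Return the tag for an avatar in inventory 522, or None if untagged."""
    return inventory.get(avatar_id)

inventory = {"avatar-208": {"text": "Co-worker; met at the virtual conference"}}
assert lookup_tag(inventory, "avatar-208") is not None  # tagged: present info
assert lookup_tag(inventory, "avatar-999") is None      # untagged: allow tagging
```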
  • The custom tagging module 502 may further include an options feature 532. The options feature 532 may include any future enhancements, configurations, and extensions to the existing system or any additions relating to the custom tagging module 502. As an example, a notification system could be added to the custom tagging module 502 which could be configured as an option to provide instant updates or provide notification by some other mechanism.
  • The user computer system may also include one or more input devices, output devices or combination input and output device, collectively I/O devices 534 in FIG. 5. The I/O devices 534 may include a keyboard, computer pointing device or similar means to control operation of the custom tagging module 502 and to enter information into the various GUIs as described herein. The I/O devices 534 may also include disk drives or devices for reading computer media including computer-readable or computer-operable instructions.
  • The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art appreciate that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown and that the invention has other applications in other environments. This application is intended to cover any adaptations or variations of the present invention. The following claims are in no way intended to limit the scope of the invention to the specific embodiments described herein.

Claims (20)

1. A method for perspective based tagging and visualization of avatars in a virtual world, comprising:
determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world; and
allowing the user to tag the other avatar with information in response to the other avatar being within the predetermined proximity range of the user's avatar.
2. The method of claim 1, further comprising:
determining if the user has previously tagged the other avatar with the information in response to the other avatar coming within the predetermined proximity range of the user's avatar; and
presenting the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and in response to the user having previously tagged the other avatar with the tagged information.
3. The method of claim 2, wherein presenting the tagged information comprises presenting the tagged information only to the user.
4. The method of claim 2, wherein presenting the tagged information comprises presenting the tagged information in at least one of a visual form and an audible form.
5. The method of claim 2, further comprising presenting an interface for the user to modify any tagged information previously entered by the user for the other avatar.
6. The method of claim 1, further comprising:
allowing the user to select the other avatar for tagging the information; and
presenting an interface for the user to enter the information to be tagged to the other avatar in response to the user selecting the other avatar.
7. The method of claim 1, further comprising allowing the user to define properties for presenting the tagged information.
8. The method of claim 7, wherein allowing the user to define properties for presenting the tagged information comprises:
allowing the user to define a format for presenting the tagged information to the user;
allowing the user to define a frequency for presenting the tagged information; and
allowing the user to define a duration for presenting the tagged information.
9. The method of claim 1, further comprising allowing the user to define at least one of visual and auditory information for the tagged information.
10. The method of claim 1, further comprising storing the tagged information and associated settings locally in an inventory associated with the user.
11. The method of claim 10, further comprising allowing the tagged information and associated settings to be shared with another user and the other user's avatar.
12. The method of claim 1, wherein allowing the user to tag the other avatar with information comprises allowing the user to tag the other avatar with information that is pertinent from a perspective of the user.
13. The method of claim 1, wherein allowing the user to tag the other avatar with information comprises allowing the user to tag the other avatar with information to identify the other avatar to the user and to present a relevance of the other avatar to the user's avatar in the virtual world.
14. The method of claim 1, further comprising allowing a management entity to update and modify the tagged information.
15. A method for perspective based tagging and visualization of avatars in a virtual world, comprising:
determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world;
determining if the user has previously tagged the other avatar with tagged information; and
presenting the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user having previously tagged the other avatar with the tagged information.
16. The method of claim 15, further comprising:
permitting the user to tag the other avatar with information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user not having previously tagged the other avatar with information.
17. The method of claim 15, further comprising:
allowing the user to define properties for presenting the tagged information.
18. The method of claim 15, further comprising allowing the user's avatar to transfer the tagged information for the other avatar to any further avatar in the virtual world.
19. A method for perspective based tagging and visualization of avatars in a virtual world, comprising:
determining if another avatar has moved within a predetermined proximity range of a user's avatar in a virtual world;
determining if the user has previously tagged the other avatar with tagged information, wherein the tagged information includes information that is pertinent from a perspective of the user; and
presenting the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user having previously tagged the other avatar with the tagged information; and
permitting the user to perform at least one of:
tag the other avatar with information that is pertinent to the perspective of the user in the virtual world in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user not having previously tagged the other avatar with information; and
edit the tagged information in response to the other avatar coming within the predetermined proximity range of the user's avatar and the user having previously tagged the other avatar with information.
20. The method of claim 19, further comprising permitting the user to tag the other avatar with information related to an identification of the other avatar and a relevance of the other avatar to the user's avatar in the virtual world.
US11/972,064 2008-01-10 2008-01-10 Perspective based tagging and visualization of avatars in a virtual world Expired - Fee Related US8495505B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/972,064 US8495505B2 (en) 2008-01-10 2008-01-10 Perspective based tagging and visualization of avatars in a virtual world
US13/900,111 US9195363B2 (en) 2008-01-10 2013-05-22 Perspective based tagging and visualization of avatars in a virtual world
US14/884,443 US9927868B2 (en) 2008-01-10 2015-10-15 Perspective based tagging and visualization of avatars in a virtual world

Publications (2)

Publication Number Publication Date
US20090183071A1 (en) 2009-07-16
US8495505B2 US8495505B2 (en) 2013-07-23

Family

ID=40851760

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/972,064 Expired - Fee Related US8495505B2 (en) 2008-01-10 2008-01-10 Perspective based tagging and visualization of avatars in a virtual world
US13/900,111 Expired - Fee Related US9195363B2 (en) 2008-01-10 2013-05-22 Perspective based tagging and visualization of avatars in a virtual world
US14/884,443 Active 2028-07-18 US9927868B2 (en) 2008-01-10 2015-10-15 Perspective based tagging and visualization of avatars in a virtual world


Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6366285B1 (en) * 1997-11-21 2002-04-02 International Business Machines Corporation Selection by proximity with inner and outer sensitivity ranges
US6396509B1 (en) * 1998-02-21 2002-05-28 Koninklijke Philips Electronics N.V. Attention-based interaction in a virtual environment
US6954902B2 (en) * 1999-03-31 2005-10-11 Sony Corporation Information sharing processing method, information sharing processing program storage medium, information sharing processing apparatus, and information sharing processing system
US7036082B1 (en) * 2000-09-21 2006-04-25 Nortel Networks Limited Controlling communications through a virtual reality environment
US7667705B2 (en) * 2001-05-15 2010-02-23 Nintendo Of America Inc. System and method for controlling animation by tagging objects within a game environment
US7213206B2 (en) * 2003-09-09 2007-05-01 Fogg Brian J Relationship user interface
US20050256867A1 (en) * 2004-03-15 2005-11-17 Yahoo! Inc. Search systems and methods with integration of aggregate user annotations
US20070043688A1 (en) * 2005-08-18 2007-02-22 Microsoft Corporation Annotating shared contacts with public descriptors
US7668821B1 (en) * 2005-11-17 2010-02-23 Amazon Technologies, Inc. Recommendations based on item tagging activities of users
US20070203645A1 (en) * 2006-02-24 2007-08-30 Dees Ian S Attaching measurement data to an area map
US20080026839A1 (en) * 2006-07-28 2008-01-31 Ujogo. Inc. Interactive Gaming System With Attribute Indicators, Such As Online Poker Rooms With Attribute Indicators
US7529797B2 (en) * 2006-08-16 2009-05-05 Tagged, Inc. User created tags for online social networking
US20080091723A1 (en) * 2006-10-11 2008-04-17 Mark Zuckerberg System and method for tagging digital media
US20090113313A1 (en) * 2007-10-30 2009-04-30 Abernethy Jr Michael Negley Dynamic update of contact information and speed dial settings based on a virtual world interaction
US20090150804A1 (en) * 2007-12-06 2009-06-11 Bokor Brian R Contract amendment mechanism in a virtual world

Also Published As

Publication number Publication date
US20160034028A1 (en) 2016-02-04
US20130254684A1 (en) 2013-09-26
US8495505B2 (en) 2013-07-23
US9927868B2 (en) 2018-03-27
US9195363B2 (en) 2015-11-24

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SMITH, ANDREW BRYAN;BOKOR, BRIAN RONALD;HOUSE, DANIEL EDWARD;AND OTHERS;REEL/FRAME:020347/0336

Effective date: 20071210

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210723