US20100203968A1 - Apparatus And Method Of Avatar Customisation - Google Patents
- Publication number
- US20100203968A1 (application US 12/667,775)
- Authority
- US
- United States
- Prior art keywords
- avatar
- user
- user avatar
- dimensional mesh
- modified
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/55—Details of game data or player data management
- A63F2300/5546—Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
- A63F2300/5553—Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6009—Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
- A63F2300/6018—Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content where the game content is authored by the player, e.g. level editor or by game device at runtime, e.g. level is created from music data on CD
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- This invention relates to an apparatus and method of avatar customisation.
- In online gaming it is conventional for players to adopt distinctive names for their in-game characters (generally termed ‘avatars’).
- These avatars may also be customised, for example according to race (real or fictional) or gender, and a range of different heads and bodies is often provided.
- Features such as hair styles, skin tone and age may also be customised.
- Conventional means of further customising a user's avatar for such a purpose may include uploading the user's own face as a texture to use on the avatar (for example, see http://research.microsoft.com/~zhang/Face/redherringReport.htm), or modifying the gestures and expressions of the avatar to reflect a particular mood that the user wishes to express.
- However, gestures and expressions are not instantly recognisable, as they first need to be carried out.
- Meanwhile, uploading images of users' faces is potentially intrusive, and rendering the images in a consistent manner is difficult when each face may be captured under different lighting conditions and at different effective resolutions.
- The user may also be dissatisfied with the result if it is a poor approximation.
- Moreover, many people online wish to present a fictional appearance whilst remaining true to their personality, or wish to appear appropriately ‘in character’ within the game environment; in this case a captured image is not a desirable solution.
- The present invention seeks to address the above concerns.
- In one aspect of the invention, an entertainment device comprises: skeletal modelling means to configure a three-dimensional mesh representing some or all of a user avatar in response to at least a first property of one or more skeletal components of the user avatar; skeleton modification means to modify one or more properties of one or more skeletal components of the user avatar via a user interface; and rendering means to render the user avatar in accordance with the three-dimensional mesh as configured in response to the modified user avatar skeleton.
- In another aspect, a server operable to administer a multi-player online virtual environment comprises reception means to receive data descriptive of respective modified avatar skeletons from a plurality of remote entertainment devices, and transmission means to transmit data descriptive of respective modified avatar skeletons to a plurality of remote entertainment devices.
- In a further aspect, a system comprising a server and two or more entertainment devices as described above co-operates to allow the two or more entertainment devices to render the modified avatars of the users of the respective other devices.
- A population of avatars within an on-line environment can therefore be more easily differentiated when the populated environment is rendered by each participating remote entertainment device.
- FIG. 1 is a schematic diagram of an entertainment device.
- FIG. 2 is a schematic diagram of a Cell processor.
- FIG. 3 is a schematic diagram of a video graphics processor.
- FIG. 4 is a schematic diagram of an interconnected set of game zones in accordance with an embodiment of the present invention.
- FIG. 5 is a schematic diagram of a Home environment online client/server arrangement in accordance with an embodiment of the present invention.
- FIG. 6 a is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention.
- FIG. 6 b is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention.
- FIG. 6 c is a schematic diagram of a cinema zone in accordance with an embodiment of the present invention.
- FIG. 6 d is a schematic diagram of a developer/publisher zone in accordance with an embodiment of the present invention.
- FIG. 7 is a flow diagram of a method of on-line transaction in accordance with an embodiment of the present invention.
- FIG. 8 a is a schematic diagram of an apartment zone in accordance with an embodiment of the present invention.
- FIG. 8 b is a schematic diagram of a trophy room zone in accordance with an embodiment of the present invention.
- FIG. 9 is a schematic diagram of a communication menu in accordance with an embodiment of the present invention.
- FIG. 10 is a schematic diagram of an interactive virtual user device in accordance with an embodiment of the present invention.
- FIG. 11 is a schematic diagram of a user interface in accordance with an embodiment of the present invention.
- FIG. 12 is a schematic diagram of a user interface in accordance with an embodiment of the present invention.
- FIG. 13 is a schematic diagram of a user interface in accordance with an embodiment of the present invention.
- FIGS. 14A and 14B are schematic diagrams of a user interface in accordance with an embodiment of the present invention.
- FIGS. 15A, 15B and 15C are schematic diagrams of a user avatar in accordance with an embodiment of the present invention.
- FIG. 16 is a flow diagram of a method of user identification in accordance with an embodiment of the present invention.
- A user of an entertainment device connected to an on-line virtual environment selects and customises an avatar using conventional options such as gender, clothing and skin-tone.
- The user can also modify the three-dimensional mesh used to define the surface of the user's avatar, upon which textures relating to gender, skin tone, age etc. can be applied.
- This configuration is achieved using a comparatively simple user interface by positioning vertices of the avatar mesh in response to a skeletal model underpinning the avatar's mesh structure. The user can therefore modify the mesh of their avatar by making simple parametric adjustments to the so-called ‘bones’ of the skeletal models.
- the user interface provides a hierarchy of adjustments, from whole-face skeletal adjustments (e.g. by race) to partial face skeletal adjustments (e.g. upper face, lower face, cranium) to individual features (e.g. cheek bones). This allows a quick modification of the avatar without compromising the ability to fine tune the results. Moreover, the user interface allows the modification of bone parameters to provide an asymmetric mesh, as this conveys a more naturalistic appearance for the avatar, as well as providing an additional source of distinctiveness and identity.
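The mesh configuration described above can be illustrated with a minimal skinning sketch. This is not the patent's implementation: the `Bone`, `SkinnedVertex` and `deform_vertex` names are hypothetical, and scalar length/width parameters stand in for full bone transforms, purely to show how editing one ‘bone’ parameter repositions every mesh vertex weighted to it, and how left/right bones can be edited asymmetrically:

```python
from dataclasses import dataclass, field

@dataclass
class Bone:
    """Hypothetical skeletal component with user-adjustable parameters."""
    name: str
    length: float = 1.0   # parametric scale along the bone axis
    width: float = 1.0    # parametric scale perpendicular to the axis

@dataclass
class SkinnedVertex:
    """A mesh vertex weighted to one or more bones."""
    position: tuple                               # rest position (x, y, z)
    weights: dict = field(default_factory=dict)   # bone name -> influence weight

def deform_vertex(v: SkinnedVertex, bones: dict) -> tuple:
    """Reposition a vertex as a weighted blend of its bones' parameters.

    A real implementation would apply full bone transforms (rotation,
    translation); the scalar factors here only illustrate how a single
    parameter edit propagates to every vertex weighted to that bone.
    """
    x, y, z = v.position
    sx = sum(w * bones[b].width for b, w in v.weights.items())
    sy = sum(w * bones[b].length for b, w in v.weights.items())
    return (x * sx, y * sy, z)

# Widening the left cheek bone moves every vertex weighted to it outward;
# an asymmetric face is simply independent left/right bone parameters.
bones = {"cheek_l": Bone("cheek_l", width=1.2), "jaw": Bone("jaw")}
v = SkinnedVertex(position=(1.0, 2.0, 0.0), weights={"cheek_l": 0.7, "jaw": 0.3})
print(deform_vertex(v, bones))
```

A hierarchical interface as described above would simply expose grouped edits (whole face, upper face, individual bone) that each resolve to such parameter changes.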
- FIG. 1 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device.
- A system unit 10 is provided, with various peripheral devices connectable to the system unit.
- The system unit 10 comprises: a Cell processor 100; a Rambus® dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700.
- The system unit 10 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 430 for reading from a disk 440, and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700.
- The system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700.
- The I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth® wireless link port 740 capable of supporting up to seven Bluetooth connections.
- The I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751.
- For example, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.
- The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 756; and a microphone headset 757.
- Such peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
- The Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
- A legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the Playstation® or Playstation 2® devices.
- The game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link.
- The game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751.
- The game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game, in addition to or instead of conventional button or joystick commands.
- Other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller.
- In this case, additional game or control information may be provided on the screen of the device.
- Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
- The remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link.
- The remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.
- The Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs.
- The reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs.
- The reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
- The system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310.
- The audio connectors 210 may include conventional analogue and digital outputs, whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
- Audio processing (generation, decoding and so on) is performed by the Cell processor 100.
- The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.
- The video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus, so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10.
- The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions.
- Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port.
- Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data.
- The CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
- In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided.
- Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
- The Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180.
- The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPS of the Playstation 2 device's Emotion Engine.
- The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache.
- The PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz.
- The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.
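The fetch/execute/synchronise pattern described above can be sketched as follows; Python threads and a `queue.Queue` stand in for the SPEs and the PPE's job queue (the names and workloads are illustrative stand-ins, not the Cell SDK):

```python
from queue import Queue, Empty
from threading import Thread

def spe_kernel(spe_id: int, job_queue: Queue, results: list):
    """Illustrative SPE kernel loop: fetch a job, execute it, synchronise."""
    while True:
        try:
            job = job_queue.get_nowait()
        except Empty:
            return                       # no more work: kernel exits
        results.append((spe_id, job()))  # execute and record the result
        job_queue.task_done()            # synchronise with the 'PPE'

# The 'PPE' maintains the job queue and monitors completion.
jobs = Queue()
for n in range(8):
    jobs.put(lambda n=n: n * n)          # trivial stand-in workloads

results: list = []
workers = [Thread(target=spe_kernel, args=(i, jobs, results)) for i in range(6)]
for w in workers:
    w.start()
jobs.join()                              # 'PPE' waits until all jobs are done
for w in workers:
    w.join()
print(sorted(r for _, r in results))     # the eight squares, in order
```

The six worker threads mirror the six SPEs available for game tasks in the arrangement described later in the text.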
- Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown).
- Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB.
- Each SPE gives a theoretical 25.6 GFLOPS of single precision performance.
- An SPU can operate on 4 single precision floating point numbers, 4 32-bit integers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation.
- The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H, which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
- The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences.
- The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently, for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
- The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 B per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
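The stated figures follow directly from the arithmetic; a quick check, assuming the participant count, slot width and clock rate given above:

```python
# Reproducing the EIB bandwidth arithmetic stated in the text.
participants = 12      # PPE + memory controller + dual bus interface + 8 SPEs
bytes_per_slot = 8     # each participant reads/writes 8 bytes per clock
clock_hz = 3.2e9       # 3.2 GHz internal clock

peak_bytes_per_clock = participants * bytes_per_slot
peak_bandwidth_gb_s = peak_bytes_per_clock * clock_hz / 1e9

print(peak_bytes_per_clock)   # 96 bytes per clock
print(peak_bandwidth_gb_s)    # 307.2 GB/s

# With channels running in both directions around the ring, the worst-case
# distance between any two of the 12 participants is half the ring:
max_hops = participants // 2
print(max_hops)               # 6 steps
```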
- The memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated.
- The memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
- The dual bus interface 170A,B comprises a Rambus FlexIO® system interface 172A,B.
- The interface is organised into 12 channels, each 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A, and the Reality Simulator graphics unit 200 via controller 170B.
- Data sent by the Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
- The Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100.
- The RSX unit 200 comprises: a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output.
- The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s.
- The VRAM 250 maintains a frame buffer 214 and a texture buffer 216.
- The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines.
- The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250.
- The vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
- The pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel.
- Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
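The distinction between bump mapping and displacement mapping can be illustrated with a minimal displacement sketch; the `displace` function and the scalar texture value are illustrative simplifications, not a real shader API:

```python
def displace(vertex, normal, texture_value, scale=1.0):
    """Displacement mapping sketch: move a vertex along its surface normal
    by an amount read from the texture, deforming the surface itself.
    Bump mapping, by contrast, would leave the vertex in place and perturb
    only the normal used by the lighting model.
    """
    return tuple(v + n * texture_value * scale for v, n in zip(vertex, normal))

# A texture value of 0.5 pushes this vertex 0.5 units along its +z normal.
print(displace((1.0, 2.0, 0.0), (0.0, 0.0, 1.0), 0.5))
```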
- The render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image.
- The render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency.
- The render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
- Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second.
- the total floating point performance of the RSX 200 is 1.8 TFLOPS.
- The RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene.
- In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles.
- Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500, via the element interconnect bus 180, the memory controller 160 and the bus interface controller 170B.
- The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B.
- Thus, in effect, the assigned SPEs become part of the video processing pipeline for the duration of the task.
- In general, the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled.
- The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process.
- Alternatively, the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.
- The PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE. Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.
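The chained mode of assignment can be sketched as a simple stage pipeline; the stage functions below are trivial stand-ins for the DVD access, decoding and error-masking steps named above, not real codecs:

```python
def chain(stages, data):
    """Chained assignment sketch: each 'SPE' handles one stage of a
    pipeline, passing its output to the next stage in turn."""
    for stage in stages:
        data = stage(data)
    return data

# Illustrative stand-in stages operating on a list of data blocks.
decode_video = lambda blocks: [b * 2 for b in blocks]    # stand-in decode
decode_audio = lambda blocks: [b + 1 for b in blocks]    # stand-in decode
mask_errors  = lambda blocks: [max(b, 0) for b in blocks]  # clamp bad blocks

print(chain([decode_video, decode_audio, mask_errors], [1, -3, 2]))
```

The parallel mode, by contrast, would hand independent slices of the input to several such workers at once, as in the particle-batch example above.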
- Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
- The software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS).
- In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video.
- The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally.
- The user navigates by moving horizontally through the function icons (representing the functions) using the game controller 751, remote control 752 or other suitable control device, so as to highlight a desired function icon; at that point, options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in an analogous fashion.
- When a disk is inserted, the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).
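The horizontal/vertical navigation described above amounts to a two-axis structure over categories and their options; a minimal sketch, with illustrative category and option names:

```python
# The XMB as a 2-D structure: horizontal categories, each with a vertical
# list of options shown when its category is highlighted.
xmb = {
    "Game":  ["Disc", "Saved Data"],
    "Music": ["Play CD", "Library"],
    "Video": ["Play", "Library"],
}
categories = list(xmb)

col, row = 0, 0                       # start highlighting the first category
col = (col + 1) % len(categories)     # move right: Game -> Music
options = xmb[categories[col]]        # vertical list for the new category
row = (row + 1) % len(options)        # move down within Music

print(categories[col], options[row])  # Music Library
```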
- In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available.
- The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term "on-line" does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
- In an embodiment of the present invention, the above-mentioned online capability comprises interaction with a virtual environment populated by avatars (graphical representations) of the user of the PS3 10 and of other PS3 users who are currently online.
- The software to enable the virtual interactive environment is typically resident on the HDD 400, and can be upgraded and/or expanded by software that is downloaded, or stored on optical disk 440, or accessed by any other suitable means.
- Alternatively, the software may reside on a flash memory card 420, optical disk 440 or a central server (not shown).
- In operation, the virtual interactive environment (hereafter called the ‘Home’ environment) is selected from the cross-media bar.
- The Home environment then starts in a conventional manner similar to a 3D video game, by loading and executing control software, loading 3D models and textures into video memory 250, and rendering scenes depicting the Home environment.
- Alternatively or in addition, the Home environment can be initiated by other programs, such as a separate game.
- Referring now to FIG. 4, which displays a notional map of the Home environment, and to FIG. 5, which is a schematic diagram of a Home environment online client/server arrangement, upon starting the Home environment the user's avatar is spawned within a lobby zone 1010 by default.
- A user can select among other zones 1010-1060 (detailed below) of the map, causing the selected zone to be loaded and the avatar to be spawned within that zone.
- The map screen further comprises a sidebar on which the available zones may be listed, together with management tools such as a ranking option, enabling zones to be listed in order of user preference, or most-recently-added and/or A-Z listings.
- In addition, a search interface may allow the user to search for a zone by name.
- The lobby zone 1010 typically resembles a covered piazza, and may comprise parkland (grass, trees, sculptures etc.) and gathering spaces (such as open areas, single benches or rows of seats etc.) where users can meet through their avatars.
- the lobby zone 1010 typically also comprises advertisement-hoardings, for displaying either still or moving adverts for games or other content or products. These may be on the walls of the lobby, or may stand alone.
- the lobby zone 1010 may also include an open-air cinema 1012 showing trailers, high-profile adverts or other content from third-party providers. Such content is typically streamed or downloaded from a Home environment server 2010 to which the PS3 10 connects when the Home environment is loaded, as described in more detail later.
- the cinema screen is accompanied by seating for avatars in front of it, such that when an avatar sits down, the camera angle perceived by the user of the avatar also encompasses the screen.
- the lobby zone 1010 may also include general amusements 1014 , such as functioning pool tables, bowling alleys, and/or a video arcade. Games of pool or bowling may be conducted via the avatar, such that the avatar holds the pool cue or bowling ball, and is controlled in a conventional manner for such games.
- the Home environment may switch to a substantially full-screen representation of the selected videogame.
- Such games may, for example, be classic arcade or console games such as Space Invaders (®), or Pac-Man (®), which are comparatively small in terms of memory and processing and can be emulated by the PS3 within the Home environment or run as plug-ins to the Home environment.
- user-created game content may be featured on one or more of the virtual video game machines. Such content may be the subject of on-line competitions to be featured in such a manner, with new winning content downloaded on a regular basis.
- zones 1020 , 1030 , 1040 , 1050 and 1060 which may be rooms, areas or other constructs
- These may be accessed either via a map screen similar in nature to that of FIG. 4 , or alternatively the user can walk to these other areas by guiding their avatar to various exits 1016 from the lobby.
- an exit 1016 takes the form of a tunnel or corridor (but may equally take the form of an anteroom) to the next area. While the avatar is within the tunnel or anteroom, the next zone is loaded into memory. Both the lobby and the next zone contain identical models of the tunnel or anteroom, or the model is a common resource to both. In either case, the user's avatar is relocated from the lobby-based version to the new zone-based version of the tunnel or anteroom at the same position. In this way the user's avatar can apparently walk seamlessly throughout the Home environment, without the need to retain the whole environment in memory at the same time.
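- The seamless tunnel handoff described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the `Zone` class, `traverse_tunnel` function and tunnel identifier are all hypothetical names, and loading is stubbed out.

```python
from dataclasses import dataclass, field

@dataclass
class Zone:
    name: str
    loaded: bool = False
    # both zones contain an identical tunnel model keyed by a shared id
    tunnel_ids: set = field(default_factory=set)

    def load(self):
        # stand-in for loading 3D models and textures into video memory
        self.loaded = True

def traverse_tunnel(avatar_pos, tunnel_id, current: Zone, target: Zone):
    """Relocate the avatar from the current zone's copy of the tunnel to the
    target zone's copy at the same position, loading the target zone first."""
    assert tunnel_id in current.tunnel_ids and tunnel_id in target.tunnel_ids
    if not target.loaded:
        target.load()   # in the real system this happens while the avatar walks
    # identical model, identical position -> the transition appears seamless
    return target, avatar_pos

lobby = Zone("lobby", loaded=True, tunnel_ids={"exit_1016"})
cinema = Zone("cinema", tunnel_ids={"exit_1016"})
zone, pos = traverse_tunnel((3.0, 0.0, 7.5), "exit_1016", lobby, cinema)
print(zone.name, zone.loaded, pos)  # cinema True (3.0, 0.0, 7.5)
```

Only the destination zone needs to be resident in memory once the handoff completes, which is the point of the tunnel construct.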
- the Cinema zone 1020 resembles a multiplex cinema, comprising a plurality of screens that may show content such as trailers, movies, TV programmes, or adverts downloaded or streamed from a Home environment server 2010 as noted previously and detailed below, or may show content stored on the HDD 400 or on an optical disk 440 , such as a Blu-Ray disk.
- the multiplex cinema will have an entrance area featuring a screen 1022 on which high-profile trailers and adverts may be shown to all visitors, together with poster adverts 1024 , typically but not limited to featuring upcoming movies.
- Specific screens and the selection and display of the trailers and posters can each be restricted according to the age of the user, as registered with the PS3. This age restriction can be applied to any displayed content to which an age restriction tag is associated, in any of the zones within the Home environment.
- the multiplex cinema provides a number of screen rooms in which featured content is available, and amongst which the user can select.
- in a screen room, downloaded, streamed or locally stored media can be played within a virtual cinema environment, in which the screen is set in a room with rows of seats, screen curtains, etc.
- the cinema is potentially available to all users in the Home environment, and so the avatars of other users may also be visible, for example watching commonly streamed material such as a web broadcast.
- the user can zoom in so that the screen occupies the full viewing area.
- another type of zone is a developer or publisher zone 1030 .
- each may have its own exit from the lobby area 1010 , or alternatively some or all may share an exit from the lobby and then have separate exits from within a tunnel or ante-room model common to or replicated by each available zone therein.
- they may be selected from a menu, either in the form of a pop-up menu, or from within the Home environment, such as by selecting from a set of signposts. In these latter cases the connecting tunnel or anteroom will appear to link only to the selected developer or publisher zone 1030 .
- such zones may be selected via the map screen, resulting in the zone being loaded into memory, and the avatar re-spawning within the selected zone.
- Developer or publisher zones 1030 provide additional virtual environments, which may reflect the look and feel of the developer or publisher's products, brands and marks.
- the developer or publisher zones 1030 are supplementary software modules to the Home environment and typically comprise additional 3D models and textures to provide the structure and appearance of the zone.
- the software operable to implement the Home environment supports the integration of third party software via an application program interface (API). Therefore, developers can integrate their own functional content within the Home environment of their own zone. This may take the form of any or all of:
- One or more interactive scenes or vignettes representative of the developer's or publisher's games, enabling the player to experience an aspect of the game, hone a specific skill of the game, or familiarise themselves with the controls of a game;
- a developer's zone may resemble a concourse in the developer's signature colours, featuring their logos, onto which gaming areas, such as soccer nets or a skeet range for shooting, open.
- a booth (not shown) manned by game-specific characters allows the user's avatar to enter and either temporarily change into the lead character of the game, or zoom into a first person perspective, and enter a further room resembling a scene from the featured game.
- the user interacts with other characters from the game, and plays out a key scene.
- adverts for the game and other content are displayed on the walls.
- the concourse opens up into an arena where a 5-a-side football match is being played, where the positions of the players and the ball correspond to a game currently being played by a popular group, such as a high-ranking game clan, in another country.
- developer/publisher zones are available to download.
- they may be supplied as demo content on magazine disks, or may be installed/upgraded from disk as part of the installation process for a purchased game of the developer or publisher.
- subsequent purchase or registration of the game may result in further zone content being unlocked or downloaded.
- further modifications, and timely advert and trailer media may be downloaded as required.
- a similar zone is the commercial zone 1040 .
- commercial zones 1040 may comprise representative virtual assets of one or more commercial vendors in the form of 3D models, textures etc., enabling a rendering of their real-world shops, brands and identities, and these may be geographically and/or thematically grouped within zones.
- Space within commercial zones may be rented as so-called ‘virtual real-estate’ by third parties.
- a retailer may pay to have a rendering of their shop included within a commercial zone 1040 as part of a periodic update of the Home environment supplied via the Home environment server 2010 , for example on a monthly or annual renewal basis.
- a retailer may additionally pay for the commerce facilities described above, either on a periodic basis or per item. In this way they can provide users of the Home environment with a commercial presence.
- the commercial zone comprises supplementary software that can integrate with the home environment via an API, to provide additional communication options (shop-specific names, goods, transaction options etc), and additional functionality, such as accessing an online database of goods and services for purchase, determining current prices, the availability of goods, and delivery options.
- Such functions may be accessed either via a menu (either as a pop-up or within the Home environment, for example on a wall) or via communication with automated avatars. Communication between avatars is described in more detail later.
- a tunnel may link a developer zone to a store that sells the developer's games.
- Such a tunnel may be of a ‘many to one’ variety, such that exits from several zones emerge from the same tunnel in-store. In this case, if re-used, typically the tunnel would be arranged to return the user to the previous zone rather than one of the possible others.
- the software implementing the Home environment has access to an online-content purchase system provided by the PS3 OS. Developers, publishers and store owners can use this system via an interface to specify the IP address and query text that facilitates their own on-line transaction.
- the user can allow their PS3 registration details and credit card details to be used directly, such that by selecting a suitably enabled object, game, advert, trailer or movie anywhere within the Home environment, they can select to purchase that item or service.
- the Home environment server 2010 can store and optionally validate the user's credit card and other details so that the details are ready to be used in a transaction without the user having to enter them. In this way the Home environment acts as an intermediary in the transaction. Alternatively such details can be stored at the PS3 and validated either by the PS3 or by the Home environment server.
- a method of sale comprises in a step s 2102 a user selecting an item (goods or a service) within the Home environment.
- the PS3 10 transmits identification data corresponding with the object to the Home environment server 2010 , which in step s 2106 verifies the item's availability from a preferred provider (preferably within the country corresponding to the IP address of the user). If the item is unavailable then in step s 2107 it informs the user by transmitting a message to the user's PS3 10. Alternatively, it first checks for availability from one or more secondary providers, and optionally confirms whether supply from one of these providers is acceptable to the user.
- the Home environment server retrieves from data storage the user's registered payment details and validates them. If there is no valid payment method available, then the Home environment may request that the user enters new details via a secure (i.e. encrypted) connection. Once a valid payment method is available, then in step s 2110 the Home environment server requests from the appropriate third party payment provider a transfer of payment from the user's account. Finally, in s 2112 the Home environment server places an order for the item with the preferred provider, giving the user's delivery address or IP address as applicable, and transferring appropriate payment to the preferred provider's account.
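- The availability, payment and ordering steps above can be sketched in outline. All names here (`purchase`, the catalogue and payments stores, the returned status codes) are hypothetical stand-ins for the server-side logic; payment transfer and order placement are stubbed.

```python
def purchase(item_id, user, catalogue, payments):
    """Sketch of steps s 2102 - s 2112: availability check, payment
    validation, then order placement with the preferred provider."""
    provider = catalogue.get(item_id)            # verify availability
    if provider is None:
        return {"status": "unavailable"}         # s 2107: inform the user
    if not payments.get(user, {}).get("valid"):  # validate registered details
        return {"status": "payment_details_required"}
    # s 2110: request transfer from the user's account (stubbed here)
    # s 2112: place the order, passing delivery or IP address as applicable
    return {"status": "ordered", "provider": provider,
            "deliver_to": payments[user]["address"]}

catalogue = {"game_001": "preferred_provider_uk"}
payments = {"alice": {"valid": True, "address": "192.0.2.1"}}
print(purchase("game_001", "alice", catalogue, payments)["status"])  # ordered
print(purchase("game_999", "alice", catalogue, payments)["status"])  # unavailable
```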
- commerce is not limited specifically to shops. Similarly, it is not necessary for shops to provide their own commerce applications if the preferred provider for goods or services when displayed within a shop is set to be that shop's owner. Where the goods or service may be digitally provided, then optionally it is downloaded from the preferred provider directly or via a Home environment server 2010 .
- zones that are private to the individual user and may only be accessed by them or by invitation from them. These zones also have exits from the communal lobby area, but when entered by the avatar (or chosen via the map screen), load a respective version of the zone that is private to that user.
- the first of these zones is an apartment zone 1050 .
- this is a user-customisable zone in which such features 1052 as wallpaper, flooring, pictures, furniture, outside scenery and lighting may be selected and positioned.
- Some of the furniture is functional furniture 1054 , linked to PS3 functionality.
- a television may be placed in the apartment 1050 on which can be viewed one of several streamed video broadcasts, or media stored on the PS3 HDD 400 or optical disk 440 .
- a radio or hi-fi may be selected that contains pre-selected links to internet radio streams.
- user artwork or photos may be imported into the room in the form of wall hangings and pictures.
- the user may purchase a larger apartment, and/or additional goods such as a larger TV, a pool table, or automated non-player avatars.
- Other possible items include a gym, swimming pool, or disco area.
- additional control software or configuration libraries providing additional character functionality will integrate with the Home environment via the API, in a similar fashion to that described previously for the commercial and developer/publisher zones 1030 , 1040 .
- Such purchases may be made using credit card details registered with the Home environment server.
- the server downloads an authorisation key to unlock the relevant item for use within the user's apartment.
- the 3D model, textures and any software associated with an item may also be downloaded from the Home environment server or an authorised third-party server, optionally again associated with an authorisation key.
- the key may, for example, require correspondence with a firmware digital serial number of the PS3 10, thereby preventing unauthorised distribution.
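- One way to bind an authorisation key to a particular console's serial number, as described above, is a keyed hash over the item and serial. The patent does not specify the mechanism; the HMAC scheme, secret and function names below are assumptions for illustration only.

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # hypothetical signing key held by the server

def issue_key(item_id: str, console_serial: str) -> str:
    """Derive an authorisation key tied to one console's serial number."""
    msg = f"{item_id}:{console_serial}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def key_valid(key: str, item_id: str, console_serial: str) -> bool:
    # a key copied to another console fails, preventing redistribution
    return hmac.compare_digest(key, issue_key(item_id, console_serial))

key = issue_key("pool_table", "PS3-0001")
print(key_valid(key, "pool_table", "PS3-0001"))  # True
print(key_valid(key, "pool_table", "PS3-0002"))  # False: other console rejected
```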
- a user's apartment can only be accessed by others upon invitation from the respective user.
- This invitation can take the form of a standing invitation for particular friends from within a friends list, or in the form of a single-session pass conferred on another user, and only valid whilst that user remains in the current Home environment session.
- Such invitations may take the form of an association maintained by a Home environment server 2010 , or a digital key supplied between PS3 devices on a peer-to-peer basis that enables confirmation of status as an invitee.
- invited users can only enter the apartment when the apartment's user is present within the apartment, and are automatically returned to the lobby if the apartment's user leaves. Whilst within the apartment, all communication between the parties present (both user and positional data) is purely peer-to-peer.
- the apartment thus also provides a user with the opportunity to share home created content such as artwork, slideshows, audio or video with invited guests, and also to interact with friends without potential interference from other users within the public zones.
- the configuration of the room and the furnishings within it are transmitted in a peer-to-peer fashion between the attendees using ID codes for each object and positional data.
- the model, textures and any code required to implement it on the guest's PS3 may also be transmitted, together with a single-use key or similar constraint, such as use only whilst in the user's apartment and whilst the user and guest remain online in this session.
- a further private space that may similarly be accessed only by invitation is the user's Trophy Room 1060 .
- the Trophy Room 1060 provides a space within which trophies 1062 earned during game play may be displayed.
- a third-party game comprises seeking a magical crystal. If the player succeeds in finding the crystal, the third party game nominates this as a trophy for the Trophy Room 1060 , and places a 3D model and texture representative of the crystal in a file area accessed by the Home environment software when loading the Trophy Room 1060 . The software implementing the Home environment can then render the crystal as a trophy within the Trophy Room.
- each trophy comprises an identifying code.
- a trophy room may be shared between members of a group or so-called ‘clan’, such that a trophy won by any member of the clan is transmitted to other members of the clan on a peer-to-peer basis. Therefore all members of the clan will see a common set of trophies.
- a user can have a standing invitation to all members of the Home environment, allowing anyone to visit their trophy room.
- a plurality of rooms is therefore possible, for example a private, a group-based and a public trophy room. This may be managed either by selection from a pop-up menu or signposts within the Home environment as described previously, or by identifying a relevant user by walking up to their avatar, and then selecting to enter their (public) trophy room upon using the trophy room exit from the lobby.
- a public trophy room may be provided. This room may display the trophies of the person in the current instance of the Home environment who has the most trophies or a best overall score according to a trophy value scoring scheme.
- it may be an aggregate trophy room, showing the best, or a selection of, trophies from some or all of the users in that instance of the Home environment, together with the ID of the user.
- a user could spot a trophy from a game they are having difficulty with, identify who in the Home environment won it, and then go and talk to them about how they won it.
- a public trophy room could contain the best trophies across a plurality of Home environments, identifying the best gamers within a geographical, age-specific or game-specific group, or even worldwide.
- a leader board of the best-scoring gamers can be provided and updated live.
- the number of third party zones currently associated with a user's Home environment can be limited.
- a maximum memory allocation can be used to prevent additional third party zones being added until an existing one is deleted.
- third party zones may be limited according to geographical relevance or user interests (declared on registration or subsequently via an interface with the Home environment server 2010 ), such that only third party zones relevant to the user by these criteria are downloaded. Under such a system, if a new third party zone becomes available, its relevance to the user is evaluated according to the above criteria, and if it is more relevant than at least one of those currently stored, it replaces the currently least relevant third party zone stored on the user's PS3.
- Other criteria for relevance may include interests or installed zones of nominated friends, or the relevance of zones to games or other media that have been played on the user's PS3.
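- The replacement rule above (a new zone displaces the currently least relevant stored zone only if it scores higher) can be sketched directly. The scoring function and zone names are hypothetical; in practice relevance would combine geography, declared interests and friends' installations as described.

```python
def maybe_install(new_zone, installed, relevance, capacity=3):
    """If the new third-party zone is more relevant than the least relevant
    installed zone, replace that zone; otherwise leave the set unchanged."""
    if len(installed) < capacity:
        installed.append(new_zone)
        return installed
    least = min(installed, key=relevance)
    if relevance(new_zone) > relevance(least):
        installed[installed.index(least)] = new_zone
    return installed

# hypothetical relevance scores per zone
scores = {"ski_zone": 0.2, "football_zone": 0.9,
          "racing_zone": 0.6, "rpg_zone": 0.7}
installed = ["ski_zone", "football_zone", "racing_zone"]
print(maybe_install("rpg_zone", installed, scores.get))
# ski_zone, the least relevant, is replaced by rpg_zone
```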
- the software implementing the Home environment enables the customisation of a user's avatar from a selection of pre-set options in a similar manner to the customisation of the user's apartment.
- the user may select gender and skin tone, and customise the facial features and hair by combining available options for each.
- the user may also select from a wide range of clothing.
- a wide range of 3D models and textures for avatars are provided.
- users may import their own textures to display on their clothing.
- the parameters defining the appearance of each avatar occupy only around 40 bytes, enabling fast distribution via the Home environment server when joining a populated Home environment.
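- Because each appearance attribute is an index into a fixed set of preset options, the whole parameter set packs into a handful of bytes. The field layout below is invented for illustration; the patent states only the approximate total size.

```python
import struct

# Hypothetical fixed layout: each field is a one-byte index into preset options
LAYOUT = ["gender", "skin_tone", "face", "hair", "hair_colour",
          "top", "trousers", "shoes"]

def pack_avatar(params: dict) -> bytes:
    return struct.pack(f"{len(LAYOUT)}B", *(params[f] for f in LAYOUT))

def unpack_avatar(blob: bytes) -> dict:
    return dict(zip(LAYOUT, struct.unpack(f"{len(LAYOUT)}B", blob)))

params = {"gender": 1, "skin_tone": 4, "face": 12, "hair": 7,
          "hair_colour": 3, "top": 22, "trousers": 9, "shoes": 5}
blob = pack_avatar(params)
print(len(blob))                      # 8 bytes in this toy layout
print(unpack_avatar(blob) == params)  # True: round-trips losslessly
```

A richer layout with more slots and wider indices would still fit comfortably within the ~40 bytes quoted above.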
- Each avatar in the home environment can be identified by the user's ID or nickname, displayed in a bubble above the avatar. To limit the proliferation of bubbles, these fade into view when the avatar is close enough that the text it contains could easily be read, or alternatively when the avatar is close enough to interact with and/or is close to the centre of the user's viewpoint.
- the avatar is controlled by the user in a conventional third-person gaming manner (e.g. using the game controller 751 ), allowing them to walk around the Home environment.
- Some avatar behaviour is contextual; thus for example the option to sit down will only be available when the avatar is close to a seat.
- Other avatar behaviour is available at all times, such as for example the expression of a selected emotion or gesture, or certain communication options.
- Avatar actions are determined by use of the game controller 751 , either directly for actions such as movement, or by the selection of actions via a pop-up menu, summoned by pressing an appropriate key on the game controller 751 .
- Options available via such a menu include further modification of the avatar's appearance and clothing, and the selection of emotions, gestures and movements. For example, the user can select that their avatar smiles, waves and jumps up and down when the user sees someone they know in the Home environment.
- messages appear in pop-up bubbles above the relevant avatar, replacing their name bubble if necessary.
- a pop-up menu 1070 in which a range of preset messages is provided.
- These may be complete messages, or alternatively or in addition may take the form of nested menus, the navigation of which generates a message by concatenating selected options.
- a virtual keyboard may be displayed, allowing free generation of text by navigation with the game controller 751 . If a real keyboard 753 is connected via Bluetooth, then text may be typed into a bubble directly.
- the lobby also provides a chat channel hosted by the Home environment server, enabling conventional chat facilities.
- To communicate by speech, a user must have a microphone, such as a Bluetooth headset 757 , available. Then in an embodiment of the present invention, either by selection of a speech option by pressing a button on the game controller 751 , or by use of a voice activity detector within the software implementing the Home environment, the user can speak within the Home environment. When speaking, a speech icon may appear above the head of the avatar for example to alert other users to adjust volume settings if necessary.
- the speech is sampled by the user's PS3, encoded using a Code Excited Linear Prediction (CELP) codec (or other known VoIP applicable codec), and transmitted in a peer-to-peer fashion to the eight nearest avatars (optionally provided they are within a preset area within the virtual environment surrounding the user's avatar). Where more than eight other avatars are within the preset area, one or more of the PS3s that received the speech may forward it to other PS3s having respective user avatars within the area that did not receive the speech, in an ad-hoc manner.
- the PS3 will transmit a speech flag to all PS3s whose avatars are within the preset area, enabling them to place a speech icon above the relevant (speaking) avatar's head (enabling their user to identify the speaker more easily) and also to notify the PS3s of a transmission.
- Each PS3 can determine from the relative positions of the avatars which ones will not receive the speech, and can elect to forward the speech to the PS3 of whichever avatar they are closest to within the virtual environment.
- the PS3s within the area can ping each other, and whichever PS3 has the lowest lag with a PS3 that has not received the speech can elect to forward it.
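- Selecting the eight nearest avatars within the preset area, as described above, can be sketched as a simple distance sort. The function name, the radius value and the 2D coordinates are illustrative assumptions; the ad-hoc forwarding step is omitted.

```python
import math

def nearest_recipients(speaker, avatars, limit=8, radius=30.0):
    """Pick up to `limit` avatars within `radius` of the speaker, nearest
    first; these receive the CELP-encoded speech peer-to-peer."""
    def dist(name):
        return math.dist(avatars[speaker], avatars[name])
    in_range = [a for a in avatars if a != speaker and dist(a) <= radius]
    return sorted(in_range, key=dist)[:limit]

avatars = {"speaker": (0, 0), "a": (1, 0), "b": (2, 0), "c": (50, 0)}
print(nearest_recipients("speaker", avatars))  # ['a', 'b'] — 'c' is out of range
```

When more than eight avatars are in range, the overflow would be served by the forwarding scheme described above, with receiving PS3s relaying the speech to the avatars they are closest to.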
- such speech can also be relayed to other networks, such as a mobile telephony network, upon specification of a mobile phone number.
- relaying speech to a mobile telephony network may be achieved either by routing the speech via the Home environment server to a gateway server of the mobile network, or by Bluetooth transmission to the user's own mobile phone.
- the mobile phone may require middleware (e.g. a Java applet) to interface with the PS3 and route the call.
- a user can contact a person on their phone from within the Home environment.
- the user can also send a text message to a person on their mobile phone.
- users whose PS3s are equipped with a video camera such as the Sony ® Eye Toy ® video camera can use a video chat mode, for example via a pop-up screen, or via a TV or similar device within the Home environment, such as a Sony Playstation Portable (PSP) held by the avatar.
- video codecs are used in addition to or instead of the audio codecs.
- the avatars of users with whom the user has spoken recently can be highlighted, and those with whom they have spoken most may be highlighted more prominently, for example by an icon next to their name, or a level of glow around their avatar.
- when a user selects to activate the Home environment on their PS3 10, the locally stored software generates the graphical representation of the Home environment, and connects to a Home environment server 2010 that assigns the user to one of a plurality of online Home environments 2021 , 2022 , 2023 , 2024 . Only four Home environments are shown for clarity.
- the Home environment server 2010 will support a large plurality of separate online Home environments. Likewise, there may be many separate Home environment servers, for example in different countries.
- Once assigned to a Home environment, a PS3 initially uploads information regarding the appearance of the avatar, and then in an ongoing fashion provides the Home environment server with positional data for its own avatar, and receives from the Home environment server the positional data of the other avatars within that online Home environment.
- this positional update is periodic (for example every 2 seconds) to limit bandwidth, so other PS3s must interpolate movement. Such interpolation of character movement is well-known in on-line games.
- each update can provide a series of positions, improving the replication of movement (with some lag), or improving the extrapolation of current movement.
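- The interpolation between periodic positional updates mentioned above is conventionally linear between the two most recent snapshots. A minimal sketch (the function name and 2D positions are illustrative):

```python
def interpolate(p0, p1, t0, t1, now):
    """Linearly interpolate between two periodic position updates so a
    remote avatar appears to move smoothly between 2-second snapshots."""
    if now >= t1:
        return p1  # hold the latest known position until the next update
    alpha = (now - t0) / (t1 - t0)
    return tuple(a + alpha * (b - a) for a, b in zip(p0, p1))

# update at t=0 s placed the avatar at (0, 0); update at t=2 s at (4, 2)
print(interpolate((0.0, 0.0), (4.0, 2.0), 0.0, 2.0, 1.0))  # (2.0, 1.0)
```

With a series of positions per update, as suggested above, the same scheme can replay the path segment by segment, trading a little lag for better fidelity.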
- the IP addresses of the other PS3s 2031 , 2032 , 2033 within that Home environment 2024 are shared so that they can transmit other data such as speech in a peer-to-peer fashion between themselves, thereby reducing the required bandwidth of data handled by the Home environment server.
- each will support a maximum of, for example, 64 users.
- the selection of a Home environment to which a user will be connected can take account of a number of factors, either supplied by the PS3 and/or known to the Home environment server via a registration process. These include but are not limited to:
- a Swiss teenager may be connected to a Home environment on a Swiss server, with a maximum user age of 16 and a predominant language of French.
- a user with a copy of ‘Revolution’ mounted in their PS3 may be connected to a home environment where a predominant number of other users also currently have the same game mounted, thereby facilitating the organisation of multiplayer games.
- the PS3 10 detects the game loaded within the BD-ROM 430 and informs the Home environment server 2010 . The server then chooses a Home environment accordingly.
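- The assignment factors discussed above (country, age, language, currently mounted game, online friends) can be sketched as a scoring function over candidate environments. The weights and field names are entirely hypothetical; the patent lists only the kinds of factors considered.

```python
def choose_environment(user, environments):
    """Score each candidate Home environment against the user's profile
    and pick the best-matching one that still has capacity."""
    def score(env):
        s = 0
        s += 3 * (env["country"] == user["country"])
        s += 2 * (user["age"] <= env["max_age"])
        s += 2 * (env["language"] == user["language"])
        s += 4 * (env["popular_game"] == user["mounted_game"])
        s += len(env["users"] & user["friends"])  # friends already present
        return s
    candidates = [e for e in environments if len(e["users"]) < e["capacity"]]
    return max(candidates, key=score)

user = {"country": "CH", "age": 15, "language": "fr",
        "mounted_game": "Revolution", "friends": {"bob", "eve"}}
envs = [{"id": 1, "country": "CH", "max_age": 16, "language": "fr",
         "popular_game": "Revolution", "users": {"bob"}, "capacity": 64},
        {"id": 2, "country": "DE", "max_age": 99, "language": "de",
         "popular_game": "Other", "users": set(), "capacity": 64}]
print(choose_environment(user, envs)["id"])  # 1: the Swiss-teenager example
```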
- a user is connected to a Home environment in which three users identified on his friends list can be found.
- the friends list is a list of user names and optionally IP addresses, received from other users, whom the given user wishes to meet regularly.
- different groups of friends are located on different Home environment servers (e.g. where the current user is the only friend common to both sets) then the user may either be connected to the one with the most friends, or given the option to choose.
- a user may invite one or more friends to switch between Home environments and join them.
- the user can view their friends list via a pop-up menu or from within the Home environment (for example via a screen on the wall or an information booth) and determine who is on-line.
- the user may then broadcast an invite to their friends, either using a peer-to-peer connection or, if the friend is within a Home environment or the IP address is unknown, via the Home environment server.
- the friend can then accept or decline the invitation to join.
- a Home environment server will assign fewer than the maximum supported number of users to a specific Home environment, thereby allowing such additional user-initiated assignments to occur.
- This so-called ‘soft-limit’ may, for example, be 90% of capacity, and may be adaptive, for example changing in the early evening or at weekends where people are more likely to meet up with friends on-line.
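- The adaptive soft limit can be sketched as a small function of time of day and day of week. The 90% baseline comes from the passage above; the evening and weekend adjustments are illustrative assumptions.

```python
def soft_limit(capacity=64, hour=12, weekend=False):
    """Reserve headroom for friend-initiated joins: a 90% baseline that
    drops further when meet-ups are more likely (assumed adjustments)."""
    fraction = 0.90
    if 17 <= hour <= 22:   # early evening: more users joining friends
        fraction = 0.80
    if weekend:
        fraction -= 0.05
    return int(capacity * fraction)

print(soft_limit())                       # 57 of 64 slots assigned by default
print(soft_limit(hour=19, weekend=True))  # 48: more headroom kept at peak times
```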
- the map screen may also highlight those zones in which the friends can currently be found, either by displaying their name on the map or in association with the zone name on the side bar.
- preferences, settings, functions of the Home environment and optionally other functionality may be viewed, adjusted or accessed as appropriate by use of a virtual Sony ® Playstation Portable ® (PSP) entertainment device 1072 that can be summoned by use of the game controller 751 to pop-up on screen. The user can then access these options, settings and functionality via a PSP cross-media bar 1074 displayed on the virtual PSP.
- the PSP could also be used as an interface for video chat.
- a user may do so by selection of an appropriate key on the game controller 751 , by selection of an exit option from a pop-up menu, by selection of an exit from within the map screen, by selection of an option via their virtual PSP or by walking through a master exit within the lobby zone.
- exiting the Home environment will cause the PS3 10 to return to the PS3 cross media bar.
- a supermarket may provide a free disk upon which a Supermarket environment, supported in similar fashion by the Home environment servers, is provided.
- the user's avatar can browse displayed goods within a virtual rendition of the supermarket (either as 3D models or textures applied to shelves) and click on them to purchase as described above.
- retailers can provide and update online shopping facilities for their own user base.
- the avatar model comprises two aspects: a mesh, or skin, defining the three-dimensional surface upon which textures are placed, and a hierarchy of so-called ‘bones’ used to modify the vertices of the mesh.
- these bones are typically one-dimensional lines comprising a position, size and orientation, and are associated with vertices or regions of a mesh and/or optionally with other bones. As such, they do not correspond to human bones in the conventional sense.
- the mesh typically has a default design, e.g. for male and female avatars.
- this mesh can be deformed by so called ‘blend shapes’ (or ‘morph targets’).
- Blend shapes are commonly used for facial animation of game characters or avatars using known techniques. For example, during the designing of a game, an artist will typically explicitly design a default mesh describing the head of a game character or avatar together with blend shapes that depict facial expressions of that game character or avatar, such as left eyebrow raised, right eyebrow raised, mouth saying “oo”, mouth saying “ee”, and the like.
- the animator assigns a blend weight to each blend shape so as to specify the degree to which that blend shape will influence the distortion of the template mesh during the animation.
- each vertex consequently corresponds to the sum of the blend shape offset positions multiplied by their respective weights plus the vertex position of the template mesh. Therefore, for example, an animator might choose to create a smiling avatar with a raised eyebrow by assigning an appropriate weight to those blend shapes describing those facial expressions.
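- The per-vertex rule just stated (template vertex plus the weighted sum of blend shape offsets) can be written directly. The function and variable names are illustrative, but the arithmetic is exactly the relationship described.

```python
def blend_vertex(template, offsets, weights):
    """final = template + sum_i(weight_i * offset_i), per coordinate."""
    return tuple(
        t + sum(w * o[axis] for o, w in zip(offsets, weights))
        for axis, t in enumerate(template)
    )

template = (1.0, 2.0, 0.0)    # one vertex of the default (template) mesh
smile = (0.0, -0.5, 0.125)    # that vertex's offset in the 'smile' blend shape
brow = (0.0, 0.25, 0.0)       # its offset in the 'raised eyebrow' blend shape

# full smile, half-raised eyebrow
print(blend_vertex(template, [smile, brow], [1.0, 0.5]))  # (1.0, 1.625, 0.125)
```

Applying this to every vertex with the same weight vector yields the blended mesh; a weight of zero leaves a blend shape without influence.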
- animators traditionally also use skeletal animation.
- in skeletal animation, each bone in the skeleton is associated with one or more vertices of the mesh.
- these vertices can also be associated with one or more bones, each association being determined by a vertex weight. Consequently the displacement of the mesh is determined by the weighted influence of the neighbouring bones.
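The weighted influence of neighbouring bones can be sketched as a simple linear blend. The bone names and the displacement-only bone transforms below are simplifications invented for illustration; in practice each bone contributes a full position, scale and orientation.

```python
def skin_vertex(rest_pos, bone_displacements, vertex_weights):
    """rest_pos: (x, y, z); bone_displacements: dict of bone name -> (dx, dy, dz);
    vertex_weights: dict of bone name -> weight, expected to sum to 1.0."""
    x, y, z = rest_pos
    for bone, w in vertex_weights.items():
        dx, dy, dz = bone_displacements[bone]
        x += w * dx
        y += w * dy
        z += w * dz
    return (x, y, z)

# A knee vertex influenced equally by the thigh and lower-leg bones:
bones = {"thigh": (0.0, 1.0, 0.0), "lower_leg": (0.0, 0.0, 0.0)}
print(skin_vertex((0.0, 0.0, 0.0), bones, {"thigh": 0.5, "lower_leg": 0.5}))
# -> (0.0, 0.5, 0.0)
```

Because the vertex follows the weighted average of its bones, moving one bone smoothly deforms the mesh around joints rather than tearing it.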
- the relationship is described as a ‘parent/child’ relationship.
- the positioning and orientation of the mesh node or region associated with the child bone is a product of the positioning, scaling and orientation of both the child and parent bone.
- this hierarchy simplifies the positioning of a character frame-by-frame since, for example, moving a thigh bone will also move a lower leg bone, resulting in a change in the position of the associated mesh of the whole leg.
- a skeletal model is used to enable the user to change bone parameters so as to modify, warp and otherwise deform the default mesh of the avatar, independent of whether skeletal animation is subsequently used to move the avatar about.
- blend shapes can be used to provide global modifications to the mesh of the avatar that can then be adjusted by the skeletal model in a similar fashion, again independently of whether blend shapes are used in subsequent animation.
- sets of blend shapes are pre-determined for different ethnicities, such as Caucasian, African, Oriental, Native American and Native Australian.
- a blend shape is a deformed version of a mesh and in the present embodiment is defined with respect to the vertex points that describe the default male or female mesh.
- the vertices of the blend shape could be defined as the positional offset from the vertices that describe the default, un-deformed mesh.
- the blend shape associated with the brow of a Native Australian for example, will be different to that of a Caucasian.
- a user interface enables the user to select the percentage of each ethnicity they wish to include in their avatar, thereby providing a weighted average of the different pre-determined blend shapes for each ethnicity. Accordingly, a user can interact with the user interface to adjust the blend weight of each blend shape associated with each ethnic type so as to quickly and straightforwardly modify the avatar's basic appearance.
- an avatar could be 50% Caucasian and 50% African, or alternatively 40% Native American, 30% Oriental and 30% Native Australian.
- predetermined skin tones may be mixed in the same proportions, but preferably these are also independently adjustable.
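The percentage-based mixing described above might be sketched as follows; the function name and the example values are illustrative, not taken from the patent. The user's percentage choices become blend weights, normalised so they sum to one.

```python
def ethnicity_weights(percentages):
    """Convert UI percentage choices into normalised blend weights."""
    total = sum(percentages.values())
    if total == 0:
        raise ValueError("at least one ethnicity must be non-zero")
    return {name: p / total for name, p in percentages.items()}

# The 40/30/30 example from the text:
print(ethnicity_weights({"native_american": 40, "oriental": 30, "native_australian": 30}))
# -> {'native_american': 0.4, 'oriental': 0.3, 'native_australian': 0.3}
```

The resulting weights would then drive the per-ethnicity blend shapes in the same way as any other blend weights, with skin tones mixed in the same or independently adjusted proportions.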
- different blend shape sets may be used for male and female avatars as appropriate, and the available ethnicities are not limited to the above examples; indeed, they can extend to fictional races and species.
- Weighted blend shapes may also be used to modify the appearance of the avatar in terms of body type or amount of body fat. For example, blend shapes for different neck thicknesses could be used, or blend shapes relating to fatter or thinner body types could be used to modify the basic body mesh describing an avatar.
- various bones and groups of bones can be adjusted directly via respective user interfaces so as to modify or further modify the appearance of the avatar.
- bones used to control the jaw-line can be adjusted by selecting from a menu of adjustable features the option ‘lower jaw-line’ and then adjusting the bone parameters using the controller 751 .
- because the controller has at least two sets of directional controls, typically two sets of parameters can be adjusted together.
- these adjustments may comprise, for example, rear jaw width (left-hand horizontal control) and the angle between the chin and the rear of the jaw (left-hand vertical control), and chin prominence (right-hand horizontal control) and degree of double-chin (right-hand vertical control).
- These adjustments change the position, size and orientation of bones associated with the jaw accordingly, and can also affect any bones coupled to these jaw bones in the skeletal hierarchy; thus for example the cheek, upper jaw, nose, ears and neck may each be affected by changes to the jaw line.
- the position, scale and orientation of each individual bone is not necessarily accessible via the user interface; the parameters input by the user are translated into parameters of the individual bones by the entertainment device. This makes the adjustment of the facial features simple for the user.
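The translation of a single high-level UI control into several individual bone parameters might look like the following sketch; the mapping table, bone names and scale factors are invented for illustration, not taken from the patent.

```python
# Each UI control fans out to (bone, parameter, scale) triples.
UI_TO_BONES = {
    "chin_prominence": [("chin", "position_z", 1.0), ("lower_jaw", "position_z", 0.4)],
    "rear_jaw_width": [("jaw_left", "position_x", -0.5), ("jaw_right", "position_x", 0.5)],
}

def apply_ui_control(bones, control, value):
    """bones: dict of bone name -> dict of parameters; value: signed stick deflection."""
    for bone, param, scale in UI_TO_BONES[control]:
        bones[bone][param] = bones[bone].get(param, 0.0) + scale * value

bones = {"chin": {}, "lower_jaw": {}, "jaw_left": {}, "jaw_right": {}}
apply_ui_control(bones, "chin_prominence", 0.5)
print(bones["chin"])       # -> {'position_z': 0.5}
print(bones["lower_jaw"])  # -> {'position_z': 0.2}
```

A single stick movement thus adjusts a whole group of coupled bones in proportion, which is what keeps the feature adjustment simple for the user.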
- bone or bone groups can be adjusted in a similar manner to modify facial features, including cheek bones, brow ridge, eye socket size, lateral eye position, vertical eye position, nose position, cranial shape, upper face shape and lower face shape.
- bones do not correspond to literal bones and so may also be used for such features as lip size and shape, nose profile, vertical ear position and ear shape, as well as for double chins as in the preceding example.
- an additional naturalistic effect is achieved by allowing modifications to one or more bones or bone groups to be asymmetric. For example, ears are rarely exactly matched, both in terms of size and vertical position on the head.
- bones used to control the ears can be adjusted by selecting from a menu of adjustable features the option ‘ear disparity’ or the like and then adjusting the bone parameters using the controller 751 .
- the relative imbalance in vertical position of the ears can be controlled by the left-hand horizontal control of the controller such that a move to the right raises the right ear and lowers the left, whilst a move to the left raises the left ear and lowers the right.
- the relative imbalance in ear size can be controlled for example by the left-hand vertical control of the controller such that a move upward increases the size of the right ear and decreases the size of the left, and a move downward increases the size of the left ear and decreases the size of the right.
- Other bones or bone groups can be adjusted to create asymmetry in a similar manner, such as eye socket size, lateral and vertical eye position, brow tilt, vertical and lateral cheek bone position, nose position and profile, lateral jaw offset and jaw tilt.
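The ‘ear disparity’ control described in the preceding paragraphs can be sketched as one signed axis value applied with opposite signs to the two ear bones; the bone names and parameter layout are invented for illustration.

```python
def apply_ear_disparity(bones, axis_value):
    """Positive axis_value raises the right ear and lowers the left by the same amount."""
    bones["ear_right"]["position_y"] = bones["ear_right"].get("position_y", 0.0) + axis_value
    bones["ear_left"]["position_y"] = bones["ear_left"].get("position_y", 0.0) - axis_value

bones = {"ear_left": {}, "ear_right": {}}
apply_ear_disparity(bones, 0.1)
print(bones)  # -> {'ear_left': {'position_y': -0.1}, 'ear_right': {'position_y': 0.1}}
```

The other asymmetry controls listed above (eye sockets, brow tilt, cheek bones, jaw offset and so on) would follow the same pattern, with each control driving a pair of bones in opposite directions.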
- Several of these features can also be further grouped to provide quick asymmetry adjustments, typically based upon size and angle disparity with respect to the centre line of the face. Referring to FIGS. 14A and 14B, these may include for example upper- and/or lower-face symmetry, head (cranial) symmetry and overall face symmetry. In each case, the user input is very simple (e.g. vertical axis adjustments affect size aspects and horizontal axis adjustments affect balance aspects of the asymmetry), and is coupled to parameters of a relevant subset of bones in the skeletal model by a series of weights or transforms, so that for example eye, ear and nose changes in a single ‘face symmetry’ adjustment are not necessarily to the same degree, but are proportionate with respect to each other so as to create a plausible human face following the subsequent deformation of the face mesh.
- the user interface enables the addition of further meshes that may interact with bones of the avatar or be associated with further bones.
- These meshes typically provide accessories for the head and face, such as spectacles, hair, hats, and headphones, and for exotic races or amusing modifications, e.g. features such as horns, crests and trunks.
- a spectacles mesh may be associated with ear, cheek and nose bones, so that the position of the spectacles automatically reacts to the structure of the face.
- hair, hats and headphones may be associated with ear and cranial bones so that they stretch to fit the size of head and line up with the ear position.
- Other facial features such as beard and moustaches would be associated with the nose and jaw bones in a manner similar to the facial mesh over which they are placed or which they replace.
- a highly distinctive and naturalistic avatar can be generated.
- by grouping adjustments in decreasing scales of detail, from overall ethnicity (which may be determined via weighted blend shapes), to large-scale adjustments (e.g. upper/lower face), to a selection of intuitive adjustments to certain parameters of certain bone groups, and finally to individual bones, distinctive and individual faces can be quickly made for the avatar, whilst keeping its facial features harmonious by virtue of the linkages between bones in the skeletal model and the vertex weighting of the bones to the mesh.
- FIG. 15A shows a default female avatar adjusted by weighted blend shapes to correspond to 50% Caucasian, 50% Native American.
- FIGS. 15B and 15C then show this female avatar after various bone parameters have been adjusted in the manner described herein, so as to produce two highly distinctive faces by subsequent manipulation of the skeletal model.
- the avatar may be further customised by the application of texture layers and texture effects.
- Different texture layers adding wrinkles, freckles, moles, blemishes and scars may be added with varying degrees of transparency via the user interface.
- skin texture may be introduced and varied by controlling the degree of bump mapping (or other lighting effect) used in relation to one or more of these textures, or other dedicated bump-mapping textures.
- one or more texture layers may be alpha-blended, bump-mapped or otherwise included on the avatar according to an ‘age’ slider, thereby providing a quick way to vary the age of the character.
- certain bone parameters may also be coupled to this slider, for example to cause sunken cheekbones, deeper eye sockets, a thinner neck and larger ears as age progresses.
- different age sliders or an age profile switch could be provided to give different age effects; for example another age profile could result in the character becoming fatter-faced, red-cheeked and jowly with age, rather than gaunt.
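The coupling of an ‘age’ slider to both texture blending and bone parameters might be sketched as follows; the coefficients and parameter names are invented for illustration, and a real implementation would drive the renderer's alpha-blending and the skeletal model directly.

```python
def apply_age(age):                      # age in the range [0.0, 1.0]
    wrinkle_alpha = age                  # wrinkle texture layer fades in with age
    bone_changes = {
        "cheek": -0.2 * age,             # sunken cheeks
        "eye_socket": -0.1 * age,        # deeper eye sockets
        "ear_scale": 1.0 + 0.1 * age,    # larger ears
    }
    return wrinkle_alpha, bone_changes

alpha, changes = apply_age(0.5)
print(alpha)  # -> 0.5
```

A different age profile would simply swap in a different set of coefficients (e.g. positive cheek offsets for a fatter-faced, jowly profile rather than a gaunt one).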
- the skeletal model and the mesh deformations responsive to the skeletal model and to the weighted blend shapes are computed by one or more SPEs 110 A-H.
- the resulting modified mesh is used in combination with one or more textures by the RSX unit 200 to render the avatar.
- the modified mesh is further manipulated to move the avatar, animate the face, perform lip-syncing and display emotions in substantially the same manner as would be done with a conventional or default avatar mesh.
- the PS3 entertainment device transmits configuration details of the user's avatar including the modified bone parameters and blend shape weights to the Home environment server 2010 , or optionally in a peer-to-peer fashion to the other PS3s 2031 , 2032 , 2033 in the same instance of the Home environment.
- the peer PS3 receives from either the Home environment server or the peer PS3s the respective configuration details of the other avatars within that instance of the Home environment, also including their respective modified bone parameters and blend shape weights.
- the PS3 entertainment device 10 then computes the mesh deformation for each avatar responsive to the relevant blend shape weights and modified bone parameters, before rendering it according to the other configuration details received for that avatar.
- the data describing the deformed mesh is transmitted rather than the data describing the blend shape weights and bone parameters, thereby avoiding the need for the recipient PS3 to compute the effect of the blend shape weights and bone parameters on the mesh for each of the other avatars.
- the mesh may be transmitted in a conventional manner or as a set of deviations from a default mesh (or a default male or female mesh as applicable).
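Transmitting the mesh as a set of deviations from a default mesh can be sketched as a simple per-vertex delta encoding; this is an illustration of the idea, not the patent's wire format.

```python
def encode_deltas(default_mesh, modified_mesh):
    """Per-vertex offsets of the modified mesh from the default mesh."""
    return [(mx - dx, my - dy, mz - dz)
            for (dx, dy, dz), (mx, my, mz) in zip(default_mesh, modified_mesh)]

def decode_deltas(default_mesh, deltas):
    """Reconstruct the modified mesh from the shared default mesh plus deltas."""
    return [(dx + ox, dy + oy, dz + oz)
            for (dx, dy, dz), (ox, oy, oz) in zip(default_mesh, deltas)]

default = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
modified = [(0.0, 0.25, 0.0), (1.0, 0.0, 0.5)]
deltas = encode_deltas(default, modified)
print(deltas)  # -> [(0.0, 0.25, 0.0), (0.0, 0.0, 0.5)]
assert decode_deltas(default, deltas) == modified
```

Since every PS3 already holds the default male and female meshes, only the deltas need to cross the network, and the recipient is spared recomputing the blend shape and bone effects for each avatar.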
- the Home environment server is therefore operable to receive data descriptive of respective blend shape weights and modified avatar skeletons from each remote entertainment device in an instance of the Home environment, and to transmit this data to the respective remaining PS3s.
- it is operable to transmit mesh data for each avatar, either in a conventional mesh format or as deviations from a default mesh.
- an on-line system comprising a Home environment server and two or more PS3s allows the users of each PS3 to customise their own avatar by virtue of a skeletal modifier and blend shape weight adjuster coupled to a user interface, and to then distribute these modified avatars within the Home environment via the Home environment server before each PS3 renders the Home environment populated with said modified avatars.
- embodiments of the present invention are not limited to the Home environment, but are also applicable to any multiplayer on-line environment where a plurality of users may encounter each other through avatars, such as, for example, an on-line game.
- the ethnic changes to the avatar's face generated by weighted blend shapes may also be achieved by other deformers such as a lattice deformer or sculpt deformer, or by suitably placed bones with different parameter values, or by a combination of the above.
- a method of user identification in an on-line virtual environment comprises in a first step s 10 , selecting a user avatar for use in the on-line virtual environment (for example selecting an initial gender or character class). Then in a second step s 20 , one or more physical properties of one or more skeletal components of the user avatar are modified via a user interface. In a third step s 30 , vertices in a three dimensional mesh representing some or all of the user avatar are adjusted in response to the position of one or more skeletal components of the user avatar, such that the placement of such vertices is responsive to one or more bones of the modified skeletal model. Then in a fourth step s 40 , the user avatar is rendered responsive to the modified user avatar skeleton.
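The four steps s10 to s40 might be sketched as a pipeline of stub functions; all names and the toy one-bone deformation rule below are invented for illustration.

```python
def select_avatar(gender):
    # s10: start from a default mesh and skeleton for the chosen gender
    # (stubbed here as one vertex and one bone, regardless of gender).
    return {"mesh": [(1.0, 1.0, 1.0)], "bones": {"jaw": {"scale": 1.0}}}

def modify_skeleton(avatar, bone, param, value):
    # s20: modify a physical property of a skeletal component via the UI.
    avatar["bones"][bone][param] = value

def deform_mesh(avatar):
    # s30: adjust mesh vertices in response to the modified skeleton
    # (toy rule: scale every vertex by the jaw bone's scale).
    s = avatar["bones"]["jaw"]["scale"]
    avatar["mesh"] = [(x * s, y * s, z * s) for x, y, z in avatar["mesh"]]

def render(avatar):
    # s40: hand the deformed mesh to the renderer (stubbed: return the mesh).
    return avatar["mesh"]

avatar = select_avatar("female")
modify_skeleton(avatar, "jaw", "scale", 1.5)
deform_mesh(avatar)
print(render(avatar))  # -> [(1.5, 1.5, 1.5)]
```

The ordering matters: the skeleton is modified before the mesh is deformed, so the rendered avatar always reflects the user's latest bone parameters.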
- elements of the method of user identification in an on-line virtual environment and corresponding skeletal modelling, skeletal modification, blend shape weighting, user input and rendering means of the apparatus may be implemented in any suitable manner.
- adaptation of existing parts of a conventional equivalent entertainment device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable for use in adapting the conventional equivalent device.
Description
- This invention relates to an apparatus and method of avatar customisation. In online-gaming, it is conventional for players to adopt distinctive names for their in-game characters (generally termed ‘avatars’). In addition, these avatars may also be customised, for example according to race (real or fictional) or gender, and a range of different heads and bodies are often provided. In addition, features such as hair styles, skin tone and age may be customised. The purpose of such naming and customisation is typically to project the user's personality within the game, and/or to be as distinctive as possible. For example, see http://starwarsgalaxies.station.sony.com/players/guides.vm?id=70000.
- As on-line gaming continues to grow there is an increasing move to explore the social aspect of these virtual environments, and consequently a greater need for the user's avatar within such an environment to be distinctive enough to fulfil the requirements of social interaction between many individuals (e.g. see www.selectparks.net/blundell_charcust.pdf).
- Conventional means of further customising a user's avatar for such a purpose may include uploading the user's own face as a texture to use on the avatar (for example, see http://research.microsoft.com/˜zhang/Face/redherringReport.htm), or modifying the gestures and expressions of the avatar to reflect a particular mood that the user wishes to express. However, gestures and expressions are not instantly recognisable as they first need to be carried out. Meanwhile, uploading images of users' faces is potentially intrusive, and rendering the images in a consistent manner when each face may be captured under different lighting conditions and at different effective resolutions is difficult. Moreover, the user may be dissatisfied with the result if it is a poor approximation. Finally, many people online wish to present a fictional appearance whilst remaining true to their personality, or wish to appear appropriately ‘in character’ within the game environment; in this case a captured image is not a desirable solution.
- It has therefore been suggested that it would be desirable if the end-user could have access to customisation options ‘down to the level of the shape of a nostril’ (see the introduction to www.selectparks.net/blundell_charcust.pdf). However, this would result in a bewildering array of options for the user to work through, and significantly would also result in considerable work in providing the different customisation options during initial game development. In addition, a significant data overhead in terms of transmission of configuration data in a massively multiplayer game is also likely. Consequently such systems do not appear to have been realised in-game (see again http://starwarsgalaxies.station.sony.com/players/guides.vm?id=70000).
- The present invention seeks to address the above concerns.
- In a first aspect of the present invention, an entertainment device comprises skeletal modelling means to configure a three dimensional mesh representing some or all of a user avatar in response to at least a first property of one or more skeletal components of the user avatar, skeleton modification means to modify one or more properties of one or more skeletal components of the user avatar via a user interface, and rendering means to render the user avatar in accordance with the three dimensional mesh as configured in response to the modified user avatar skeleton.
- By configuring the three-dimensional mesh used to render the user avatar in accordance with a skeletal model, then by manipulation of one or more skeletal components the user can create distinctive faces for their avatars in a comparatively simple fashion before applying any further, more conventional changes such as texture or accessory selection to the mesh.
- In another aspect of the present invention, a server operable to administer a multi-player online virtual environment comprises reception means to receive data descriptive of respective modified avatar skeletons from a plurality of remote entertainment devices, and transmission means to transmit data descriptive of respective modified avatar skeletons to a plurality of remote entertainment devices.
- In another aspect of the present invention, a system comprising a server and two or more entertainment devices as described in the above aspects co-operate to allow the two or more entertainment devices to render the modified avatars of the users of the respective other devices.
- By enabling the distribution of user-modified skeletal models for avatars, advantageously a population of avatars within an on-line environment can therefore be more easily differentiated when the populated environment is rendered by each participating remote entertainment device.
- Further respective aspects and features of the invention are defined in the appended claims, including corresponding methods of operation as appropriate.
- Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
-
FIG. 1 is a schematic diagram of an entertainment device; -
FIG. 2 is a schematic diagram of a cell processor; -
FIG. 3 is a schematic diagram of a video graphics processor; -
FIG. 4 is a schematic diagram of an interconnected set of game zones in accordance with an embodiment of the present invention; -
FIG. 5 is a schematic diagram of a Home environment online client/server arrangement in accordance with an embodiment of the present invention; -
FIG. 6 a is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention; -
FIG. 6 b is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention; -
FIG. 6 c is a schematic diagram of a cinema zone in accordance with an embodiment of the present invention; -
FIG. 6 d is a schematic diagram of a developer/publisher zone in accordance with an embodiment of the present invention; -
FIG. 7 is a flow diagram of a method of on-line transaction in accordance with an embodiment of the present invention; -
FIG. 8 a is a schematic diagram of an apartment zone in accordance with an embodiment of the present invention; -
FIG. 8 b is a schematic diagram of a trophy room zone in accordance with an embodiment of the present invention; -
FIG. 9 is a schematic diagram of a communication menu in accordance with an embodiment of the present invention; -
FIG. 10 is a schematic diagram of an interactive virtual user device in accordance with an embodiment of the present invention; -
FIG. 11 is a schematic diagram of a user interface in accordance with an embodiment of the present invention; -
FIG. 12 is a schematic diagram of a user interface in accordance with an embodiment of the present invention; -
FIG. 13 is a schematic diagram of a user interface in accordance with an embodiment of the present invention; -
FIGS. 14A and 14B are schematic diagrams of a user interface in accordance with an embodiment of the present invention; -
FIGS. 15A, B and C are schematic diagrams of a user avatar in accordance with an embodiment of the present invention; and -
FIG. 16 is a flow diagram of a method of user identification in accordance with an embodiment of the present invention. - An apparatus and method of user identification are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practice the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
- In a summary embodiment of the present invention, a user of an entertainment device connected to an on-line virtual environment selects and customises an avatar using conventional options such as gender, clothing and skin-tone. In addition, however, the user can also modify the three-dimensional mesh used to define the surface of the user's avatar, upon which textures relating to gender, skin tone, age etc., can be applied. This configuration is achieved using a comparatively simple user interface by positioning vertices of the avatar mesh in response to a skeletal model underpinning the avatar's mesh structure. The user can therefore modify the mesh of their avatar by making simple parametric adjustments to the so-called ‘bones’ of the skeletal models. Typically these bones are interlinked so that modification to one bone or set of bones also affects other related bones, so maintaining a harmonious set of physical proportions for the mesh. The user interface provides a hierarchy of adjustments, from whole-face skeletal adjustments (e.g. by race) to partial face skeletal adjustments (e.g. upper face, lower face, cranium) to individual features (e.g. cheek bones). This allows a quick modification of the avatar without compromising the ability to fine tune the results. Moreover, the user interface allows the modification of bone parameters to provide an asymmetric mesh, as this conveys a more naturalistic appearance for the avatar, as well as providing an additional source of distinctiveness and identity.
-
FIG. 1 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device. Asystem unit 10 is provided, with various peripheral devices connectable to the system unit. - The
system unit 10 comprises: aCell processor 100; a Rambus® dynamic random access memory (XDRAM)unit 500; a RealitySynthesiser graphics unit 200 with a dedicated video random access memory (VRAM)unit 250; and an I/O bridge 700. - The
system unit 10 also comprises a Blu Ray® Disk BD-ROM®optical disk reader 430 for reading from adisk 440 and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700. Optionally the system unit also comprises amemory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700. - The I/
O bridge 700 also connects to four Universal Serial Bus (USB) 2.0ports 710; agigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi)port 730; and a Bluetooth®wireless link port 740 capable of supporting up to seven Bluetooth connections. - In operation the I/
O bridge 700 handles all wireless, USB and Ethernet data, including data from one ormore game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from thegame controller 751 via a Bluetooth link and directs it to theCell processor 100, which updates the current state of the game accordingly. - The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to
game controllers 751, such as: aremote control 752; akeyboard 753; amouse 754; aportable entertainment device 755 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 756; and amicrophone headset 757. Such peripheral devices may therefore in principle be connected to thesystem unit 10 wirelessly; for example theportable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst themicrophone headset 757 may communicate via a Bluetooth link. - The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
- In addition, a legacy
memory card reader 410 may be connected to the system unit via aUSB port 710, enabling the reading ofmemory cards 420 of the kind used by the Playstation® or Playstation 2® devices. - In the present embodiment, the
game controller 751 is operable to communicate wirelessly with thesystem unit 10 via the Bluetooth link. However, thegame controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of thegame controller 751. In addition to one or more analogue joysticks and conventional control buttons, the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller. In the case of the Playstation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown). - The
remote control 752 is also operable to communicate wirelessly with thesystem unit 10 via a Bluetooth link. Theremote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content. - The Blu Ray Disk BD-
ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. Thereader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. Thereader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks. - The
system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the RealitySynthesiser graphics unit 200, through audio and video connectors to a display andsound output device 300 such as a monitor or television set having adisplay 305 and one ormore loudspeakers 310. Theaudio connectors 210 may include conventional analogue and digital outputs whilst thevideo connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition. - Audio processing (generation, decoding and so on) is performed by the
Cell processor 100. The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks. - In the present embodiment, the
video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by thesystem unit 10. The camera LED indicator is arranged to illuminate in response to appropriate control data from thesystem unit 10, for example to signify adverse lighting conditions. Embodiments of thevideo camera 756 may variously connect to thesystem unit 10 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs. - In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the
system unit 10, an appropriate piece of software such as a device driver should be provided. Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described. - Referring now to
FIG. 2 , theCell processor 100 has an architecture comprising four basic components: external input and output structures comprising amemory controller 160 and a dualbus interface controller 170A,B; a main processor referred to as thePower Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as theElement Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Playstation 2 device's Emotion Engine. - The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The
PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz. The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.
- Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown). Each
SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
- The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the
Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
- The
memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
- The
dual bus interface 170A,B comprises a Rambus FlexIO® system interface 172A,B. The interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Simulator graphics unit 200 via controller 170B.
- Data sent by the
Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
- Referring now to
FIG. 3, the Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100. The RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output. The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s. In operation, the VRAM 250 maintains a frame buffer 214 and a texture buffer 216. The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines. The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250.
- The
vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered. - The
pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel. Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture). - The render
pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image. Optionally, if the intervening pixel process will not affect depth values (for example in the absence of transparency or displacement mapping) then the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency. In addition, the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
- Both the vertex shaders 205 and
pixel shaders 207 are based on the Shader Model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second. The total floating point performance of the RSX 200 is 1.8 TFLOPS.
- Typically, the
RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene. In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles. Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B. The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B. Thus in effect the assigned SPEs become part of the video processing pipeline for the duration of the task.
- In general, the
PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled. The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process. Alternatively if all eight SPEs are functional, then the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor. - The
PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE. Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above. - Software instructions implemented by the
Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
- The software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS). In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video. The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally. The user navigates by moving through the function icons (representing the functions) horizontally using the
game controller 751, remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in analogous fashion. However, if a game, audio or movie disk 440 is inserted into the BD-ROM optical disk reader 430, the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).
- In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available. The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term “on-line” does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
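As an aside for the reader, the headline performance figures quoted earlier for the Cell processor and its Element Interconnect Bus follow from simple arithmetic. The short check below restates them; the constants come from the description above, while the script itself is purely illustrative and forms no part of the original disclosure.

```python
# Illustrative check of the Cell processor figures quoted in the description above.

CLOCK_HZ = 3.2e9          # PPE, SPU and EIB clock rate: 3.2 GHz

# PPE: eight single precision operations per clock cycle -> 25.6 GFLOPS
ppe_gflops = 8 * CLOCK_HZ / 1e9
print(ppe_gflops)                      # 25.6

# EIB: 12 participants, each reading/writing 8 bytes per clock -> 96 bytes/clock
eib_bytes_per_clock = 12 * 8
print(eib_bytes_per_clock)             # 96

# 96 bytes/clock at 3.2 GHz -> 307.2 GB/s theoretical peak bandwidth
eib_gb_per_s = eib_bytes_per_clock * CLOCK_HZ / 1e9
print(eib_gb_per_s)                    # 307.2
```

The same arithmetic reproduces the quoted 25.6 GB/s XDRAM figure (8 bytes per clock at 3.2 GHz over the memory interface).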
- In an embodiment of the present invention, the above-mentioned online capability comprises interaction with a virtual environment populated by avatars (graphical representations) of the user of the
PS3 10 and of other PS3 users who are currently online. - The software to enable the virtual interactive environment is typically resident on the
HDD 400, and can be upgraded and/or expanded by software that is downloaded, or stored on optical disk 440, or accessed by any other suitable means. Alternatively, the software may reside on a flash memory card 420, optical disk 440 or a central server (not shown). In an embodiment of the present invention, the virtual interactive environment (hereafter called the ‘Home’ environment) is selected from the cross-media bar. The Home environment then starts in a conventional manner similar to a 3D video game by loading and executing control software, loading 3D models and textures into video memory 250, and rendering scenes depicting the Home environment. Alternatively or in addition, the Home environment can be initiated by other programs, such as a separate game.
- Referring now to
FIG. 4, which displays a notional map of the Home environment, and FIG. 5, which is a schematic diagram of a Home environment online client/server arrangement, the user's avatar is spawned within a lobby zone 1010 by default. However, a user can select among other zones 1010-1060 (detailed below) of the map, causing the selected zone to be loaded and the avatar to be spawned within that zone. In an embodiment of the present invention, the map screen further comprises a sidebar on which the available zones may be listed, together with management tools such as a ranking option, enabling zones to be listed in order of user preference, or such as most recently added and/or A-Z listings. In addition a search interface may allow the user to search for a zone by name. In an embodiment of the present invention, there may be many more zones available than are locally stored on the user's PS3 at any one time; the local availability may be colour coded on the list, or the list may be filtered to only display locally available zones. If the user selects a locally unavailable zone, it can be downloaded from a Home environment server 2010.
- Referring now to
FIG. 6 a, the lobby zone 1010 typically resembles a covered piazza, and may comprise parkland (grass, trees, sculptures etc.), and gathering spaces (such as open areas, single benches or rows of seats etc.) where users can meet through their avatars.
- The
lobby zone 1010 typically also comprises advertisement-hoardings, for displaying either still or moving adverts for games or other content or products. These may be on the walls of the lobby, or may stand alone. - The
lobby zone 1010 may also include an open-air cinema 1012 showing trailers, high-profile adverts or other content from third-party providers. Such content is typically streamed or downloaded from a Home environment server 2010 to which the PS3 10 connects when the Home environment is loaded, as described in more detail later.
- The cinema screen is accompanied by seating for avatars in front of it, such that when an avatar sits down, the camera angle perceived by the user of the avatar also encompasses the screen.
- Referring now also to
FIG. 6 b, the lobby zone 1010 may also include general amusements 1014, such as functioning pool tables, bowling alleys, and/or a video arcade. Games of pool or bowling may be conducted via the avatar, such that the avatar holds the pool cue or bowling ball, and is controlled in a conventional manner for such games. In the video arcade, if an avatar approaches a videogame machine, the home environment may switch to a substantially full-screen representation of the videogame selected. Such games may, for example, be classic arcade or console games such as Space Invaders (®), or Pac-Man (®), which are comparatively small in terms of memory and processing and can be emulated by the PS3 within the Home environment or run as plug-ins to the Home environment. In this case, typically the user will control the game directly, without representation by the avatar. The game will switch back to the default Home environment view if the user quits the game, or causes the avatar to move away from the videogame machine. In addition to classic arcade games, user-created game content may be featured on one or more of the virtual video game machines. Such content may be the subject of on-line competitions to be featured in such a manner, with new winning content downloaded on a regular basis.
- In addition to the
lobby zone 1010, other zones (e.g. zones 1020-1060) may be selected via the map screen of FIG. 4, or alternatively the user can walk to these other areas by guiding their avatar to various exits 1016 from the lobby.
- Typically, an
exit 1016 takes the form of a tunnel or corridor (but may equally take the form of an anteroom) to the next area. While the avatar is within the tunnel or anteroom, the next zone is loaded into memory. Both the lobby and the next zone contain identical models of the tunnel or anteroom, or the model is a common resource to both. In either case, the user's avatar is relocated from the lobby-based version to the new zone-based version of the tunnel or anteroom at the same position. In this way the user's avatar can apparently walk seamlessly throughout the Home environment, without the need to retain the whole environment in memory at the same time.
- Referring now also to
FIG. 6 c, one available zone is a Cinema zone 1020. The Cinema zone 1020 resembles a multiplex cinema, comprising a plurality of screens that may show content such as trailers, movies, TV programmes, or adverts downloaded or streamed from a Home environment server 2010 as noted previously and detailed below, or may show content stored on the HDD 400 or on an optical disk 440, such as a Blu-Ray disk.
- Typically, the multiplex cinema will have an entrance area featuring a
screen 1022 on which high-profile trailers and adverts may be shown to all visitors, together with poster adverts 1024, typically but not limited to featuring upcoming movies. Specific screens and the selection and display of the trailers and posters can each be restricted according to the age of the user, as registered with the PS3. This age restriction can be applied to any displayed content to which an age restriction tag is associated, in any of the zones within the Home environment.
- In addition, in an embodiment of the present invention the multiplex cinema provides a number of screen rooms in which featured content is available, and amongst which the user can select. Within a screen room, downloaded, streamed or locally stored media can be played within a virtual cinema environment, in which the screen is set in a room with rows of seats, screen curtains, etc. The cinema is potentially available to all users in the Home environment, and so the avatars of other users may also be visible, for example watching commonly streamed material such as a web broadcast. Alternatively, the user can zoom in so that the screen occupies the full viewing area.
- Referring now also to
FIG. 6 d, another type of zone is a developer or publisher zone 1030. Typically, there may be a plurality of such zones available. Optionally, each may have its own exit from the lobby area 1010, or alternatively some or all may share an exit from the lobby and then have separate exits from within a tunnel or ante-room model common to or replicated by each available zone therein. Alternatively they may be selected from a menu, either in the form of a pop-up menu, or from within the Home environment, such as by selecting from a set of signposts. In these latter cases the connecting tunnel or anteroom will appear to link only to the selected developer or publisher zone 1030. Alternatively or in addition, such zones may be selected via the map screen, resulting in the zone being loaded into memory, and the avatar re-spawning within the selected zone.
- Developer or
publisher zones 1030 provide additional virtual environments, which may reflect the look and feel of the developer or publisher's products, brands and marks. - The developer or
publisher zones 1030 are supplementary software modules to the Home environment and typically comprise additional 3D models and textures to provide the structure and appearance of the zone. - In addition, the software operable to implement the Home environment supports the integration of third party software via an application program interface (API). Therefore, developers can integrate their own functional content within the Home environment of their own zone. This may take the form of any or all of:
- i. Downloading/streaming of specific content, such as game trailers or celebrity endorsements;
- ii. Changes in avatar appearance, behaviour and/or communication options within the zone;
- iii. The provision of one or more games, such as
basketball 1032 or a golf range 1034, optionally branded or graphically reminiscent of the developer's or publisher's games; - iv. One or more interactive scenes or vignettes representative of the developer's or publisher's games, enabling the player to experience an aspect of the game, hone a specific skill of the game, or familiarise themselves with the controls of a game;
- v. An arena, ring, dojo, court or
similar area 1036 in which remotely played games may be represented live by avatars 1038, for spectators to watch.
- Thus, for example, a developer's zone resembles a concourse in the developer's signature colours and featuring their logos, onto which gaming areas open, such as soccer nets or a skeet range for shooting. In addition, a booth (not shown) manned by game-specific characters allows the user's avatar to enter and either temporarily change into the lead character of the game, or zoom into a first person perspective, and enter a further room resembling a scene from the featured game. Here the user interacts with other characters from the game, and plays out a key scene. Returning to the concourse, adverts for the game and other content are displayed on the walls. At the end of the zone, the concourse opens up into an arena where a 5-a-side football match is being played, where the positions of the players and the ball correspond to a game currently being played by a popular group, such as a high-ranking game clan, in another country.
- In embodiments of the present invention, developer/publisher zones are available to download. Alternatively or in addition, to reduce bandwidth they may be supplied as demo content on magazine disks, or may be installed/upgraded from disk as part of the installation process for a purchased game of the developer or publisher. In the latter two examples, subsequent purchase or registration of the game may result in further zone content being unlocked or downloaded. In any event, further modifications, and timely advert and trailer media, may be downloaded as required.
- A similar zone is the
commercial zone 1040. Again, there may be a plurality of such commercial zones accessible in similar manner to the developer and publisher zones. Like developer/publisher zones 1030, commercial zones 1040 may comprise representative virtual assets of one or more commercial vendors in the form of 3D models, textures etc., enabling a rendering of their real-world shops, brands and identities, and these may be geographically and/or thematically grouped within zones.
- Space within commercial zones may be rented as so-called ‘virtual real-estate’ by third parties. For example, a retailer may pay to have a rendering of their shop included within a
commercial zone 1040 as part of a periodic update of the Home environment supplied via the Home environment server 2010, for example on a monthly or annual renewal basis. A retailer may additionally pay for the commerce facilities described above, either on a periodic basis or per item. In this way they can provide users of the Home environment with a commercial presence.
- Again, the commercial zone comprises supplementary software that can integrate with the home environment via an API, to provide additional communication options (shop-specific names, goods, transaction options etc.), and additional functionality, such as accessing an online database of goods and services for purchase, determining current prices, the availability of goods, and delivery options. Such functions may be accessed either via a menu (either as a pop-up or within the Home environment, for example on a wall) or via communication with automated avatars. Communication between avatars is described in more detail later.
- It will be appreciated that developers and publishers can also provide stores within commercial zones, and in addition that connecting tunnels between developer/publisher and commercial zones may be provided. For example, a tunnel may link a developer zone to a store that sells the developer's games. Such a tunnel may be of a ‘many to one’ variety, such that exits from several zones emerge from the same tunnel in-store. In this case, if re-used, typically the tunnel would be arranged to return the user to the previous zone rather than one of the possible others.
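The tunnel or anteroom transition described above can be sketched in outline as follows. This is an illustrative sketch only: the class and function names (Zone, traverse_tunnel) are invented for the example and do not appear in the disclosure.

```python
# Minimal sketch of the tunnel/anteroom transition described above: the next zone
# is loaded while the avatar is inside the shared tunnel model, and the avatar is
# then relocated to the same position in the new zone's copy of that model.
# For a 'many to one' tunnel, the zone to return to is remembered explicitly.

class Zone:
    def __init__(self, name):
        self.name = name
        self.loaded = False

    def load(self):
        # Stand-in for loading the zone's 3D models and textures into memory.
        self.loaded = True

def traverse_tunnel(tunnel_position, next_zone, return_zone=None):
    """Load next_zone during traversal, keeping the avatar's tunnel position."""
    next_zone.load()
    next_zone.return_zone = return_zone   # where a re-used tunnel should lead back
    return next_zone, tunnel_position

lobby = Zone("lobby")
store = Zone("store")
zone, pos = traverse_tunnel((2.0, 0.0, 5.0), store, return_zone=lobby)
print(zone.name, zone.loaded, pos)   # store True (2.0, 0.0, 5.0)
```

Because the avatar's position inside the tunnel is carried over unchanged, the walk appears seamless even though only one zone is held in memory at a time.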
- In an embodiment of the present invention, the software implementing the Home environment has access to an online-content purchase system provided by the PS3 OS. Developers, publishers and store owners can use this system via an interface to specify the IP address and query text that facilitates their own on-line transaction. Alternatively, the user can allow their PS3 registration details and credit card details to be used directly, such that by selecting a suitably enabled object, game, advert, trailer or movie anywhere within the Home environment, they can select to purchase that item or service. In particular, the
Home environment server 2010 can store and optionally validate the user's credit card and other details so that the details are ready to be used in a transaction without the user having to enter them. In this way the Home environment acts as an intermediary in the transaction. Alternatively such details can be stored at the PS3 and validated either by the PS3 or by the Home environment server. - Thus, referring now also to
FIG. 7, in an embodiment of the present invention a method of sale comprises in a step s2102 a user selecting an item (goods or a service) within the Home environment. In step s2104, the PS3 10 transmits identification data corresponding with the object to the Home environment server 2010, which in step s2106 verifies the item's availability from a preferred provider (preferably within the country corresponding to the IP address of the user). If the item is unavailable then in step s2107 it informs the user by transmitting a message to the user's PS3 10. Alternatively, it first checks for availability from one or more secondary providers, and optionally confirms whether supply from one of these providers is acceptable to the user. In step s2108, the Home environment server retrieves from data storage the user's registered payment details and validates them. If there is no valid payment method available, then the Home environment may request that the user enters new details via a secure (i.e. encrypted) connection. Once a valid payment method is available, then in step s2110 the Home environment server requests from the appropriate third party payment provider a transfer of payment from the user's account. Finally, in step s2112 the Home environment server places an order for the item with the preferred provider, giving the user's delivery address or IP address as applicable, and transferring appropriate payment to the preferred provider's account.
- In this way, commerce is not limited specifically to shops. Similarly, it is not necessary for shops to provide their own commerce applications if the preferred provider for goods or services when displayed within a shop is set to be that shop's owner. Where the goods or service may be digitally provided, then optionally it is downloaded from the preferred provider directly or via a
Home environment server 2010. - In addition to the above public zones, there are additional zones that are private to the individual user and may only be accessed by them or by invitation from them. These zones also have exits from the communal lobby area, but when entered by the avatar (or chosen via the map screen), load a respective version of the zone that is private to that user.
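The method of sale of FIG. 7 described above can be summarised as a short control-flow sketch. The function and data below are invented stand-ins for the server-side steps (s2102-s2112), not an actual implementation of the system.

```python
# Sketch of the FIG. 7 method of sale, with invented names for illustration.

def purchase(item, preferred_stock, payment_valid):
    # s2104/s2106: identification data is sent to the server, which verifies
    # availability of the item from the preferred provider.
    if not preferred_stock.get(item):
        return "unavailable"                 # s2107: inform the user
    # s2108: retrieve and validate the user's registered payment details.
    if not payment_valid:
        return "new payment details required"
    # s2110: request transfer of payment from the user's account.
    # s2112: place the order with the preferred provider.
    return "order placed"

stock = {"game": True, "poster": False}
print(purchase("game", stock, True))     # order placed
print(purchase("poster", stock, True))   # unavailable
print(purchase("game", stock, False))    # new payment details required
```

A fuller sketch would also cover the optional fallback to secondary providers, which slots in before the unavailability response.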
- Referring to
FIG. 8 a, the first of these zones is an apartment zone 1050. In an embodiment of the present invention, this is a user-customisable zone in which such features 1052 as wallpaper, flooring, pictures, furniture, outside scenery and lighting may be selected and positioned. Some of the furniture is functional furniture 1054, linked to PS3 functionality. For example, a television may be placed in the apartment 1050 on which can be viewed one of several streamed video broadcasts, or media stored on the PS3 HDD 400 or optical disk 440. Similarly, a radio or hi-fi may be selected that contains pre-selected links to internet radio streams. In addition, user artwork or photos may be imported into the room in the form of wall hangings and pictures.
- Optionally, the user (represented in
FIG. 8 a by their avatar 1056) may purchase a larger apartment, and/or additional goods such as a larger TV, a pool table, or automated non-player avatars. Other possible items include a gym, swimming pool, or disco area. In these latter cases, additional control software or configuration libraries to provide additional character functionality will integrate with the home environment via the API in a similar fashion to that described for the commercial and developer/publisher zones.
- Such purchases may be made using credit card details registered with the Home environment server. In return for a payment, the server downloads an authorisation key to unlock the relevant item for use within the user's apartment. Alternatively, the 3D model, textures and any software associated with an item may also be downloaded from the Home environment server or an authorised third-party server, optionally again associated with an authorisation key. The key may, for example, require correspondence with a firmware digital serial number of the
PS3 10, thereby preventing unauthorised distribution. - A user's apartment can only be accessed by others upon invitation from the respective user. This invitation can take the form of a standing invitation for particular friends from within a friends list, or in the form of a single-session pass conferred on another user, and only valid whilst that user remains in the current Home environment session. Such invitations may take the form of an association maintained by a
Home environment server 2010, or a digital key supplied between PS3 devices on a peer-to-peer basis that enables confirmation of status as an invitee. - In an embodiment of the present invention invited users can only enter the apartment when the apartment's user is present within the apartment, and are automatically returned to the lobby if the apartment's user leaves. Whilst within the apartment, all communication between the parties present (both user and positional data) is purely peer-to-peer.
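The two invitation forms described above, a standing invitation for friends on a friends list and a single-session pass valid only for the current Home environment session, amount to a simple access check. The sketch below is illustrative only, with invented names; the disclosure does not specify an implementation.

```python
# Sketch of the apartment access check described above: a standing invitation
# (friends list) always admits, while a single-session pass admits only while
# the invitee remains in the session for which the pass was conferred.

def may_enter_apartment(visitor, friends_list, session_passes, session_id):
    if visitor in friends_list:                       # standing invitation
        return True
    return session_passes.get(visitor) == session_id  # single-session pass

friends = {"alice"}
passes = {"bob": "session-42"}

print(may_enter_apartment("alice", friends, passes, "session-42"))  # True
print(may_enter_apartment("bob", friends, passes, "session-42"))    # True
print(may_enter_apartment("bob", friends, passes, "session-43"))    # False
print(may_enter_apartment("carol", friends, passes, "session-42"))  # False
```

In the peer-to-peer variant, the same check would be driven by verifying a supplied digital key rather than by a server-side lookup.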
- The apartment thus also provides a user with the opportunity to share home created content such as artwork, slideshows, audio or video with invited guests, and also to interact with friends without potential interference from other users within the public zones.
- When invited guests enter a user's apartment, the configuration of the room and the furnishings within it are transmitted in a peer-to-peer fashion between the attendees using ID codes for each object and positional data. Where a room or item is not held in common between the user and a guest, the model, textures and any code required to implement it on the guest's PS3 may also be transmitted, together with a single-use key or similar constraint, such as use only whilst in the user's apartment and whilst the user and guest remain online in this session.
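The peer-to-peer apartment transfer described above can be sketched as follows: the layout travels as (object ID, position) pairs, and only items the guest does not hold in common are accompanied by model data and a single-use key. All names and data below are invented for illustration.

```python
# Sketch of the apartment configuration transfer described above.

def build_transfer(layout, guest_library):
    """layout: list of (object_id, position) pairs describing the room;
    guest_library: set of object IDs the guest already holds locally."""
    transfer = {"layout": layout, "assets": []}
    for object_id, _position in layout:
        if object_id not in guest_library:
            # Items not held in common are sent with model data and a
            # single-use key constraining their use to this visit/session.
            transfer["assets"].append(
                {"id": object_id, "model": object_id + ".mesh", "key": "single-use"}
            )
    return transfer

layout = [("sofa-01", (0, 0)), ("tv-02", (3, 1))]
transfer = build_transfer(layout, guest_library={"sofa-01"})
print([asset["id"] for asset in transfer["assets"]])   # ['tv-02']
```

Sending only IDs and positions for common items keeps the peer-to-peer payload small, with full asset data reserved for the items the guest lacks.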
- Referring to
FIG. 8 b, a further private space that may similarly be accessed only by invitation is the user's Trophy Room 1060. The Trophy Room 1060 provides a space within which trophies 1062 earned during game play may be displayed.
- For example, a third-party game comprises seeking a magical crystal. If the player succeeds in finding the crystal, the third party game nominates this as a trophy for the
Trophy Room 1060, and places a 3D model and texture representative of the crystal in a file area accessed by the Home environment software when loading the Trophy Room 1060. The software implementing the Home environment can then render the crystal as a trophy within the Trophy Room.
- When parties are invited to view a user's trophy room, the models and textures required to temporarily view the trophies are sent from the user's PS3 to those of the other parties on a peer-to-peer basis. This may be done as a background activity following the initial invitation, in anticipation of entering the trophy room, or may occur when parties enter a connecting tunnel/anteroom or select the user's trophy room from the map screen. Optionally, where another party also has that trophy, they will not download the corresponding trophy from the user they are visiting. Therefore, in an embodiment of the present invention, each trophy comprises an identifying code.
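The identifying-code check described above reduces to filtering the host's trophy list against the codes the visitor already holds. The sketch below is illustrative; the names and codes are invented and do not come from the disclosure.

```python
# Sketch of the trophy-sharing step described above: a visiting party only
# downloads trophies whose identifying codes it does not already hold.

def trophies_to_download(host_trophies, guest_codes):
    """host_trophies: list of (code, asset) pairs; guest_codes: set of codes."""
    return [(code, asset) for code, asset in host_trophies if code not in guest_codes]

host = [("crystal-01", "crystal.mesh"), ("dragon-02", "dragon.mesh")]
print(trophies_to_download(host, {"crystal-01"}))   # [('dragon-02', 'dragon.mesh')]
```

The same filter supports the clan variant described below, where a newly won trophy is propagated only to members whose sets do not yet contain its code.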
- Alternatively or in addition, a trophy room may be shared between members of a group or so-called ‘clan’, such that a trophy won by any member of the clan is transmitted to other members of the clan on a peer-to-peer basis. Therefore all members of the clan will see a common set of trophies.
- Alternatively or in addition, a user can have a standing invitation to all members of the Home environment, allowing anyone to visit their trophy room. As with the commercial and developer/publisher zones, a plurality of rooms is therefore possible, for example a private, a group-based and a public trophy room. This may be managed either by selection from a pop-up menu or signposts within the Home environment as described previously, or by identifying a relevant user by walking up to their avatar, and then selecting to enter their (public) trophy room upon using the trophy room exit from the lobby.
- Alternatively or in addition, a public trophy room may be provided. This room may display the trophies of the person in the current instance of the Home environment who has the most trophies or a best overall score according to a trophy value scoring scheme.
- Alternatively it may be an aggregate trophy room, showing the best, or a selection of, trophies from some or all of the users in that instance of the Home environment, together with the ID of the user. Thus, for example, a user could spot a trophy from a game they are having difficulty with, identify who in the Home environment won it, and then go and talk to them about how they won it. Alternatively, a public trophy room could contain the best trophies across a plurality of Home environments, identifying the best gamers within a geographical, age specific or game specific group, or even worldwide. Alternatively or in addition, a leader board of the best scoring gamers can be provided and updated live.
- It will be appreciated that potentially a large number of additional third party zones may become available, each comprising additional 3D models, textures and control software. As a result a significant amount of space on
HDD 400 may become occupied by Home environment zones. - Consequently, in an embodiment of the present invention the number of third party zones currently associated with a user's Home environment can be limited. In a first instance, a maximum memory allocation can be used to prevent additional third party zones being added until an existing one is deleted. Alternatively or in addition, third party zones may be limited according to geographical relevance or user interests (declared on registration or subsequently via an interface with the Home environment server 2010), such that only third party zones relevant to the user by these criteria are downloaded. Under such a system, if a new third party zone becomes available, its relevance to the user is evaluated according to the above criteria, and if it is more relevant than at least one of those currently stored, it replaces the currently least relevant third party zone stored on the user's PS3.
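- The relevance-based replacement policy described above may be sketched as follows. This is a minimal illustration only; the function name, the relevance scoring and the return convention are assumptions, not the patented implementation:

```python
def maybe_install_zone(new_zone, installed, relevance, capacity):
    """Install new_zone only if there is free space, or if it scores higher
    than the least relevant zone currently stored (which it then replaces).
    Returns the evicted zone, the rejected zone, or None if simply added."""
    if len(installed) < capacity:
        installed.append(new_zone)
        return None  # free space: nothing evicted
    least = min(installed, key=relevance)
    if relevance(new_zone) > relevance(least):
        installed.remove(least)
        installed.append(new_zone)
        return least  # the least relevant zone was evicted
    return new_zone  # new zone was not relevant enough
```

In use, `relevance` would combine the geographical, interest-based and friends-based criteria mentioned in the text into a single comparable score.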
- Other criteria for relevance may include interests or installed zones of nominated friends, or the relevance of zones to games or other media that have been played on the user's PS3.
- Further zones may be admitted according to whether the user explicitly installs them, either by download or by disk.
- As noted above, within the Home environment users are represented by avatars. The software implementing the Home environment enables the customisation of a user's avatar from a selection of pre-set options in a similar manner to the customisation of the user's apartment. The user may select gender and skin tone, and customise the facial features and hair by combining available options for each. The user may also select from a wide range of clothing. To support this facility, a wide range of 3D models and textures for avatars are provided. In an embodiment of the present invention, users may import their own textures to display on their clothing. Typically, the parameters defining the appearance of each avatar only occupy around 40 bytes, enabling fast distribution via the home server when joining a populated Home environment.
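- A roughly 40-byte appearance record is achievable because each choice is just a small index into tables of preset options. The field layout below is entirely hypothetical (the text does not specify the format); it simply illustrates how such a compact record might be packed:

```python
import struct

# Hypothetical layout: six one-byte option indices plus 34 spare slots = 40 bytes.
# The real field set and ordering are not disclosed in the text.
AVATAR_FORMAT = "<B B B B B B 34B"

def pack_avatar(gender, skin, face, hair, top, trousers, extras=()):
    """Pack avatar appearance choices into a fixed 40-byte record."""
    extras = list(extras) + [0] * (34 - len(extras))
    return struct.pack(AVATAR_FORMAT, gender, skin, face, hair, top, trousers, *extras)

def unpack_avatar(data):
    """Recover the option indices from a packed 40-byte record."""
    vals = struct.unpack(AVATAR_FORMAT, data)
    return {"gender": vals[0], "skin": vals[1], "face": vals[2],
            "hair": vals[3], "top": vals[4], "trousers": vals[5],
            "extras": list(vals[6:])}
```

A record this small can be relayed by the server to every client in an instance almost for free, compared with sending meshes or textures.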
- Each avatar in the home environment can be identified by the user's ID or nickname, displayed in a bubble above the avatar. To limit the proliferation of bubbles, these fade into view when the avatar is close enough that the text it contains could easily be read, or alternatively when the avatar is close enough to interact with and/or is close to the centre of the user's viewpoint.
- The avatar is controlled by the user in a conventional third-person gaming manner (e.g. using the game controller 751), allowing them to walk around the Home environment. Some avatar behaviour is contextual; thus for example the option to sit down will only be available when the avatar is close to a seat. Other avatar behaviour is available at all times, such as for example the expression of a selected emotion or gesture, or certain communication options. Avatar actions are determined by use of the
game controller 751, either directly for actions such as movement, or by the selection of actions via a pop-up menu, summoned by pressing an appropriate key on the game controller 751. - Options available via such a menu include further modification of the avatar's appearance and clothing, and the selection of emotions, gestures and movements. For example, the user can select that their avatar smiles, waves and jumps up and down when the user sees someone they know in the Home environment.
- Users can also communicate with each other via their avatars using text or speech.
- To communicate by text, in an embodiment of the present invention, messages appear in pop-up bubbles above the relevant avatar, replacing their name bubble if necessary.
- Referring now also to
FIG. 9 , to generate a message the user can activate a pop-up menu 1070 in which a range of preset messages is provided. These may be complete messages, or alternatively or in addition may take the form of nested menus, the navigation of which generates a message by concatenating selected options. - Alternatively or in addition, a virtual keyboard may be displayed, allowing free generation of text by navigation with the
game controller 751. If a real keyboard 753 is connected via Bluetooth, then text may be typed into a bubble directly. - In an embodiment of the present invention, the lobby also provides a chat channel hosted by the Home environment server, enabling conventional chat facilities.
- To communicate by speech, a user must have a microphone, such as a
Bluetooth headset 757, available. Then in an embodiment of the present invention, either by selection of a speech option by pressing a button on the game controller 751, or by use of a voice activity detector within the software implementing the Home environment, the user can speak within the Home environment. When speaking, a speech icon may appear above the head of the avatar for example to alert other users to adjust volume settings if necessary. - The speech is sampled by the user's PS3, encoded using a Code Excited Linear Prediction (CELP) codec (or other known VoIP applicable codec), and transmitted in a peer-to-peer fashion to the eight nearest avatars (optionally provided they are within a preset area within the virtual environment surrounding the user's avatar). Where more than eight other avatars are within the preset area, one or more of the PS3s that received the speech may forward it to other PS3s having respective user avatars within the area that did not receive the speech, in an ad-hoc manner. To co-ordinate this function, in an embodiment of the present invention the PS3 will transmit a speech flag to all PS3s whose avatars are within the preset area, enabling them to place a speech icon above the relevant (speaking) avatar's head (enabling their user to identify the speaker more easily) and also to notify the PS3s of a transmission. Each PS3 can determine from the relative positions of the avatars which ones will not receive the speech, and can elect to forward the speech to the PS3 of whichever avatar they are closest to within the virtual environment. Alternatively, the PS3s within the area can ping each other, and whichever PS3 has the lowest lag with a PS3 that has not received the speech can elect to forward it.
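- The selection of direct speech recipients described above can be sketched as follows. This is a simplified 2D illustration under stated assumptions: the function and parameter names are invented, and only the "nearest eight within a preset area" rule from the text is reproduced:

```python
import math

def speech_recipients(speaker_pos, avatar_positions, max_peers=8, radius=30.0):
    """Pick up to max_peers avatars nearest the speaker, restricted to a
    preset area (radius) around it. Avatars in the area beyond the first
    max_peers are returned separately, to be reached by ad-hoc forwarding."""
    def dist(p):
        return math.dist(speaker_pos, p)
    # Keep only avatars inside the preset area around the speaker.
    in_area = [(aid, pos) for aid, pos in avatar_positions.items()
               if dist(pos) <= radius]
    in_area.sort(key=lambda item: dist(item[1]))
    direct = [aid for aid, _ in in_area[:max_peers]]
    overflow = [aid for aid, _ in in_area[max_peers:]]
    return direct, overflow
```

The `overflow` list corresponds to those avatars whose PS3s must be reached by the ad-hoc forwarding scheme (closest-avatar or lowest-ping election) described in the text.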
- It will be appreciated that the limitation to eight is exemplary, and the actual number depends upon such factors as the speech compression ratio and the available bandwidth.
- In an embodiment of the present invention, such speech can also be relayed to other networks, such as a mobile telephony network, upon specification of a mobile phone number. This may be achieved either by routing the speech via the Home environment server to a gateway server of the mobile network, or by Bluetooth transmission to the user's own mobile phone. In this latter case, the mobile phone may require middleware (e.g. a Java applet) to interface with the PS3 and route the call.
- Thus a user can contact a person on their phone from within the Home environment. In a similar manner, the user can also send a text message to a person on their mobile phone.
- In a similar manner to speech, in an embodiment of the present invention users whose PS3s are equipped with a video camera such as the Sony ® Eye Toy ® video camera can use a video chat mode, for example via a pop-up screen, or via a TV or similar device within the Home environment, such as a Sony Playstation Portable (PSP) held by the avatar. In this case video codecs are used in addition to or instead of the audio codecs.
- Optionally, the avatars of users with whom you have spoken recently can be highlighted, and those with whom you have spoken most may be highlighted more prominently, for example by an icon next to their name, or a level of glow around their avatar.
- Referring back to
FIG. 5 , when a user selects to activate the Home environment on their PS3 10, the locally stored software generates the graphical representation of the Home environment, and connects to a Home environment server 2010 that assigns the user to one of a plurality of online Home environments. - It will be understood that potentially many tens of thousands of users may be online at any one time. Consequently to prevent overcrowding, the
Home environment server 2010 will support a large plurality of separate online Home environments. Likewise, there may be many separate Home environment servers, for example in different countries. - Once assigned to a Home environment, a PS3 initially uploads information regarding the appearance of the avatar, and then in an ongoing fashion provides the Home environment server with positional data for its own avatar, and receives from the Home environment server the positional data of the other avatars within that online Home environment. In practice this positional update is periodic (for example every 2 seconds) to limit bandwidth, so other PS3s must interpolate movement. Such interpolation of character movement is well-known in on-line games. In addition, each update can provide a series of positions, improving the replication of movement (with some lag), or improving the extrapolation of current movement.
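- The interpolation between periodic positional updates mentioned above can be illustrated with a simple linear interpolation (lerp). The text only notes that interpolation is used; this plain lerp is an assumption, and real systems often add extrapolation or smoothing on top:

```python
def interpolate_position(prev, curr, prev_t, curr_t, render_t):
    """Linearly interpolate an avatar's position between two periodic
    server updates: (prev at prev_t) and (curr at curr_t), evaluated
    at render time render_t."""
    if curr_t == prev_t:
        return curr
    alpha = (render_t - prev_t) / (curr_t - prev_t)
    alpha = max(0.0, min(1.0, alpha))  # clamp within the update interval
    return tuple(p + alpha * (c - p) for p, c in zip(prev, curr))
```

With updates every 2 seconds, a client would call this every rendered frame, so the remote avatar glides between reported positions rather than teleporting.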
- In addition, the IP addresses of the
other PS3s within the Home environment 2024 are shared so that they can transmit other data such as speech in a peer-to-peer fashion between themselves, thereby reducing the required bandwidth of data handled by the Home environment server. - To prevent overcrowding within the Home environments, each will support a maximum of, for example, 64 users.
- The selection of a Home environment to which a user will be connected can take account of a number of factors, either supplied by the PS3 and/or known to the Home environment server via a registration process. These include but are not limited to:
- i. The geographical location of the PS3;
- ii. The user's preferred language;
- iii. The user's age;
- iv. Whether any users within the current user's ‘friends list’ are in a particular Home environment already;
- v. What game disk is currently within the user's PS3;
- vi. What games have recently been played on the user's PS3.
- Thus, for example, a Swiss teenager may be connected to a Home environment on a Swiss server, with a maximum user age of 16 and a predominant language of French. In another example, a user with a copy of ‘Revolution’ mounted in their PS3 may be connected to a home environment where a predominant number of other users also currently have the same game mounted, thereby facilitating the organisation of multiplayer games. In this latter case, the
PS3 10 detects the game loaded within the BD-ROM 430 and informs the Home environment server 2010. The server then chooses a Home environment accordingly. - In a further example, a user is connected to a Home environment in which three users identified on his friends list can be found. In this latter example, the friends list is a list of user names and optionally IP addresses that have been received from other users that the given user wishes to meet regularly. Where different groups of friends are located on different Home environment servers (e.g. where the current user is the only friend common to both sets) then the user may either be connected to the one with the most friends, or given the option to choose.
- Conversely, a user may invite one or more friends to switch between Home environments and join them. In this case, the user can view their friends list via a pop-up menu or from within the Home environment (for example via a screen on the wall or an information booth) and determine who is on-line. The user may then broadcast an invite to their friends, either using a peer-to-peer connection or, if the friend is within a Home environment or the IP address is unknown, via the Home environment server. The friend can then accept or decline the invitation to join.
- To facilitate invitation, generally a Home environment server will assign fewer than the maximum supported number of users to a specific home environment, thereby allowing such additional user-initiated assignments to occur. This so-called ‘soft-limit’ may, for example, be 90% of capacity, and may be adaptive, for example changing in the early evening or at weekends when people are more likely to meet up with friends on-line.
- Where several friends are within the same Home environment, in an embodiment of the present invention the map screen may also highlight those zones in which the friends can currently be found, either by displaying their name on the map or in association with the zone name on the side bar.
- Referring now also to
FIG. 10 , in addition, preferences, settings, functions of the Home environment and optionally other functionality may be viewed, adjusted or accessed as appropriate by use of a virtual Sony ® Playstation Portable ® (PSP) entertainment device 1072 that can be summoned by use of the game controller 751 to pop up on screen. The user can then access these options, settings and functionality via a PSP cross-media bar 1074 displayed on the virtual PSP. As noted above, the PSP could also be used as an interface for video chat. - When a user wishes to leave the Home environment, in embodiments of the present invention they may do so by selection of an appropriate key on the
game controller 751, by selection of an exit option from a pop-up menu, by selection of an exit from within the map screen, by selection of an option via their virtual PSP or by walking through a master exit within the lobby zone. - Typically, exiting the Home environment will cause the
PS3 10 to return to the PS3 cross media bar. - Finally, it will be appreciated that additional, separate environments based upon the Home environment software and separately accessible from the PS3 cross-media bar are envisaged. For example, a supermarket may provide a free disk upon which a Supermarket environment, supported in similar fashion by the Home environment servers, is provided. Upon selection, the user's avatar can browse displayed goods within a virtual rendition of the supermarket (either as 3D models or textures applied to shelves) and click on them to purchase as described above. In this way retailers can provide and update online shopping facilities for their own user base.
- In an embodiment of the present invention, the avatar model comprises two aspects; a mesh, or skin, defining the three dimensional surface upon which textures are placed, and a hierarchy of so-called ‘bones’ used to modify the vertices of the mesh. It should be understood that these bones are typically one-dimensional lines comprising a position, size and orientation, and are associated with vertices or regions of a mesh and/or optionally with other bones. As such, they do not correspond to human bones in the conventional sense.
- The mesh typically has a default design, e.g. for male and female avatars. In conventional animation, this mesh can be deformed by so-called ‘blend shapes’ (or ‘morph targets’). Blend shapes are commonly used for facial animation of game characters or avatars using known techniques. For example, during the designing of a game, an artist will typically explicitly design a default mesh describing the head of a game character or avatar together with blend shapes that depict facial expressions of that game character or avatar, such as left eyebrow raised, right eyebrow raised, mouth saying “oo”, mouth saying “ee”, and the like. During animation, the animator assigns a blend weight to each blend shape so as to specify the degree to which that blend shape will influence the distortion of the template mesh during the animation. The rendered position of each vertex consequently corresponds to the sum of the blend shape offset positions multiplied by their respective weights plus the vertex position of the template mesh. Therefore, for example, an animator might choose to create a smiling avatar with a raised eyebrow by assigning an appropriate weight to those blend shapes describing those facial expressions.
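- The weighted-sum rule just described (rendered vertex = template vertex + sum of blend weights times per-vertex offsets) can be sketched per vertex as follows. This is a plain-Python illustration with invented names; a real engine evaluates it for every vertex, typically on the GPU:

```python
def blended_vertex(template_vertex, blend_offsets, weights):
    """Compute a rendered vertex position as the template position plus
    each blend shape's per-vertex offset scaled by its blend weight."""
    x, y, z = template_vertex
    for shape, (dx, dy, dz) in blend_offsets.items():
        w = weights.get(shape, 0.0)  # unweighted shapes contribute nothing
        x += w * dx
        y += w * dy
        z += w * dz
    return (x, y, z)
```

So a half-strength "smile" shape moves each affected vertex half-way along that shape's offset, and several shapes (smile plus raised eyebrow) simply add together.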
- Alternatively or in addition to blend shapes, animators traditionally also use skeletal animation. In skeletal animation, each bone in the skeleton is associated with one or more vertices of the mesh. Conversely, these vertices can also be associated with one or more bones, each association being determined by a vertex weight. Consequently the displacement of the mesh is determined by the weighted influence of the neighbouring bones.
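- The weighted bone influence described above is commonly known as linear blend skinning. The sketch below simplifies each bone's transform to a pure translation to keep the arithmetic visible; real skinning uses full 4x4 bone matrices, and the names here are illustrative:

```python
def skinned_vertex(rest_vertex, bone_transforms, vertex_weights):
    """Deformed position = weighted sum, over each influencing bone, of the
    rest vertex transformed by that bone (translation-only simplification).
    vertex_weights maps bone name -> influence weight (weights sum to 1)."""
    x = y = z = 0.0
    for bone, w in vertex_weights.items():
        tx, ty, tz = bone_transforms[bone]
        x += w * (rest_vertex[0] + tx)
        y += w * (rest_vertex[1] + ty)
        z += w * (rest_vertex[2] + tz)
    return (x, y, z)
```

A vertex influenced equally by a moved bone and a stationary one therefore ends up half-way between the two transformed positions, which is what makes joints bend smoothly.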
- Where one bone is linked to another bone, the relationship is described as a ‘parent/child’ relationship. In this relationship, the positioning and orientation of the mesh node or region associated with the child bone is a product of the positioning, scaling and orientation of both the child and parent bone.
- In conventional so-called ‘skeletal’ animation, this hierarchy simplifies the positioning of a character frame-by-frame since, for example, moving a thigh bone will also move a lower leg bone, resulting in a change in the position of the associated mesh of the whole leg.
- In the present embodiment, a skeletal model is used to enable the user to change bone parameters so as to modify, warp and otherwise deform the default mesh of the avatar, independent of whether skeletal animation is subsequently used to move the avatar about.
- Likewise, blend shapes can be used to provide global modifications to the mesh of the avatar that can then be adjusted by the skeletal model in a similar fashion, again independently of whether blend shapes are used in subsequent animation.
- Referring now to
FIG. 11 , to facilitate a fast customisation of the user's avatar, sets of blend shapes (or morph targets) are pre-determined for different ethnicities such as - Caucasian, Native American, African, Middle Eastern, Oriental, and Native Australian. As noted above, a blend shape is a deformed version of a mesh and in the present embodiment is defined with respect to the vertex points that describe the default male or female mesh. For example, the vertices of the blend shape could be defined as the positional offset from the vertices that describe the default, un-deformed mesh. To generate ethnic characteristics, the blend shape associated with the brow of a Native Australian, for example, will be different to that of a Caucasian.
- According to the present embodiment, a user interface enables the user to select the percentage of each ethnicity they wish to include in their avatar, thereby providing a weighted average of the different pre-determined blend shapes for each ethnicity. Accordingly, a user can interact with the user interface to adjust the blend weight of each blend shape associated with each ethnic type so as to quickly and straightforwardly modify the avatar's basic appearance. For example, an avatar could be 50% Caucasian and 50% African, or alternatively 40% Native American, 30% Oriental and 30% Native Australian. Optionally, predetermined skin tones may be mixed in the same proportions, but preferably these are also independently adjustable.
- Therefore, by modifying the blend weight associated with each blend shape that describes each ethnic type, a wide range of familiar face structures can be quickly imposed upon the default mesh of the avatar, providing an initial source of distinctiveness for the user.
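- Converting the user's selected percentages into the blend weights applied to the per-ethnicity blend shapes can be as simple as the normalisation below. The function name and the decision to normalise to a sum of 1.0 are assumptions for illustration:

```python
def ethnicity_weights(percentages):
    """Turn user-selected ethnicity percentages (e.g. 40/30/30) into
    normalised blend weights summing to 1.0, one per ethnicity blend
    shape set."""
    total = sum(percentages.values())
    if total == 0:
        raise ValueError("at least one ethnicity percentage must be non-zero")
    return {eth: p / total for eth, p in percentages.items()}
```

Each resulting weight would then scale that ethnicity's blend shape offsets in the per-vertex weighted sum before any skeletal adjustments are applied.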
- It will be understood that different blend shape sets may be used for male and female avatars as appropriate, and that the available ethnicities are not limited to the above examples, and indeed can extend to fictional races and species.
- Weighted blend shapes may be used to modify the appearance of the avatar as to body type or amount of body fat. For example, blend shapes for different neck thicknesses could be used or blend shapes relating to fatter or thinner body types could be used to modify the basic body mesh describing an avatar.
- In addition to adjusting ethnicity and body type using weighted blend shapes (where the skeleton is unchanged by altering the blend weights), various bones and groups of bones can be adjusted directly via respective user interfaces so as to modify or further modify the appearance of the avatar. Thus, for example, bones used to control the jaw-line can be adjusted by selecting from a menu of adjustable features the option ‘lower jaw-line’ and then adjusting the bone parameters using the
controller 751. As the controller has at least two sets of directional controls, typically two sets of parameters can be adjusted together. Referring now to FIG. 12 , these adjustments may comprise, for example, rear jaw width (left-hand horizontal control) and the angle between the chin and the rear of the jaw (left-hand vertical control), and chin prominence (right-hand horizontal control) and degree of double-chin (right-hand vertical control). These adjustments change the position, size and orientation of bones associated with the jaw accordingly, and can also affect any bones coupled to these jaw bones in the skeletal hierarchy; thus for example the cheek, upper jaw, nose, ears and neck may each be affected by changes to the jaw line. - Notably, the position, scale and orientation of each individual bone is not necessarily accessible via the user interface; the parameters input by the user are translated into parameters of the individual bones by the entertainment device. This makes the adjustment of the facial features simple for the user.
- Other bone or bone groups can be adjusted in a similar manner to modify facial features, including cheek bones, brow ridge, eye socket size, lateral eye position, vertical eye position, nose position, cranial shape, upper face shape and lower face shape. As noted previously, bones do not correspond to literal bones and so may also be used for such features as lip size and shape, nose profile, vertical ear position and ear shape, as well as for double chins as in the preceding example.
- In an embodiment of the present invention, an additional naturalistic effect is achieved by allowing modifications to one or more bones or bone groups to be asymmetric. For example, ears are rarely exactly matched, both in terms of size and vertical position on the head.
- Thus, for example, bones used to control the ears can be adjusted by selecting from a menu of adjustable features the option ‘ear disparity’ or the like and then adjusting the bone parameters using the
controller 751. Referring now to FIG. 13 , for example the relative imbalance in vertical position of the ears can be controlled by the left-hand horizontal control of the controller such that a move to the right raises the right ear and lowers the left, whilst a move to the left raises the left ear and lowers the right. Meanwhile, the relative imbalance in ear size can be controlled for example by the left-hand vertical control of the controller such that a move upward increases the size of the right ear and decreases the size of the left, and a move downward increases the size of the left ear and decreases the size of the right. - Other bones or bone groups can be adjusted to create asymmetry in a similar manner, such as eye socket size, lateral and vertical eye position, brow tilt, vertical and lateral cheek bone position, nose position and profile, lateral jaw offset and jaw tilt. Several of these features can also be further grouped to provide quick asymmetry adjustments, typically based upon size and angle disparity with respect to the centre line of the face. Referring to
FIGS. 14A and B, these may include for example upper- and/or lower-face symmetry, head (cranial) symmetry and overall face symmetry. In each case, the user input is very simple (e.g. vertical axis adjustments affect size aspects and horizontal axis adjustments affect balance aspects of the asymmetry), and is coupled to parameters of a relevant subset of bones in the skeletal model by a series of weights or transforms, so that for example eye, ear and nose changes in a single ‘face symmetry’ adjustment are not necessarily to the same degree, but are proportionate with respect to each other so as to create a plausible human face following the subsequent deformation of the face mesh. - Again the specific position, scale and orientation of each individual bone is not necessarily accessible via the user interface; rather the user controls the degree of asymmetry with respect to certain parameters of certain bones. Again, this makes the adjustment of the facial features simple for the user when in practice there may be nearly 100 bones within the avatar's face.
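- The coupling of one high-level control to many bone parameters via per-bone weights can be sketched as below. All names and weight values are invented for illustration; the point is that a single slider value is fanned out proportionately across a subset of bones:

```python
def apply_symmetry_slider(value, couplings, bones):
    """Map one high-level control (e.g. a 'face symmetry' slider) onto many
    bone parameters. 'couplings' maps (bone, parameter) -> proportionality
    weight, so one input nudges eyes, ears and nose by different but
    mutually proportionate amounts."""
    for (bone, param), weight in couplings.items():
        bones[bone][param] += weight * value
    return bones
```

Opposite-signed weights on paired features (left ear up, right ear down) are what produce the naturalistic asymmetry described above from a single control.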
- It will be appreciated by a person skilled in the art that the input convention outlined above is one option, but that other suitable input conventions and methods are possible; for example, using the EyeToy
® video camera 756 to adjust one or more bone or bone groups by gesturing with respect to an on-screen depiction of the avatar. - In addition to adjustment of the bones in the avatar (and in particular in the head and face of the avatar), the user interface enables the addition of further meshes that may interact with bones of the avatar or be associated with further bones. These meshes typically provide accessories for the head and face, such as spectacles, hair, hats, and headphones, and for exotic races or amusing modifications, e.g. features such as horns, crests and trunks.
- Thus, for example, a spectacles mesh may be associated with ear, cheek and nose bones, so that the position of the spectacles automatically reacts to the structure of the face. Similarly, hair, hats and headphones may be associated with ear and cranial bones so that they stretch to fit the size of head and line up with the ear position. Other facial features such as beard and moustaches would be associated with the nose and jaw bones in a manner similar to the facial mesh over which they are placed or which they replace.
- By using a skeletal model within the avatar (or at least within the avatar's head and face) in this fashion, and enabling bones or bone groups to be parametrically adjusted, and furthermore enabling the asymmetric modification of such bones or bone groups, a highly distinctive and naturalistic avatar can be generated. Moreover, by grouping adjustments in decreasing scales of detail, from overall ethnicity (which may be determined via weighted blend shapes), to large scale adjustments (e.g. upper/lower face), to a selection of intuitive adjustments to certain parameters of certain bone groups and finally bones, distinctive and individual faces can be quickly made for the avatar, whilst keeping its facial features harmonious by virtue of the linkages between bones in the skeletal model and the vertex weighting of the bones to the mesh.
- By way of example,
FIG. 15A shows a default female avatar adjusted by weighted blend shapes to correspond to 50% Caucasian, 50% Native American.FIGS. 15B and 15C then show this female avatar after various bone parameters have been adjusted in the manner described herein, so as to produce two highly distinctive faces by subsequent manipulation of the skeletal model. - The avatar may be further customised by the application of texture layers and texture effects. Different texture layers adding wrinkles, freckles, moles or blemishes and scars may be added with varying degrees of transparency via the user interface. Likewise, skin texture may be introduced and varied by controlling the degree of bump mapping (or other lighting effect) used in relation to one or more of these textures, or other dedicated bump-mapping textures. Thus scars, pock-marking and stubble may be added to varying degrees.
- Optionally one or more texture layers may be alpha-blended, bump-mapped or otherwise included on the avatar according to an ‘age’ slider, thereby providing a quick way to vary the age of the character. Optionally certain bone parameters may also be coupled to this slider, for example to cause sunken cheekbones, deeper eye sockets, a thinner neck and larger ears as age progresses. Optionally different age sliders or an age profile switch could be provided to give different age effects; for example another age profile could result in the character becoming fatter-faced, red-cheeked and jowly with age, rather than gaunt.
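- The age-slider coupling described above, driving both texture layer opacities and a few bone parameters from a single control, can be sketched as follows. The profile structure and every numeric gain are invented for illustration; only the idea of one slider driving both aspects comes from the text:

```python
def apply_age(age, profile):
    """One 'age' slider (0.0 young .. 1.0 old) drives both texture layer
    opacities and coupled bone parameter deltas, per the chosen age
    profile (e.g. 'gaunt' vs 'jowly')."""
    layers = {name: gain * age for name, gain in profile["texture_gains"].items()}
    bone_deltas = {name: gain * age for name, gain in profile["bone_gains"].items()}
    return layers, bone_deltas

# Hypothetical 'gaunt' profile: sunken cheeks, larger ears, thinner neck.
GAUNT_PROFILE = {
    "texture_gains": {"wrinkles": 0.9, "blemishes": 0.4},
    "bone_gains": {"cheek_depth": -0.3, "ear_scale": 0.15, "neck_width": -0.2},
}
```

A 'jowly' profile would simply carry different gains (e.g. positive cheek fullness), giving the alternative ageing effect mentioned above with no change to the mechanism.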
- Typically, the skeletal model and the mesh deformations responsive to the skeletal model and to the weighted blend shapes are computed by one or more SPEs 110A-H. The resulting modified mesh is used in combination with one or more textures by the
RSX unit 200 to render the avatar. The modified mesh is further manipulated to move the avatar, animate the face, perform lip-syncing and display emotions in substantially the same manner as would be done with a conventional or default avatar mesh. - Referring now back again to
FIG. 5 , when the user logs into the Home environment, the PS3 entertainment device transmits configuration details of the user's avatar including the modified bone parameters and blend shape weights to the Home environment server 2010, or optionally in a peer-to-peer fashion to the other PS3s. - Likewise, it also receives from either the Home environment server or the peer PS3s the respective configuration details of the other avatars within that instance of the Home environment, also including their respective modified bone parameters and blend shape weights.
- The
PS3 entertainment device 10 then computes the mesh deformation for each avatar responsive to the relevant blend shape weights and modified bone parameters, before rendering it according to the other configuration details received for that avatar. - In an alternative embodiment, the data describing the deformed mesh is transmitted rather than the data describing the blend shape weights and bone parameters, thereby avoiding the need for the recipient PS3 to compute the effect of the blend shape weights and bone parameters on the mesh for each of the other avatars. In this case, the mesh may be transmitted in a conventional manner or as a set of deviations from a default mesh (or a default male or female mesh as applicable).
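- The "deviations from a default mesh" transmission option can be sketched as a simple per-vertex delta encoding. The wire format, names and the choice to send only non-zero offsets are illustrative assumptions:

```python
def mesh_to_deltas(mesh, default_mesh):
    """Encode a customised mesh as per-vertex deviations from the default
    mesh, keeping only vertices that actually moved."""
    return {i: tuple(m - d for m, d in zip(v, default_mesh[i]))
            for i, v in enumerate(mesh)
            if v != default_mesh[i]}

def deltas_to_mesh(deltas, default_mesh):
    """Reconstruct the customised mesh from the default mesh plus deltas."""
    return [tuple(c + deltas.get(i, (0.0, 0.0, 0.0))[k] for k, c in enumerate(v))
            for i, v in enumerate(default_mesh)]
```

Since every PS3 already holds the default male and female meshes, only the (typically sparse) deltas need cross the network, trading bandwidth against the per-avatar mesh computation the alternative embodiment avoids.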
- In an embodiment of the present invention the Home environment server is therefore operable to receive data descriptive of respective blend shape weights and modified avatar skeletons from each remote entertainment device in an instance of the Home environment, and to transmit this data to the respective remaining PS3s. Alternatively it is operable to transmit mesh data for each avatar, either in a conventional mesh format or as deviations from a default mesh.
- Thus an on-line system comprising a Home environment server and two or more PS3s allows the users of each PS3 to customise their own avatar by virtue of a skeletal modifier and blend shape weight adjuster coupled to a user interface, and to then distribute these modified avatars within the Home environment via the Home environment server before each PS3 renders the Home environment populated with said modified avatars.
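The relay role played by the Home environment server in the system above can be sketched as follows; the class and method names are illustrative, not from the patent:

```python
# Minimal sketch of the Home environment server's relay role: each client
# uploads its avatar configuration (modified bone parameters and blend
# shape weights), and the server supplies every client with the
# configurations of the other avatars in that instance.

class HomeServer:
    def __init__(self):
        self.configs = {}  # client_id -> avatar configuration dict

    def submit(self, client_id, config):
        """A client logs in and uploads its avatar configuration."""
        self.configs[client_id] = config

    def peers_for(self, client_id):
        """Configurations of every *other* avatar in this instance,
        for the requesting client to deform and render locally."""
        return {cid: cfg for cid, cfg in self.configs.items()
                if cid != client_id}
```

In the peer-to-peer variant also described above, each PS3 would exchange the same configuration data directly rather than via this server.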
- It will be apparent to a person skilled in the art that embodiments of the present invention are not limited to the Home environment, but are also applicable to any multiplayer on-line environment where a plurality of users may encounter each other through avatars, such as, for example, an on-line game.
- It will be appreciated by a person skilled in the art that the ethnic changes to the avatar's face generated by weighted blend shapes may also be achieved by other deformers such as a lattice deformer or sculpt deformer, or by suitably placed bones with different parameter values, or by a combination of the above.
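The weighted blend shape deformation referred to throughout can be sketched as follows: each blend shape stores per-vertex offsets from the neutral mesh, and the final vertex position is the neutral position plus the weighted sum of those offsets. The data layout and shape names are assumptions for illustration:

```python
# Sketch of weighted blend shape mesh deformation. Each blend shape is a
# list of per-vertex (dx, dy, dz) offsets from the neutral mesh; the
# rendered vertex is neutral + sum(weight * offset) over all shapes.

def blend_vertices(neutral, blend_shapes, weights):
    """neutral: list of (x, y, z) vertices;
    blend_shapes: shape name -> list of per-vertex offsets;
    weights: shape name -> weight, typically in [0.0, 1.0]."""
    result = []
    for i, (x, y, z) in enumerate(neutral):
        for name, w in weights.items():
            ox, oy, oz = blend_shapes[name][i]
            x += w * ox
            y += w * oy
            z += w * oz
        result.append((x, y, z))
    return result
```

In practice this blended mesh would then also be skinned to the modified skeleton before the RSX renders it, as described earlier.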
- Referring now to
FIG. 16, in an embodiment of the present invention a method of user identification in an on-line virtual environment comprises, in a first step s10, selecting a user avatar for use in the on-line virtual environment (for example selecting an initial gender or character class). Then in a second step s20, one or more physical properties of one or more skeletal components of the user avatar are modified via a user interface. In a third step s30, vertices in a three dimensional mesh representing some or all of the user avatar are adjusted in response to the position of one or more skeletal components of the user avatar, such that the placement of such vertices is responsive to one or more bones of the modified skeletal model. Then in a fourth step s40, the user avatar is rendered responsive to the modified user avatar skeleton. - It will be apparent to a person skilled in the art that variations in the above method corresponding to operation of the various embodiments of the apparatus disclosed herein are considered within the scope of the present invention, including but not limited to:
- transmitting data descriptive of the modified user avatar skeleton to one or more remote entertainment devices;
- receiving data descriptive of respective modified avatar skeletons corresponding to one or more respective remote entertainment devices;
- rendering a plurality of respective avatars corresponding to one or more respective remote entertainment devices, the rendering of each avatar being responsive to its respective modified avatar skeleton;
- modifying one or more physical properties of one or more skeletal components of the user avatar asymmetrically;
- modifying vertex positions of the three dimensional mesh in accordance with one or more blend shapes; and
- selecting additional skeletal components for incorporation into the user avatar.
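Steps s10 to s40 above can be sketched as a single pipeline. The skinning here is deliberately a toy model (each vertex bound to one bone with a fixed offset), and all names and data layouts are illustrative assumptions:

```python
# Compact sketch of the method of FIG. 16:
#   s10 - select a user avatar (e.g. by gender or character class)
#   s20 - modify physical properties of skeletal components via a UI
#   s30 - adjust mesh vertices in response to the modified skeleton
#   s40 - hand the resulting mesh to the renderer

def customise_avatar(default_avatars, choice, bone_edits):
    avatar = default_avatars[choice]                 # s10: select
    bones = dict(avatar["bones"])
    bones.update(bone_edits)                         # s20: modify via UI
    # s30: place each vertex at its bound bone's position plus an offset
    mesh = [(bones[b][0] + ox, bones[b][1] + oy, bones[b][2] + oz)
            for b, (ox, oy, oz) in avatar["bindings"]]
    return mesh                                      # s40: pass to renderer
```

A real implementation would use weighted skinning across multiple bones (and the blend shapes described earlier) rather than this one-bone binding.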
- It will be appreciated by a person skilled in the art that in embodiments of the present invention, elements of the method of user identification in an on-line virtual environment and corresponding skeletal modelling, skeletal modification, blend shape weighting, user input and rendering means of the apparatus may be implemented in any suitable manner.
- Thus the adaptation of existing parts of a conventional equivalent entertainment device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable for use in adapting the conventional equivalent device.
Claims (29)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0713186.5 | 2007-07-06 | ||
GB0713186A GB2450757A (en) | 2007-07-06 | 2007-07-06 | Avatar customisation, transmission and reception |
PCT/GB2008/002321 WO2009007701A1 (en) | 2007-07-06 | 2008-07-04 | Apparatus and method of avatar customisation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100203968A1 true US20100203968A1 (en) | 2010-08-12 |
Family
ID=38440552
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/667,775 Abandoned US20100203968A1 (en) | 2007-07-06 | 2008-07-04 | Apparatus And Method Of Avatar Customisation |
Country Status (5)
Country | Link |
---|---|
US (1) | US20100203968A1 (en) |
EP (1) | EP2175950A1 (en) |
JP (1) | JP2010532890A (en) |
GB (1) | GB2450757A (en) |
WO (1) | WO2009007701A1 (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6163322A (en) * | 1998-01-19 | 2000-12-19 | Taarna Studios Inc. | Method and apparatus for providing real-time animation utilizing a database of postures |
US6545682B1 (en) * | 2000-05-24 | 2003-04-08 | There, Inc. | Method and apparatus for creating and customizing avatars using genetic paradigm |
US20050137015A1 (en) * | 2003-08-19 | 2005-06-23 | Lawrence Rogers | Systems and methods for a role-playing game having a customizable avatar and differentiated instant messaging environment |
US20050250579A1 (en) * | 2004-05-07 | 2005-11-10 | Valve Corporation | Generating eyes for a character in a virtual environment |
US7181690B1 (en) * | 1995-11-13 | 2007-02-20 | Worlds.com Inc. | System and method for enabling users to interact in a virtual space |
US7184047B1 (en) * | 1996-12-24 | 2007-02-27 | Stephen James Crampton | Method and apparatus for the generation of computer graphic representations of individuals |
US20070268293A1 (en) * | 2006-05-19 | 2007-11-22 | Erick Miller | Musculo-skeletal shape skinning |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0934874A (en) * | 1995-07-21 | 1997-02-07 | Nippon Telegr & Teleph Corp <Ntt> | Distributing coordinate space configuration method and system |
US5909218A (en) * | 1996-04-25 | 1999-06-01 | Matsushita Electric Industrial Co., Ltd. | Transmitter-receiver of three-dimensional skeleton structure motions and method thereof |
US6147692A (en) * | 1997-06-25 | 2000-11-14 | Haptek, Inc. | Method and apparatus for controlling transformation of two and three-dimensional images |
JP3338382B2 (en) * | 1997-07-31 | 2002-10-28 | 松下電器産業株式会社 | Apparatus and method for transmitting and receiving a data stream representing a three-dimensional virtual space |
JP2001216531A (en) * | 2000-02-02 | 2001-08-10 | Nippon Telegr & Teleph Corp <Ntt> | Method for displaying participant in three-dimensional virtual space and three-dimensional virtual space display device |
KR100327541B1 (en) * | 2000-08-10 | 2002-03-08 | 김재성, 이두원 | 3D facial modeling system and modeling method |
- 2007-07-06 GB GB0713186A patent/GB2450757A/en not_active Withdrawn
- 2008-07-04 WO PCT/GB2008/002321 patent/WO2009007701A1/en active Application Filing
- 2008-07-04 EP EP08762523A patent/EP2175950A1/en not_active Ceased
- 2008-07-04 JP JP2010514129A patent/JP2010532890A/en active Pending
- 2008-07-04 US US12/667,775 patent/US20100203968A1/en not_active Abandoned
Cited By (358)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9697677B2 (en) * | 2007-04-30 | 2017-07-04 | Patent Investment & Licensing Company | Gaming device with personality |
US10037648B2 (en) | 2007-04-30 | 2018-07-31 | Patent Investment & Licensing Company | Gaming device with personality |
US20160335840A1 (en) * | 2007-04-30 | 2016-11-17 | Patent Investment & Licensing Company | Gaming device with personality |
US20120246585A9 (en) * | 2008-07-14 | 2012-09-27 | Microsoft Corporation | System for editing an avatar |
US20100009747A1 (en) * | 2008-07-14 | 2010-01-14 | Microsoft Corporation | Programming APIs for an Extensible Avatar System |
US8446414B2 (en) | 2008-07-14 | 2013-05-21 | Microsoft Corporation | Programming APIs for an extensible avatar system |
US20100023885A1 (en) * | 2008-07-14 | 2010-01-28 | Microsoft Corporation | System for editing an avatar |
US8384719B2 (en) | 2008-08-01 | 2013-02-26 | Microsoft Corporation | Avatar items and animations |
US20100026698A1 (en) * | 2008-08-01 | 2010-02-04 | Microsoft Corporation | Avatar items and animations |
US20100035692A1 (en) * | 2008-08-08 | 2010-02-11 | Microsoft Corporation | Avatar closet/game awarded avatar |
US8639589B2 (en) | 2008-08-26 | 2014-01-28 | International Business Machines Corporation | Externalizing virtual object tags relating to virtual objects |
US8639588B2 (en) | 2008-08-26 | 2014-01-28 | International Business Machines Corporation | Externalizing virtual object tags relating to virtual objects |
US20100058208A1 (en) * | 2008-08-26 | 2010-03-04 | Finn Peter G | System and method for tagging objects for heterogeneous searches |
US8473356B2 (en) * | 2008-08-26 | 2013-06-25 | International Business Machines Corporation | System and method for tagging objects for heterogeneous searches |
US8941642B2 (en) | 2008-10-17 | 2015-01-27 | Kabushiki Kaisha Square Enix | System for the creation and editing of three dimensional models |
US11307763B2 (en) | 2008-11-19 | 2022-04-19 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters |
US20100144448A1 (en) * | 2008-12-05 | 2010-06-10 | Namco Bandai Games Inc. | Information storage medium, game device, and game system |
US11425068B2 (en) | 2009-02-03 | 2022-08-23 | Snap Inc. | Interactive avatar in messaging environment |
US8834267B2 (en) * | 2009-02-05 | 2014-09-16 | Square Enix Co., Ltd. | Avatar useable in multiple games that changes appearance according to the game being played |
US20100197396A1 (en) * | 2009-02-05 | 2010-08-05 | Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) | Game apparatus, game character displaying method, and recording medium |
US20100231582A1 (en) * | 2009-03-10 | 2010-09-16 | Yogurt Bilgi Teknolojileri A.S. | Method and system for distributing animation sequences of 3d objects |
US20100293473A1 (en) * | 2009-05-15 | 2010-11-18 | Ganz | Unlocking emoticons using feature codes |
US8788943B2 (en) * | 2009-05-15 | 2014-07-22 | Ganz | Unlocking emoticons using feature codes |
US20100306685A1 (en) * | 2009-05-29 | 2010-12-02 | Microsoft Corporation | User movement feedback via on-screen avatars |
US11869165B2 (en) | 2010-04-07 | 2024-01-09 | Apple Inc. | Avatar editing environment |
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US20110298827A1 (en) * | 2010-06-02 | 2011-12-08 | Microsoft Corporation | Limiting avatar gesture display |
US9245177B2 (en) * | 2010-06-02 | 2016-01-26 | Microsoft Technology Licensing, Llc | Limiting avatar gesture display |
US20110304629A1 (en) * | 2010-06-09 | 2011-12-15 | Microsoft Corporation | Real-time animation of facial expressions |
US20120011453A1 (en) * | 2010-07-08 | 2012-01-12 | Namco Bandai Games Inc. | Method, storage medium, and user terminal |
US20130031475A1 (en) * | 2010-10-18 | 2013-01-31 | Scene 53 Inc. | Social network based virtual assembly places |
US9094476B1 (en) | 2011-06-16 | 2015-07-28 | Google Inc. | Ambient communication session |
US9230241B1 (en) | 2011-06-16 | 2016-01-05 | Google Inc. | Initiating a communication session based on an associated content item |
US8997007B1 (en) | 2011-06-16 | 2015-03-31 | Google Inc. | Indicating availability for participation in communication session |
US10554696B2 (en) | 2011-06-16 | 2020-02-04 | Google Llc | Initiating a communication session based on an associated content item |
US9866597B2 (en) | 2011-06-16 | 2018-01-09 | Google Llc | Ambient communication session |
US10250648B2 (en) | 2011-06-16 | 2019-04-02 | Google Llc | Ambient communication session |
US8832284B1 (en) | 2011-06-16 | 2014-09-09 | Google Inc. | Virtual socializing |
US9800622B2 (en) | 2011-06-16 | 2017-10-24 | Google Inc. | Virtual socializing |
US8789094B1 (en) | 2011-06-16 | 2014-07-22 | Google Inc. | Optimizing virtual collaboration sessions for mobile computing devices |
US10748325B2 (en) * | 2011-11-17 | 2020-08-18 | Adobe Inc. | System and method for automatic rigging of three dimensional characters for facial animation |
US11170558B2 (en) | 2011-11-17 | 2021-11-09 | Adobe Inc. | Automatic rigging of three dimensional characters for animation |
US20130127853A1 (en) * | 2011-11-17 | 2013-05-23 | Mixamo, Inc. | System and method for automatic rigging of three dimensional characters for facial animation |
US20140364239A1 (en) * | 2011-12-20 | 2014-12-11 | Icelero Inc | Method and system for creating a virtual social and gaming experience |
US9747495B2 (en) | 2012-03-06 | 2017-08-29 | Adobe Systems Incorporated | Systems and methods for creating and distributing modifiable animated video messages |
US9626788B2 (en) | 2012-03-06 | 2017-04-18 | Adobe Systems Incorporated | Systems and methods for creating animations using human faces |
US10147146B2 (en) * | 2012-03-14 | 2018-12-04 | Disney Enterprises, Inc. | Tailoring social elements of virtual environments |
US20130296046A1 (en) * | 2012-03-30 | 2013-11-07 | Marty Mianji | System and method for collaborative shopping through social gaming |
US11595617B2 (en) | 2012-04-09 | 2023-02-28 | Intel Corporation | Communication using interactive avatars |
US11303850B2 (en) | 2012-04-09 | 2022-04-12 | Intel Corporation | Communication using interactive avatars |
US11607616B2 (en) | 2012-05-08 | 2023-03-21 | Snap Inc. | System and method for generating and displaying avatars |
US11229849B2 (en) | 2012-05-08 | 2022-01-25 | Snap Inc. | System and method for generating and displaying avatars |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US20140078144A1 (en) * | 2012-09-14 | 2014-03-20 | Squee, Inc. | Systems and methods for avatar creation |
CN103218844A (en) * | 2013-04-03 | 2013-07-24 | 腾讯科技(深圳)有限公司 | Collocation method, implementation method, client side, server and system of virtual image |
US9524582B2 (en) | 2014-01-28 | 2016-12-20 | Siemens Healthcare Gmbh | Method and system for constructing personalized avatars using a parameterized deformable mesh |
US10438631B2 (en) * | 2014-02-05 | 2019-10-08 | Snap Inc. | Method for real-time video processing involving retouching of an object in the video |
US10991395B1 (en) | 2014-02-05 | 2021-04-27 | Snap Inc. | Method for real time video processing involving changing a color of an object on a human face in a video |
US11651797B2 (en) | 2014-02-05 | 2023-05-16 | Snap Inc. | Real time video processing for changing proportions of an object in the video |
US10950271B1 (en) | 2014-02-05 | 2021-03-16 | Snap Inc. | Method for triggering events in a video |
US11514947B1 (en) | 2014-02-05 | 2022-11-29 | Snap Inc. | Method for real-time video processing involving changing features of an object in the video |
US11468913B1 (en) * | 2014-02-05 | 2022-10-11 | Snap Inc. | Method for real-time video processing involving retouching of an object in the video |
US11450349B2 (en) | 2014-02-05 | 2022-09-20 | Snap Inc. | Real time video processing for changing proportions of an object in the video |
US11443772B2 (en) | 2014-02-05 | 2022-09-13 | Snap Inc. | Method for triggering events in a video |
US10586570B2 (en) | 2014-02-05 | 2020-03-10 | Snap Inc. | Real time video processing for changing proportions of an object in the video |
US10566026B1 (en) | 2014-02-05 | 2020-02-18 | Snap Inc. | Method for real-time video processing involving changing features of an object in the video |
US9824478B2 (en) | 2014-06-27 | 2017-11-21 | Microsoft Technology Licensing, Llc | Dynamic remapping of components of a virtual skeleton |
US9536138B2 (en) | 2014-06-27 | 2017-01-03 | Microsoft Technology Licensing, Llc | Dynamic remapping of components of a virtual skeleton |
US10564820B1 (en) | 2014-08-08 | 2020-02-18 | Amazon Technologies, Inc. | Active content in digital media within a media universe |
US10217185B1 (en) * | 2014-08-08 | 2019-02-26 | Amazon Technologies, Inc. | Customizing client experiences within a media universe |
US10506003B1 (en) | 2014-08-08 | 2019-12-10 | Amazon Technologies, Inc. | Repository service for managing digital assets |
US10719192B1 (en) | 2014-08-08 | 2020-07-21 | Amazon Technologies, Inc. | Client-generated content within a media universe |
US9811871B2 (en) * | 2014-10-21 | 2017-11-07 | Sony Interactive Entertainment Europe Limited | System and method of watermarking |
US20160110838A1 (en) * | 2014-10-21 | 2016-04-21 | Sony Computer Entertainment Europe Limited | System and method of watermarking |
US11295502B2 (en) | 2014-12-23 | 2022-04-05 | Intel Corporation | Augmented facial animation |
US11290682B1 (en) | 2015-03-18 | 2022-03-29 | Snap Inc. | Background modification in video conferencing |
US11321731B2 (en) | 2015-06-05 | 2022-05-03 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US11734708B2 (en) | 2015-06-05 | 2023-08-22 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US20210252397A1 (en) * | 2015-08-26 | 2021-08-19 | Warner Bros. Entertainment Inc. | Social and procedural effects for computer-generated environments |
US11048873B2 (en) | 2015-09-15 | 2021-06-29 | Apple Inc. | Emoji and canned responses |
US20170113131A1 (en) * | 2015-10-21 | 2017-04-27 | Activision Publishing, Inc. | Interactive videogame using game-related physical objects |
US10610777B2 (en) | 2015-10-21 | 2020-04-07 | Activision Publishing, Inc. | Interactive videogame using game-related physical objects |
US9868059B2 (en) * | 2015-10-21 | 2018-01-16 | Activision Publishing, Inc. | Interactive videogame using game-related physical objects |
CN105426039A (en) * | 2015-10-30 | 2016-03-23 | 广州华多网络科技有限公司 | Method and apparatus for pushing approach image |
US20170173473A1 (en) * | 2015-12-16 | 2017-06-22 | Crytek Gmbh | Apparatus and method for automatically generating scenery |
US11887231B2 (en) * | 2015-12-18 | 2024-01-30 | Tahoe Research, Ltd. | Avatar animation system |
US20200051306A1 (en) * | 2015-12-18 | 2020-02-13 | Intel Corporation | Avatar animation system |
US11048916B2 (en) | 2016-03-31 | 2021-06-29 | Snap Inc. | Automated avatar generation |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11662900B2 (en) | 2016-05-31 | 2023-05-30 | Snap Inc. | Application control using a gesture based trigger |
US11580608B2 (en) | 2016-06-12 | 2023-02-14 | Apple Inc. | Managing contact information for communication applications |
US11922518B2 (en) | 2016-06-12 | 2024-03-05 | Apple Inc. | Managing contact information for communication applications |
US9786084B1 (en) | 2016-06-23 | 2017-10-10 | LoomAi, Inc. | Systems and methods for generating computer ready animation models of a human head from captured data images |
US10169905B2 (en) | 2016-06-23 | 2019-01-01 | LoomAi, Inc. | Systems and methods for animating models from audio data |
US10062198B2 (en) | 2016-06-23 | 2018-08-28 | LoomAi, Inc. | Systems and methods for generating computer ready animation models of a human head from captured data images |
US10559111B2 (en) | 2016-06-23 | 2020-02-11 | LoomAi, Inc. | Systems and methods for generating computer ready animation models of a human head from captured data images |
US10984569B2 (en) | 2016-06-30 | 2021-04-20 | Snap Inc. | Avatar based ideogram generation |
US11418470B2 (en) | 2016-07-19 | 2022-08-16 | Snap Inc. | Displaying customized electronic messaging graphics |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US11438288B2 (en) | 2016-07-19 | 2022-09-06 | Snap Inc. | Displaying customized electronic messaging graphics |
US10855632B2 (en) | 2016-07-19 | 2020-12-01 | Snap Inc. | Displaying customized electronic messaging graphics |
US10848446B1 (en) | 2016-07-19 | 2020-11-24 | Snap Inc. | Displaying customized electronic messaging graphics |
US10444963B2 (en) | 2016-09-23 | 2019-10-15 | Apple Inc. | Image data for enhanced user interactions |
US11438341B1 (en) | 2016-10-10 | 2022-09-06 | Snap Inc. | Social media post subscribe requests for buffer user accounts |
US11100311B2 (en) | 2016-10-19 | 2021-08-24 | Snap Inc. | Neural networks for facial modeling |
US10938758B2 (en) | 2016-10-24 | 2021-03-02 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US10880246B2 (en) | 2016-10-24 | 2020-12-29 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US11580700B2 (en) | 2016-10-24 | 2023-02-14 | Snap Inc. | Augmented reality object manipulation |
US11218433B2 (en) | 2016-10-24 | 2022-01-04 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US20180122147A1 (en) * | 2016-10-31 | 2018-05-03 | Dg Holdings, Inc. | Transferrable between styles virtual identity systems and methods |
US10769860B2 (en) * | 2016-10-31 | 2020-09-08 | Dg Holdings, Inc. | Transferrable between styles virtual identity systems and methods |
US11631229B2 (en) | 2016-11-01 | 2023-04-18 | Dg Holdings, Inc. | Comparative virtual asset adjustment systems and methods |
US11354877B2 (en) | 2016-11-01 | 2022-06-07 | Dg Holdings, Inc. | Comparative virtual asset adjustment systems and methods |
US20180122148A1 (en) * | 2016-11-01 | 2018-05-03 | Dg Holdings, Inc. | Comparative virtual asset adjustment systems and methods |
US10930086B2 (en) * | 2016-11-01 | 2021-02-23 | Dg Holdings, Inc. | Comparative virtual asset adjustment systems and methods |
US10894211B2 (en) | 2016-12-06 | 2021-01-19 | Colopl, Inc. | Information processing method, apparatus, and system for executing the information processing method |
US10238968B2 (en) | 2016-12-06 | 2019-03-26 | Colopl, Inc. | Information processing method, apparatus, and system for executing the information processing method |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11704878B2 (en) | 2017-01-09 | 2023-07-18 | Snap Inc. | Surface aware lens |
US11544883B1 (en) | 2017-01-16 | 2023-01-03 | Snap Inc. | Coded vision system |
US10951562B2 (en) | 2017-01-18 | 2021-03-16 | Snap Inc. | Customized contextual media content item generation |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US11593980B2 (en) | 2017-04-20 | 2023-02-28 | Snap Inc. | Customized user interface for electronic communications |
US11069103B1 (en) | 2017-04-20 | 2021-07-20 | Snap Inc. | Customized user interface for electronic communications |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11532112B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Emoji recording and sending |
US10521091B2 (en) | 2017-05-16 | 2019-12-31 | Apple Inc. | Emoji recording and sending |
US10997768B2 (en) | 2017-05-16 | 2021-05-04 | Apple Inc. | Emoji recording and sending |
US10521948B2 (en) | 2017-05-16 | 2019-12-31 | Apple Inc. | Emoji recording and sending |
US10846905B2 (en) | 2017-05-16 | 2020-11-24 | Apple Inc. | Emoji recording and sending |
US10845968B2 (en) | 2017-05-16 | 2020-11-24 | Apple Inc. | Emoji recording and sending |
US10379719B2 (en) | 2017-05-16 | 2019-08-13 | Apple Inc. | Emoji recording and sending |
US11830209B2 (en) | 2017-05-26 | 2023-11-28 | Snap Inc. | Neural network-based image stream modification |
US11882162B2 (en) | 2017-07-28 | 2024-01-23 | Snap Inc. | Software application manager for messaging applications |
US11122094B2 (en) | 2017-07-28 | 2021-09-14 | Snap Inc. | Software application manager for messaging applications |
US11659014B2 (en) | 2017-07-28 | 2023-05-23 | Snap Inc. | Software application manager for messaging applications |
US11610354B2 (en) | 2017-10-26 | 2023-03-21 | Snap Inc. | Joint audio-video facial animation system |
US11120597B2 (en) | 2017-10-26 | 2021-09-14 | Snap Inc. | Joint audio-video facial animation system |
US11706267B2 (en) | 2017-10-30 | 2023-07-18 | Snap Inc. | Animated chat presence |
US11930055B2 (en) | 2017-10-30 | 2024-03-12 | Snap Inc. | Animated chat presence |
US11354843B2 (en) | 2017-10-30 | 2022-06-07 | Snap Inc. | Animated chat presence |
US11030789B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Animated chat presence |
US20230316622A1 (en) * | 2017-11-17 | 2023-10-05 | Sony Interactive Entertainment LLC | Systems, methods, and devices for creating a spline-based video animation sequence |
US11460974B1 (en) | 2017-11-28 | 2022-10-04 | Snap Inc. | Content discovery refresh |
US10936157B2 (en) | 2017-11-29 | 2021-03-02 | Snap Inc. | Selectable item including a customized graphic for an electronic messaging application |
US11411895B2 (en) | 2017-11-29 | 2022-08-09 | Snap Inc. | Generating aggregated media content items for a group of users in an electronic messaging application |
US10949648B1 (en) | 2018-01-23 | 2021-03-16 | Snap Inc. | Region-based stabilized face tracking |
US11769259B2 (en) | 2018-01-23 | 2023-09-26 | Snap Inc. | Region-based stabilized face tracking |
US10467793B2 (en) * | 2018-02-08 | 2019-11-05 | King.Com Ltd. | Computer implemented method and device |
US11468618B2 (en) | 2018-02-28 | 2022-10-11 | Snap Inc. | Animated expressive icon |
US11880923B2 (en) | 2018-02-28 | 2024-01-23 | Snap Inc. | Animated expressive icon |
US11120601B2 (en) | 2018-02-28 | 2021-09-14 | Snap Inc. | Animated expressive icon |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US11688119B2 (en) | 2018-02-28 | 2023-06-27 | Snap Inc. | Animated expressive icon |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US11310176B2 (en) | 2018-04-13 | 2022-04-19 | Snap Inc. | Content suggestion system |
US11875439B2 (en) | 2018-04-18 | 2024-01-16 | Snap Inc. | Augmented expression system |
US11682182B2 (en) | 2018-05-07 | 2023-06-20 | Apple Inc. | Avatar creation user interface |
US10861248B2 (en) | 2018-05-07 | 2020-12-08 | Apple Inc. | Avatar creation user interface |
US10580221B2 (en) * | 2018-05-07 | 2020-03-03 | Apple Inc. | Avatar creation user interface |
US10523879B2 (en) | 2018-05-07 | 2019-12-31 | Apple Inc. | Creative camera |
US11178335B2 (en) | 2018-05-07 | 2021-11-16 | Apple Inc. | Creative camera |
US11380077B2 (en) | 2018-05-07 | 2022-07-05 | Apple Inc. | Avatar creation user interface |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
US10410434B1 (en) | 2018-05-07 | 2019-09-10 | Apple Inc. | Avatar creation user interface |
US11103161B2 (en) | 2018-05-07 | 2021-08-31 | Apple Inc. | Displaying user interfaces associated with physical activities |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10198845B1 (en) | 2018-05-29 | 2019-02-05 | LoomAi, Inc. | Methods and systems for animating facial expressions |
CN112673400A (en) * | 2018-07-04 | 2021-04-16 | 网络助手有限责任公司 | Avatar animation |
US11074675B2 (en) | 2018-07-31 | 2021-07-27 | Snap Inc. | Eye texture inpainting |
US11715268B2 (en) | 2018-08-30 | 2023-08-01 | Snap Inc. | Video clip object tracking |
US11030813B2 (en) | 2018-08-30 | 2021-06-08 | Snap Inc. | Video clip object tracking |
US10896534B1 (en) | 2018-09-19 | 2021-01-19 | Snap Inc. | Avatar style transformation using neural networks |
US11348301B2 (en) | 2018-09-19 | 2022-05-31 | Snap Inc. | Avatar style transformation using neural networks |
US11868590B2 (en) | 2018-09-25 | 2024-01-09 | Snap Inc. | Interface to display shared user groups |
US10895964B1 (en) | 2018-09-25 | 2021-01-19 | Snap Inc. | Interface to display shared user groups |
US11294545B2 (en) | 2018-09-25 | 2022-04-05 | Snap Inc. | Interface to display shared user groups |
US11824822B2 (en) | 2018-09-28 | 2023-11-21 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US10904181B2 (en) | 2018-09-28 | 2021-01-26 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11477149B2 (en) | 2018-09-28 | 2022-10-18 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11189070B2 (en) | 2018-09-28 | 2021-11-30 | Snap Inc. | System and method of generating targeted user lists using customizable avatar characteristics |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11610357B2 (en) | 2018-09-28 | 2023-03-21 | Snap Inc. | System and method of generating targeted user lists using customizable avatar characteristics |
US11245658B2 (en) | 2018-09-28 | 2022-02-08 | Snap Inc. | System and method of generating private notifications between users in a communication session |
US11171902B2 (en) | 2018-09-28 | 2021-11-09 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11354844B2 (en) | 2018-10-26 | 2022-06-07 | Soul Machines Limited | Digital character blending and generation system and method |
US11103795B1 (en) | 2018-10-31 | 2021-08-31 | Snap Inc. | Game drawer |
US10872451B2 (en) | 2018-10-31 | 2020-12-22 | Snap Inc. | 3D avatar rendering |
US11321896B2 (en) | 2018-10-31 | 2022-05-03 | Snap Inc. | 3D avatar rendering |
US11176737B2 (en) | 2018-11-27 | 2021-11-16 | Snap Inc. | Textured mesh building |
US11620791B2 (en) | 2018-11-27 | 2023-04-04 | Snap Inc. | Rendering 3D captions within real-world environments |
US20220044479A1 (en) | 2018-11-27 | 2022-02-10 | Snap Inc. | Textured mesh building |
US11836859B2 (en) | 2018-11-27 | 2023-12-05 | Snap Inc. | Textured mesh building |
US10902661B1 (en) | 2018-11-28 | 2021-01-26 | Snap Inc. | Dynamic composite user identifier |
US11887237B2 (en) | 2018-11-28 | 2024-01-30 | Snap Inc. | Dynamic composite user identifier |
US11315259B2 (en) | 2018-11-30 | 2022-04-26 | Snap Inc. | Efficient human pose tracking in videos |
US10861170B1 (en) | 2018-11-30 | 2020-12-08 | Snap Inc. | Efficient human pose tracking in videos |
US11783494B2 (en) | 2018-11-30 | 2023-10-10 | Snap Inc. | Efficient human pose tracking in videos |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US11798261B2 (en) | 2018-12-14 | 2023-10-24 | Snap Inc. | Image face manipulation |
US11055514B1 (en) | 2018-12-14 | 2021-07-06 | Snap Inc. | Image face manipulation |
US11516173B1 (en) | 2018-12-26 | 2022-11-29 | Snap Inc. | Message composition interface |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11032670B1 (en) | 2019-01-14 | 2021-06-08 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US10945098B2 (en) | 2019-01-16 | 2021-03-09 | Snap Inc. | Location-based context information sharing in a messaging system |
US10939246B1 (en) | 2019-01-16 | 2021-03-02 | Snap Inc. | Location-based context information sharing in a messaging system |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US11714524B2 (en) | 2019-02-06 | 2023-08-01 | Snap Inc. | Global event-based avatar |
US10984575B2 (en) | 2019-02-06 | 2021-04-20 | Snap Inc. | Body pose estimation |
US11557075B2 (en) | 2019-02-06 | 2023-01-17 | Snap Inc. | Body pose estimation |
US11010022B2 (en) | 2019-02-06 | 2021-05-18 | Snap Inc. | Global event-based avatar |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11275439B2 (en) | 2019-02-13 | 2022-03-15 | Snap Inc. | Sleep detection in a location sharing system |
US10936066B1 (en) | 2019-02-13 | 2021-03-02 | Snap Inc. | Sleep detection in a location sharing system |
US10964082B2 (en) | 2019-02-26 | 2021-03-30 | Snap Inc. | Avatar based on weather |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US10852918B1 (en) | 2019-03-08 | 2020-12-01 | Snap Inc. | Contextual information in chat |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11638115B2 (en) | 2019-03-28 | 2023-04-25 | Snap Inc. | Points of interest in a location sharing system |
US11039270B2 (en) | 2019-03-28 | 2021-06-15 | Snap Inc. | Points of interest in a location sharing system |
US11166123B1 (en) | 2019-03-28 | 2021-11-02 | Snap Inc. | Grouped transmission of location data in a location sharing system |
US10992619B2 (en) | 2019-04-30 | 2021-04-27 | Snap Inc. | Messaging system with avatar generation |
US10659405B1 (en) | 2019-05-06 | 2020-05-19 | Apple Inc. | Avatar integration with multiple applications |
USD916810S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
USD916871S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
USD916809S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
USD916872S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
USD916811S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
US10893385B1 (en) | 2019-06-07 | 2021-01-12 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11189098B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | 3D object camera customization system |
US11188190B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | Generating animation overlays in a communication session |
US11676199B2 (en) | 2019-06-28 | 2023-06-13 | Snap Inc. | Generating customizable avatar outfits |
US11443491B2 (en) | 2019-06-28 | 2022-09-13 | Snap Inc. | 3D object camera customization system |
US11823341B2 (en) | 2019-06-28 | 2023-11-21 | Snap Inc. | 3D object camera customization system |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11307747B2 (en) | 2019-07-11 | 2022-04-19 | Snap Inc. | Edge gesture interface with smart interactions |
US11551393B2 (en) | 2019-07-23 | 2023-01-10 | LoomAi, Inc. | Systems and methods for animation generation |
US11455081B2 (en) | 2019-08-05 | 2022-09-27 | Snap Inc. | Message thread prioritization interface |
US11956192B2 (en) | 2019-08-12 | 2024-04-09 | Snap Inc. | Message reminder interface |
US10911387B1 (en) | 2019-08-12 | 2021-02-02 | Snap Inc. | Message reminder interface |
US11588772B2 (en) | 2019-08-12 | 2023-02-21 | Snap Inc. | Message reminder interface |
US11822774B2 (en) | 2019-09-16 | 2023-11-21 | Snap Inc. | Messaging system with battery level sharing |
US11320969B2 (en) | 2019-09-16 | 2022-05-03 | Snap Inc. | Messaging system with battery level sharing |
US11662890B2 (en) | 2019-09-16 | 2023-05-30 | Snap Inc. | Messaging system with battery level sharing |
US11425062B2 (en) | 2019-09-27 | 2022-08-23 | Snap Inc. | Recommended content viewed by friends |
US11676320B2 (en) | 2019-09-30 | 2023-06-13 | Snap Inc. | Dynamic media collection generation |
US11080917B2 (en) | 2019-09-30 | 2021-08-03 | Snap Inc. | Dynamic parameterized user avatar stories |
US11270491B2 (en) | 2019-09-30 | 2022-03-08 | Snap Inc. | Dynamic parameterized user avatar stories |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11063891B2 (en) | 2019-12-03 | 2021-07-13 | Snap Inc. | Personalized avatar notification |
US11563702B2 (en) | 2019-12-03 | 2023-01-24 | Snap Inc. | Personalized avatar notification |
US11128586B2 (en) | 2019-12-09 | 2021-09-21 | Snap Inc. | Context sensitive avatar captions |
US11582176B2 (en) | 2019-12-09 | 2023-02-14 | Snap Inc. | Context sensitive avatar captions |
US11594025B2 (en) | 2019-12-11 | 2023-02-28 | Snap Inc. | Skeletal tracking using previous frames |
US11036989B1 (en) | 2019-12-11 | 2021-06-15 | Snap Inc. | Skeletal tracking using previous frames |
US11908093B2 (en) | 2019-12-19 | 2024-02-20 | Snap Inc. | 3D captions with semantic graphical elements |
US11263817B1 (en) | 2019-12-19 | 2022-03-01 | Snap Inc. | 3D captions with face tracking |
US11227442B1 (en) | 2019-12-19 | 2022-01-18 | Snap Inc. | 3D captions with semantic graphical elements |
US11636657B2 (en) | 2019-12-19 | 2023-04-25 | Snap Inc. | 3D captions with semantic graphical elements |
US11810220B2 (en) | 2019-12-19 | 2023-11-07 | Snap Inc. | 3D captions with face tracking |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11140515B1 (en) | 2019-12-30 | 2021-10-05 | Snap Inc. | Interfaces for relative device positioning |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11169658B2 (en) | 2019-12-31 | 2021-11-09 | Snap Inc. | Combined map icon with action indicator |
US11356720B2 (en) | 2020-01-30 | 2022-06-07 | Snap Inc. | Video generation system to render frames on demand |
US11729441B2 (en) | 2020-01-30 | 2023-08-15 | Snap Inc. | Video generation system to render frames on demand |
US11263254B2 (en) | 2020-01-30 | 2022-03-01 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11284144B2 (en) | 2020-01-30 | 2022-03-22 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUs |
US11651022B2 (en) | 2020-01-30 | 2023-05-16 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11651539B2 (en) | 2020-01-30 | 2023-05-16 | Snap Inc. | System for generating media content items on demand |
US11036781B1 (en) | 2020-01-30 | 2021-06-15 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11831937B2 (en) | 2020-01-30 | 2023-11-28 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUS |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11775165B2 (en) | 2020-03-16 | 2023-10-03 | Snap Inc. | 3D cutout image modification |
US11217020B2 (en) | 2020-03-16 | 2022-01-04 | Snap Inc. | 3D cutout image modification |
US11818286B2 (en) | 2020-03-30 | 2023-11-14 | Snap Inc. | Avatar recommendation and reply |
US11625873B2 (en) | 2020-03-30 | 2023-04-11 | Snap Inc. | Personalized media overlay recommendation |
US11956190B2 (en) | 2020-05-08 | 2024-04-09 | Snap Inc. | Messaging system with a carousel of related entities |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US11061372B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | User interfaces related to time |
US11822778B2 (en) | 2020-05-11 | 2023-11-21 | Apple Inc. | User interfaces related to time |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11543939B2 (en) | 2020-06-08 | 2023-01-03 | Snap Inc. | Encoded image based messaging system |
US11922010B2 (en) | 2020-06-08 | 2024-03-05 | Snap Inc. | Providing contextual information with keyboard interface for messaging system |
US11822766B2 (en) | 2020-06-08 | 2023-11-21 | Snap Inc. | Encoded image based messaging system |
US11683280B2 (en) | 2020-06-10 | 2023-06-20 | Snap Inc. | Messaging system including an external-resource dock and drawer |
US11580682B1 (en) | 2020-06-30 | 2023-02-14 | Snap Inc. | Messaging system with augmented reality makeup |
CN111899321A (en) * | 2020-08-26 | 2020-11-06 | 网易(杭州)网络有限公司 | Method and device for showing expression of virtual character |
US11863513B2 (en) | 2020-08-31 | 2024-01-02 | Snap Inc. | Media content playback and comments management |
US11893301B2 (en) | 2020-09-10 | 2024-02-06 | Snap Inc. | Colocated shared augmented reality without shared backend |
US11360733B2 (en) | 2020-09-10 | 2022-06-14 | Snap Inc. | Colocated shared augmented reality without shared backend |
US11888795B2 (en) | 2020-09-21 | 2024-01-30 | Snap Inc. | Chats with micro sound clips |
US11833427B2 (en) | 2020-09-21 | 2023-12-05 | Snap Inc. | Graphical marker generation system for synchronizing users |
US11452939B2 (en) | 2020-09-21 | 2022-09-27 | Snap Inc. | Graphical marker generation system for synchronizing users |
US11910269B2 (en) | 2020-09-25 | 2024-02-20 | Snap Inc. | Augmented reality content items including user avatar to share location |
US11660022B2 (en) | 2020-10-27 | 2023-05-30 | Snap Inc. | Adaptive skeletal joint smoothing |
US11615592B2 (en) | 2020-10-27 | 2023-03-28 | Snap Inc. | Side-by-side character animation from realtime 3D body motion capture |
US11450051B2 (en) | 2020-11-18 | 2022-09-20 | Snap Inc. | Personalized avatar real-time motion capture |
US11748931B2 (en) | 2020-11-18 | 2023-09-05 | Snap Inc. | Body animation sharing and remixing |
US11734894B2 (en) | 2020-11-18 | 2023-08-22 | Snap Inc. | Real-time motion transfer for prosthetic limbs |
US11790531B2 (en) | 2021-02-24 | 2023-10-17 | Snap Inc. | Whole body segmentation |
US11562536B2 (en) * | 2021-03-15 | 2023-01-24 | Tencent America LLC | Methods and systems for personalized 3D head model deformation |
US20220292773A1 (en) * | 2021-03-15 | 2022-09-15 | Tencent America LLC | Methods and systems for personalized 3d head model deformation |
US11809633B2 (en) | 2021-03-16 | 2023-11-07 | Snap Inc. | Mirroring device with pointing based navigation |
US11734959B2 (en) | 2021-03-16 | 2023-08-22 | Snap Inc. | Activating hands-free mode on mirroring device |
US11798201B2 (en) | 2021-03-16 | 2023-10-24 | Snap Inc. | Mirroring device with whole-body outfits |
US11908243B2 (en) | 2021-03-16 | 2024-02-20 | Snap Inc. | Menu hierarchy navigation on electronic mirroring devices |
US11544885B2 (en) | 2021-03-19 | 2023-01-03 | Snap Inc. | Augmented reality experience based on physical items |
US11562548B2 (en) | 2021-03-22 | 2023-01-24 | Snap Inc. | True size eyewear in real time |
US11636654B2 (en) | 2021-05-19 | 2023-04-25 | Snap Inc. | AR-based connected portal shopping |
US11941767B2 (en) | 2021-05-19 | 2024-03-26 | Snap Inc. | AR-based connected portal shopping |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US11941227B2 (en) | 2021-06-30 | 2024-03-26 | Snap Inc. | Hybrid search system for customizable media |
US11854069B2 (en) | 2021-07-16 | 2023-12-26 | Snap Inc. | Personalized try-on ads |
US11908083B2 (en) | 2021-08-31 | 2024-02-20 | Snap Inc. | Deforming custom mesh based on body mesh |
US11670059B2 (en) | 2021-09-01 | 2023-06-06 | Snap Inc. | Controlling interactive fashion based on body gestures |
US11673054B2 (en) | 2021-09-07 | 2023-06-13 | Snap Inc. | Controlling AR games on fashion items |
US11663792B2 (en) | 2021-09-08 | 2023-05-30 | Snap Inc. | Body fitted accessory with physics simulation |
US11900506B2 (en) | 2021-09-09 | 2024-02-13 | Snap Inc. | Controlling interactive fashion based on facial expressions |
US11734866B2 (en) | 2021-09-13 | 2023-08-22 | Snap Inc. | Controlling interactive fashion based on voice |
US11798238B2 (en) | 2021-09-14 | 2023-10-24 | Snap Inc. | Blending body mesh into external mesh |
US11836866B2 (en) | 2021-09-20 | 2023-12-05 | Snap Inc. | Deforming real-world object using an external mesh |
US11636662B2 (en) | 2021-09-30 | 2023-04-25 | Snap Inc. | Body normal network light and rendering control |
US11790614B2 (en) | 2021-10-11 | 2023-10-17 | Snap Inc. | Inferring intent from pose and speech input |
US11836862B2 (en) | 2021-10-11 | 2023-12-05 | Snap Inc. | External mesh with vertex attributes |
US11651572B2 (en) | 2021-10-11 | 2023-05-16 | Snap Inc. | Light and rendering of garments |
US11763481B2 (en) | 2021-10-20 | 2023-09-19 | Snap Inc. | Mirror-based augmented reality experience |
US11960784B2 (en) | 2021-12-07 | 2024-04-16 | Snap Inc. | Shared augmented reality unboxing experience |
US11748958B2 (en) | 2021-12-07 | 2023-09-05 | Snap Inc. | Augmented reality unboxing experience |
US11880947B2 (en) | 2021-12-21 | 2024-01-23 | Snap Inc. | Real-time upper-body garment exchange |
US11928783B2 (en) | 2021-12-30 | 2024-03-12 | Snap Inc. | AR position and orientation along a plane |
US11887260B2 (en) | 2021-12-30 | 2024-01-30 | Snap Inc. | AR position indicator |
US11823346B2 (en) | 2022-01-17 | 2023-11-21 | Snap Inc. | AR body part tracking system |
US11954762B2 (en) | 2022-01-19 | 2024-04-09 | Snap Inc. | Object replacement system |
US11870745B1 (en) | 2022-06-28 | 2024-01-09 | Snap Inc. | Media gallery sharing and management |
US11962598B2 (en) | 2022-08-10 | 2024-04-16 | Snap Inc. | Social media post subscribe requests for buffer user accounts |
US11893166B1 (en) | 2022-11-08 | 2024-02-06 | Snap Inc. | User avatar movement control using an augmented reality eyewear device |
Also Published As
Publication number | Publication date |
---|---|
GB0713186D0 (en) | 2007-08-15 |
GB2450757A (en) | 2009-01-07 |
EP2175950A1 (en) | 2010-04-21 |
WO2009007701A1 (en) | 2009-01-15 |
JP2010532890A (en) | 2010-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100203968A1 (en) | Apparatus And Method Of Avatar Customisation | |
US9259641B2 (en) | Entertainment device and method | |
EP2131935B1 (en) | Apparatus and method of data transfer | |
EP2131934B1 (en) | Entertainment device and method | |
US20100030660A1 (en) | Apparatus and method of on-line transaction | |
US9345970B2 (en) | Switching operation of an entertainment device and method thereof | |
US8771083B2 (en) | Apparatus and method of on-line reporting | |
US8606904B2 (en) | Apparatus and method of administering modular online environments | |
US20130132837A1 (en) | Entertainment device and method | |
US20110055320A1 (en) | Apparatus and method of data transfer | |
WO2008104783A1 (en) | Entertainment device and method | |
GB2461175A (en) | A method of transferring real-time multimedia data in a peer to peer network using polling of peer devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY COMPUTER ENTERTAINMENT EUROPE LIMITED, UNITED KINGDOM Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GILL, ANDREW GEORGE;RIBBONS, KEITH THOMAS;FOSTER, JOHN;AND OTHERS;SIGNING DATES FROM 20100125 TO 20100204;REEL/FRAME:023941/0790 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: SONY INTERACTIVE ENTERTAINMENT EUROPE LIMITED, UNITED KINGDOM Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT EUROPE LIMITED;REEL/FRAME:043198/0110 Effective date: 20160729 |