USRE46309E1 - Application sharing

Application sharing

Info

Publication number
USRE46309E1
Authority
US
United States
Prior art keywords
windows
network node
sharer
virtual area
screen layout
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/666,298
Inventor
Alexander Say Go
Vladimir Petter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sococo Inc
Original Assignee
Sococo Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/923,634 (US7844724B2)
Priority claimed from US11/923,629 (US7769806B2)
Priority claimed from US12/354,709 (US8397168B2)
Application filed by Sococo Inc
Priority to US14/666,298
Assigned to SOCIAL COMMUNICATIONS COMPANY. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PETTER, VLADIMIR; GO, ALEXANDER SAY
Application granted
Publication of USRE46309E1
Assigned to Sococo, Inc. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: SOCIAL COMMUNICATIONS COMPANY
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 15/00 Digital computers in general; Data processing equipment in general
    • G06F 15/16 Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
    • G06F 15/161 Computing infrastructure, e.g. computer clusters, blade chassis or hardware partitioning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 9/4443
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L 67/025 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP] for remote control or remote monitoring of applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality
    • H04L 67/38
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2209/00 Indexing scheme relating to G06F 9/00
    • G06F 2209/54 Indexing scheme relating to G06F 9/54
    • G06F 2209/545 GUI
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/14 Display of multiple viewports
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/02 Details
    • H04L 12/16 Arrangements for providing special services to substations
    • H04L 12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L 12/1813 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms

Definitions

  • A common form of data sharing is application sharing, which involves transmitting application data from one node (referred to as the “sharer node”) to one or more other nodes (referred to as “viewer nodes”).
  • Application sharing has a variety of useful applications, including providing remote technical support, remote collaboration, and remote presentation of demonstrations, documents, and images.
  • An application sharing program on the sharer node periodically collects drawing commands (e.g., GDI calls for drawing lines and curves, rendering fonts, and handling palettes) from a chained display driver process on the sharer node, packages the drawing commands into an order packet, and sends the order packet to a respective counterpart application sharing program on each of the viewer nodes that accurately constructs the shared view of the sharer's display.
  • Such an application sharing approach requires each viewer node to render its own version of the shared application by passing the drawing commands in the order packet to a display process (e.g., the graphics device interface (GDI)) on the viewer node.
  • The invention features a method in accordance with which ones of the windows that are associated with a software process are identified in a screen layout on a local display of a sharer network node.
  • A composite image of the identified windows, as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout, is generated.
  • The composite image is transmitted from the sharer network node to a viewer network node.
  • The invention also features a method in accordance with which locally-generated commands that are derived from local input device events on a sharer network node are received.
  • Remotely-generated commands that are derived from remote input device events on a remote viewer network node also are received.
  • The received commands are processed into a command sequence.
  • The command sequence is passed to a shared process executing on the sharer network node.
  • In a screen layout on a local display of the sharer network node, one or more windows that are associated with the shared process are presented in accordance with the received command sequence.
  • An image of the one or more windows as they are presented in the screen layout is generated. The image is transmitted from the sharer network node to the viewer network node.
  • The invention also features apparatus operable to implement the inventive methods described above and computer-readable media storing computer-readable instructions causing a computer to implement the inventive methods described above.
  • FIG. 1 is a diagrammatic view of an embodiment of a network communication environment that includes a first network node and a second network node.
  • FIG. 2 is a flow diagram of an embodiment of an application sharing method.
  • FIG. 3 is a diagrammatic view of an embodiment of a screen layout on a display of a sharer network node.
  • FIG. 4A is a diagrammatic view of an embodiment of a screen layout on a display of a viewer network node.
  • FIG. 4B is a diagrammatic view of an embodiment of a screen layout on a display of a viewer network node.
  • FIG. 5 is a diagrammatic view of an embodiment of an application sharing class model.
  • FIG. 6 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
  • FIG. 7 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
  • FIG. 8 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
  • FIG. 9 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
  • FIG. 10 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
  • FIGS. 11A and 11B are flow diagrams of an embodiment of an application sharing method.
  • FIG. 12 is a diagrammatic view of an embodiment of a screen layout on a display of a sharer network node.
  • FIG. 13 is a flow diagram of an embodiment of a method of determining the layering order of the windows associated with a shared software process.
  • FIG. 14 is a block diagram of an embodiment of a sharer network node implementing an embodiment of an application sharing process in connection with an embodiment of a viewer network node.
  • FIG. 15 is a flow diagram of an embodiment of an application sharing method.
  • FIG. 16 is a diagrammatic view of an embodiment of a network communication environment that includes the first network node of FIG. 1 , the second network node of FIG. 1 , and a virtual environment creator.
  • FIG. 17 is a diagrammatic view of an embodiment of a network node that includes a graphical user interface presenting a depiction of a virtual area.
  • FIG. 18 is a block diagram of the network communication environment of FIG. 16 that shows components of an embodiment of a client network node.
  • FIG. 19 is a diagrammatic view of an embodiment of a graphical user interface.
  • FIG. 20 is a diagrammatic view of an embodiment of a graphical user interface.
  • FIG. 21 is a diagrammatic view of an embodiment of a graphical user interface.
  • FIG. 22 is a diagrammatic view of an embodiment of a graphical user interface.
  • FIG. 23 is a diagrammatic view of an embodiment of a graphical user interface.
  • FIG. 24 is a diagrammatic view of an embodiment of a graphical user interface.
  • FIG. 25 is a diagrammatic view of an embodiment of a graphical user interface.
  • A “window” is a visual area of a display that typically includes a user interface.
  • A window typically displays the output of a software process and typically enables a user to input commands or data for the software process.
  • A window that has a parent is called a “child window.”
  • A window that has no parent, or whose parent is the desktop window, is called a “top-level window.”
  • A “desktop” is a system-defined window that paints the background of a graphical user interface (GUI) and serves as the base for all windows displayed by all software processes.
  • “Window scraping” refers to a process of extracting data from the display output of another software process.
  • The extraction process is performed by a “scraper” software process.
  • A “layering order” (also referred to as the “z-order”) is an ordering of overlapping two-dimensional objects, such as windows in a GUI, along an axis (typically referred to as the “z-axis”) that is perpendicular to a display on which the GUI is presented.
  • “Compositing” is the combining of visual elements from separate sources into a single composite image (also referred to herein as a “frame”).
  • “Obscuration” means the act or process of concealing or hiding by or as if by covering.
  • A “communicant” is a person who communicates or otherwise interacts with other persons over one or more network connections, where the communication or interaction may or may not occur in the context of a virtual area.
  • A “user” is a communicant who is operating a particular network node that defines a particular perspective for descriptive purposes.
  • A “sharer” is a communicant who is operating a sharer network node.
  • A “viewer” is a communicant who is operating a viewer network node.
  • A “realtime contact” of a user is a communicant or other person who has communicated with the user via a realtime communications platform.
  • A “computer” is any machine, device, or apparatus that processes data according to computer-readable instructions that are stored on a computer-readable medium either temporarily or permanently.
  • An “operating system” is a software component of a computer system that manages and coordinates the performance of tasks and the sharing of computing and hardware resources.
  • A “software process” (also referred to as software, an application, computer software, a computer application, a program, and a computer program) is a set of instructions that a computer can interpret and execute to perform one or more specific tasks.
  • A software process may have one or more “threads” of execution.
  • A “shared software process” is a software process whose output is shared with a viewer network node.
  • A “computer data file” is a block of information that durably stores data for use by a software application.
  • A “database” is an organized collection of records that are presented in a standardized format that can be searched by computers.
  • A database may be stored on a single computer-readable data storage medium on a single computer or it may be distributed across multiple computer-readable data storage media on one or more computers.
  • A “data sink” (referred to herein simply as a “sink”) is any of a device (e.g., a computer), part of a device, or software that receives data.
  • A “data source” (referred to herein simply as a “source”) is any of a device (e.g., a computer), part of a device, or software that originates data.
  • A “network node” (also referred to simply as a “node”) is a junction or connection point in a communications network.
  • Exemplary network nodes include, but are not limited to, a terminal, a computer, and a network switch.
  • A “server” network node is a host computer on a network that responds to requests for information or service.
  • A “client” network node is a computer on a network that requests information or service from a server.
  • A “network connection” is a link between two communicating network nodes.
  • The term “local network node” refers to a network node that currently is the primary subject of discussion.
  • The term “remote network node” refers to a network node that is connected to a local network node by a network communications link.
  • A “connection handle” is a pointer or identifier (e.g., a uniform resource identifier (URI)) that can be used to establish a network connection with a communicant, resource, or service on a network node.
  • A “sharer network node” is a network node that is sharing content with another network node, which is referred to as a “viewer network node.”
  • A “network communication” can include any type of information (e.g., text, voice, audio, video, electronic mail message, data file, motion data stream, and data packet) that is transmitted or otherwise conveyed from one network node to another network node over a network connection.
  • A “communicant interaction” is any type of direct or indirect action or influence between a communicant and another network entity, which may include for example another communicant, a virtual area, or a network service.
  • Exemplary types of communicant interactions include communicants communicating with each other in realtime, a communicant entering a virtual area, and a communicant requesting access to a resource from a network service.
  • “Presence” refers to the ability and willingness of a networked entity (e.g., a communicant, service, or device) to communicate, where such willingness affects the ability to detect and obtain information about the state of the entity on a network and the ability to connect to the entity.
  • A “realtime data stream” is data that is structured and processed in a continuous flow and is designed to be received with no delay or only imperceptible delay.
  • Realtime data streams include digital representations of voice, video, user movements, facial expressions and other physical phenomena, as well as data within the computing environment that may benefit from rapid transmission, rapid execution, or both rapid transmission and rapid execution, including for example, avatar movement instructions, text chat, realtime data feeds (e.g., sensor data, machine control instructions, transaction streams and stock quote information feeds), and file transfers.
  • A “virtual area” (also referred to as an “area” or a “place”) is a representation of a computer-managed space or scene.
  • Virtual areas typically are one-dimensional, two-dimensional, or three-dimensional representations, although in some embodiments a virtual area may correspond to a single point.
  • A virtual area is designed to simulate a physical, real-world space. For example, using a traditional computer monitor, a virtual area may be visualized as a two-dimensional graphic of a three-dimensional computer-generated space.
  • Virtual areas do not require an associated visualization to implement switching rules.
  • A virtual area typically refers to an instance of a virtual area schema, where the schema defines the structure and contents of a virtual area in terms of variables and the instance defines the structure and contents of a virtual area in terms of values that have been resolved from a particular context.
  • A “virtual area application” (also referred to as a “virtual area specification”) is a description of a virtual area that is used in creating a virtual environment.
  • The virtual area application typically includes definitions of geometry, physics, and realtime switching rules that are associated with one or more zones of the virtual area.
  • A “virtual environment” is a representation of a computer-managed space that includes at least one virtual area and supports realtime communications between communicants.
  • A “zone” is a region of a virtual area that is associated with at least one switching rule or governance rule.
  • A “switching rule” is an instruction that specifies a connection or disconnection of one or more realtime data sources and one or more realtime data sinks subject to one or more conditions precedent.
  • A switching rule controls switching (e.g., routing, connecting, and disconnecting) of realtime data streams between network nodes communicating in the context of a virtual area.
  • A “governance rule” controls a communicant's access to a resource (e.g., an area, a region of an area, or the contents of that area or region), the scope of that access, and follow-on consequences of that access (e.g., a requirement that audit records relating to that access must be recorded).
  • A “renderable zone” is a zone that is associated with a respective visualization.
  • A “position” in a virtual area refers to a location of a point or an area or a volume in the virtual area.
  • A point typically is represented by a single set of one-dimensional, two-dimensional, or three-dimensional coordinates (e.g., Cartesian coordinates, polar coordinates, or spherical coordinates) that define a spot in the virtual area.
  • A coordinate can be defined as any single number or plurality of numbers that establishes a location.
  • An area typically is represented by the three-dimensional coordinates of three or more coplanar vertices that define a boundary of a closed two-dimensional shape in the virtual area.
  • A volume typically is represented by the three-dimensional coordinates of four or more non-coplanar vertices that define a closed boundary of a three-dimensional shape in the virtual area.
  • A “spatial state” is an attribute that describes where a user has presence in a virtual area.
  • The spatial state attribute typically has a respective value (e.g., a zone_ID value) for each of the zones in which the user has presence.
  • A “placemark” is a stored reference (e.g., a hyperlink) to a location in a virtual area.
  • A placemark typically can be selected to present a view of the associated location in the virtual area to a user.
  • The verb “placemark” means the act or operation of creating a placemark.
  • An “object” is any type of discrete element in a virtual area that may be usefully treated separately from the geometry of the virtual area.
  • Exemplary objects include doors, portals, windows, view screens, and speakerphones.
  • An object typically has attributes or properties that are separate and distinct from the attributes and properties of the virtual area.
  • An “avatar” is an object that represents a communicant in a virtual area.
  • “Double-click” refers to the act or operation of entering or inputting an execution command (e.g., double-clicking the left computer mouse button or single-clicking a user interface button associated with an execute command, e.g., enter zone or view object).
  • “Shift-click” refers to the act or operation of entering or inputting a selection command (e.g., clicking the left computer mouse button) while the Shift key of an alphanumeric input device is activated.
  • “Shift-double-click” refers to the act or operation of entering or inputting an execution command while the Shift key of an alphanumeric input device is activated.
  • The term “includes” means “includes but is not limited to”; the term “including” means “including but not limited to.”
  • The term “based on” means based at least in part on.
  • The embodiments that are described herein enable application sharing with high fidelity, realtime performance, viewer immersion, and privacy protection.
  • The screen content that is associated with each thread of a software process can be determined and composited into a respective composite image (or frame) that is free of other window content.
  • The contents of the windows that are associated with one or more software application threads on a network node can be broadcast to other network nodes without risk of obscuration by the screen content (e.g., windows containing application content, messages, or dialog boxes) that might be generated by other software processes, thereby preventing corruption of shared window content by overlapping screen content that sometimes is generated by processes that are outside of the user's immediate control.
  • This feature avoids the need for sharers to interrupt a presentation in order to remove obscuring screen content, thereby creating a more immersive collaboration experience for viewers of the shared window content.
  • Sharers don't have to worry that private information inadvertently will be shared along with the intended screen content, thereby maintaining the sharer's privacy.
  • Some embodiments also enable multichannel application sharing in which two or more communicants share applications and screen content with each other at the same time. These embodiments typically include an interface that allows each receiver to distinguish one shared window from another.
  • FIG. 1 shows an embodiment of an exemplary network communications environment 10 that includes a first network node 12 and a second network node 14 that are interconnected by a network 18 .
  • The first network node 12 includes a computer-readable memory 20 , a processor 22 , and input/output (I/O) hardware 24 (including a display).
  • The processor 22 executes at least one communications application 26 that is stored in the memory 20 .
  • The second network node 14 typically is configured in substantially the same way as the first network node 12 .
  • The communications applications 26 typically provide one or more modes of communication between the first and second network nodes 12 , 14 (e.g., text, voice, audio, and video).
  • The communications applications 26 enable one-way or two-way application sharing between the first and second network nodes 12 , 14 .
  • FIG. 2 shows an embodiment of an application sharing method that is implemented by the communications application 26 operating on one or both of the first and second network nodes 12 , 14 . This process typically is performed in response to a request from a viewer network node to share one or more documents relating to a software process operating on a sharer network node.
  • The communications application 26 identifies ones of the windows that are associated with a software process in a screen layout on a display of the sharer network node ( FIG. 2 , block 500 ).
  • The communications application 26 may identify all of the windows in the screen layout that are associated with the software process or it may identify ones of the windows in the screen layout that match a handle that is assigned to a thread of the software process. In some cases, this process involves identifying a parent window and at least one associated child window that are created by the software process thread. In some embodiments, the communications application 26 identifies all of the windows created by a designated group of software processes.
  • The communications application 26 generates a composite image of the identified windows as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout ( FIG. 2 , block 502 ).
  • This process involves: determining a layering order of the identified windows in relation to each other that corresponds to the relative layering order of the identified windows in the screen layout; for each of the identified windows, retrieving a respective image of the window; and compositing the retrieved images into the composite image in accordance with the determined layering order.
  • The process of determining the layering order involves: for each of the windows in the screen layout, generating a z-order list that associates a respective z-order value with a respective window handle of the window; and deriving the layering order of the identified windows from the z-order list.
  • The process of deriving the layering order typically involves: for each of the z-order values in the z-order list, matching the associated window handle with the window handle of a respective one of the identified windows; and ordering the identified windows in the layering order in accordance with the z-order values associated with their matching window handles.
  • Some embodiments also prevent obscuration of the selected windows as a result of the windows being partially or completely off-screen (e.g., outside the visible desktop window that contains the screen layout).
  • A respective image of each of the identified windows is stored in a respective memory buffer, and the process of generating the composite image involves retrieving each of the images from the respective memory buffers and compositing the retrieved images into the composite image.
  • Each of the windows is a layered window whose screen data is stored in a respective memory buffer through a programmatic call to the Microsoft® Win32 application programming interface (API), which is available in Microsoft® Windows® operating systems versions 2000 and later.
  • The WS_EX_LAYERED style bit that is associated with a particular window may be set by the shared software process at window creation time (e.g., via the CreateWindowEx API call) or it may be set by the communications application 26 after creation time (e.g., via a SetWindowLong API call with GWL_EXSTYLE).
  • With the WS_EX_LAYERED window style bit set for a window, the operating system redirects the drawing of the window into an off-screen bitmap and buffer, which can then be accessed by the communications application 26 for generating the composite image.
  • Similar layered-windows functionality is available from other operating systems (e.g., the X Window System on a UNIX-based operating system).
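A minimal Win32 sketch of the layered-window redirection described above follows. This is an illustration rather than the patent's implementation; the opacity call and the simplified error handling are assumptions.

```cpp
#include <windows.h>

// Minimal sketch: force a target window into the layered-window path so the
// operating system keeps an off-screen copy of its contents that a scraper
// can read later. Error handling is simplified for illustration.
bool EnsureLayered(HWND hwnd) {
    LONG exStyle = GetWindowLong(hwnd, GWL_EXSTYLE);
    if (exStyle & WS_EX_LAYERED)
        return true;  // drawing is already redirected off-screen
    // Set the bit after creation time, as the communications application would.
    SetWindowLong(hwnd, GWL_EXSTYLE, exStyle | WS_EX_LAYERED);
    // A fully opaque layered window still renders normally on screen.
    return SetLayeredWindowAttributes(hwnd, 0, 255, LWA_ALPHA) != 0;
}
```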
  • The communications application 26 transmits the composite image to the viewer one of the first and second network nodes 12 , 14 (i.e., the one of the first and second network nodes that receives the composite image) ( FIG. 2 , block 504 ).
  • This process involves transmitting the composite image to each of the remote viewer network nodes that requested (or subscribed) to view the shared content on the sharer network node.
  • The composite image (and subsequent updates) is transmitted to the subscribing viewer network nodes over respective realtime data stream connections that are established between the sharer network node and the viewer network nodes.
  • FIG. 3 shows an embodiment of a screen layout 506 on a display 508 of the first network node 12 , which is acting as a sharer network node.
  • The screen layout 506 includes a first window 510 , a second window 512 that is the parent of a child window 514 , a third window 516 , and a fourth window 518 .
  • The first window 510 , the second window 512 , and its child window 514 are created by respective threads of a first software process (e.g., the Microsoft® Word® document processing software process), the third window 516 is created by a thread of a second software process (e.g., the Microsoft® Excel® spreadsheet software process), and the fourth window 518 is created by a thread of a third software process (e.g., the Microsoft® Outlook® personal information manager software process).
  • The first and second windows 510 , 512 are the main windows that are created by their respective software process threads (e.g., created by the WinMain entry point function, optionally with the WS_EX_LAYERED bit set in a Microsoft® Windows® application environment).
  • Aspects of the appearance of the child window 514 typically are affected by its parent window 512 .
  • The parent window 512 typically defines the coordinate system that is used to position the child window 514 on the display 508 .
  • The child window 514 may be clipped so that no part of the child window 514 appears outside the borders of its parent window 512 .
  • FIG. 4A shows an embodiment of a first screen layout 520 on a display 522 of the second network node 14 , which is acting as a viewer network node in accordance with a first application sharing example.
  • The communicant on the first network node 12 is acting as a sharer who has elected to share all of the windows of the first software process (App1) with the second network node 14 .
  • The screen layout 520 consists of a composite image 524 that includes respective regions 526 , 528 , 530 that show the first window 510 , the second window 512 , and the child window 514 , which are created by respective threads of the first software process operating on the first network node 12 .
  • The regions 526 , 528 , 530 show the first window 510 , the second window 512 , and the child window 514 as they are arranged in the screen layout 506 on the first network node 12 .
  • The regions 526 , 528 , 530 also are free of obscuration by any other windows in the screen layout 506 on the first network node 12 .
  • The third window 516 and the fourth window 518 (which obscured underlying portions of the first window 510 , the second window 512 , and the child window 514 in the screen layout 506 ) are omitted from the composite image 524 ; in addition, the portions of the first window 510 , the second window 512 , and the child window 514 that are obscured in the screen layout 506 have been replaced by the appropriate contents of the first window 510 , the second window 512 , and the child window 514 in the composite image 524 .
  • FIG. 4B shows an embodiment of a second screen layout 532 on the display 522 of the second network node 14 , which is acting as a viewer network node in accordance with a second application sharing example.
  • The communicant on the first network node 12 is acting as a sharer who has elected to share all of the windows that are associated with only one of the threads (i.e., the thread responsible for creating the second window 512 and its child window 514 ) of the first software process (App1).
  • The screen layout 532 consists of a composite image 534 that includes respective regions 536 , 538 that show the second window 512 and the child window 514 .
  • The regions 536 , 538 show the second window 512 and the child window 514 as they are arranged in the screen layout 506 on the first network node 12 .
  • The regions 536 , 538 also are free of obscuration by any other windows in the screen layout 506 on the first network node 12 .
  • The first window 510 , the third window 516 , and the fourth window 518 are omitted from the composite image 534 ; in addition, the portions of the second window 512 and the child window 514 that are obscured in the screen layout 506 have been replaced by the appropriate contents of the second window 512 and the child window 514 in the composite image 534 .
  • Application sharing is initiated after a sharer network node has published one or more applications or documents that are available to share and at least one viewer has subscribed to at least one of the published applications or documents.
  • The sharer can publish a shared application or document to a viewscreen object that is associated with a virtual area, and the viewer can subscribe to the shared content by activating the viewscreen object in the virtual area (e.g., by double-clicking on the viewscreen object with a user input device).
  • The viewer typically is granted one of two types of access to the shared content: view access, which allows the viewer to only passively view the shared content; and control access, which allows the viewer to view, control, edit, and manipulate the shared content.
  • The type of access that is granted to the viewer can be set by the sharer or by one or more governance rules that are associated with the context in which the sharing takes place (e.g., a governance rule that is associated with a zone of a virtual area, as described below in section IV).
  • The shared content typically is streamed from the sharer network node to the viewer network node in the form of streaming bitmaps of the windows on the sharer's display that are associated with the shared application or document.
  • The bitmap of each window can be streamed separately or already composited.
  • The bitmaps typically are compressed before being streamed. If the viewer has only view access, then the viewer can only passively view images of the shared windows on the sharer's display. If the viewer has control access, then the viewer network node can transmit remote control commands that are generated by a user input device (e.g., a keyboard, a computer mouse, a touchpad, and a touch screen) to the sharer network node for controlling, editing, and manipulating the shared content on the sharer's network node.
  • The application sharing functionality of the communications application 26 is provided by a Scraper module, which is a plug-in for an Application Sharing Service that implements a platform-specific part of the application sharing.
  • This embodiment implements the application sharing mode in which all windows created by a shared application automatically are shared with subscribing network nodes.
  • This section describes an exemplary embodiment of the Scraper module and the Application Sharing Service that are implemented in a Microsoft® Windows® application environment that provides layered windows functionality.
  • FIG. 5 shows an embodiment of an application sharing class model.
  • The blocks 540 , 542 , 544 , 546 , and 548 define the interface between the Application Sharing Service and the Scraper module.
  • The blocks 550 , 552 , 554 are the classes that implement the Scraper module.
  • The blocks 556 , 558 are the classes in the Application Sharing Service that use the interfaces.
  • The Scraper module starts the thread that listens for WinEvents.
  • The Scraper module listens for WinEvents to get notifications when windows and menus are created/destroyed and shown/hidden.
  • The Scraper module stops all the application monitors and then shuts down the thread that is listening for WinEvents.
  • When a window event arrives, the Scraper module gets the thread identifier (ID) and the process ID for the window. The Scraper module then looks up a monitor by the process ID and notifies the application monitor about the event. In response to a notification that a window was destroyed, the Scraper module notifies all of the application monitors of the event, since the process and thread IDs are not available. A sketch of such a listener follows.
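A listener of this kind could be built on the Win32 SetWinEventHook API, which the sketch below assumes; the monitor-routing helpers are hypothetical stand-ins, not names from the patent.

```cpp
#include <windows.h>

// Hypothetical stand-ins for the Scraper module's monitor table.
static void NotifyAllMonitors(DWORD /*event*/, HWND /*hwnd*/) {}
static void NotifyMonitorForProcess(DWORD /*pid*/, DWORD /*event*/, HWND /*hwnd*/) {}

static void CALLBACK OnWinEvent(HWINEVENTHOOK, DWORD event, HWND hwnd,
                                LONG idObject, LONG, DWORD, DWORD) {
    if (idObject != OBJID_WINDOW || hwnd == nullptr)
        return;
    if (event == EVENT_OBJECT_DESTROY) {
        // Process and thread IDs are no longer available, so notify everyone.
        NotifyAllMonitors(event, hwnd);
        return;
    }
    DWORD pid = 0;
    GetWindowThreadProcessId(hwnd, &pid);  // look up the owning process
    NotifyMonitorForProcess(pid, event, hwnd);
}

HWINEVENTHOOK StartWinEventListener() {
    // EVENT_OBJECT_CREATE..EVENT_OBJECT_HIDE spans create/destroy/show/hide.
    return SetWinEventHook(EVENT_OBJECT_CREATE, EVENT_OBJECT_HIDE,
                           nullptr, OnWinEvent, 0 /* all processes */,
                           0 /* all threads */, WINEVENT_OUTOFCONTEXT);
}
```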
  • The Application Sharing Service can request the Scraper module to provide a list of applications that can be shared.
  • The Scraper module builds up the list of IDs and titles of the topmost overlapped windows on the desktop that is presented on the display of the sharer network node.
  • The list is built from the processes that currently are running on the sharer network node.
  • The list also includes the processes that “can be running” on the node. This, for instance, might include all Microsoft® Word® documents that the sharer might want to share but that currently are not running, or another application (e.g., a calculator application) that the sharer might want to share but that currently is not running.
  • The Scraper module creates an instance of the application monitor that will be responsible for scraping the application windows.
  • The application monitor enumerates all topmost windows on the desktop.
  • The application monitor uses the result of this enumeration as an initial list of the windows.
  • The application monitor determines which windows in the initial list belong to the application being shared.
  • The application monitor then spawns a scraper thread.
  • The scraper thread is responsible for scraping the content of the application windows, composing all the application's windows into the final bitmap, and sending the bitmap frame to all the subscribing network nodes.
  • Stopping application sharing gracefully stops the application monitor.
  • The application monitor shuts down the scraping thread, which makes sure that no more samples are produced. After that, it notifies all the subscribing network nodes that they are unsubscribed. Once all the references to the application monitor are gone, the monitor is destroyed.
  • This method returns a list of applications being shared.
  • The subscribe method expects as parameters a reference to a subscriber and an ID of the process to share.
  • The subscriber is a class that implements stream_subscriber_itf, as shown in the class model of FIG. 5 .
  • The Scraper module finds the monitor by the application ID and adds the subscriber to the monitor. The next time the application monitor generates a frame, it will deliver the frame to this subscriber as well.
  • The unsubscribe method likewise expects as parameters a reference to a subscriber and an ID of the process to share. This method finds the shared application by ID and tells it to remove the referenced subscriber. On removing the referenced subscriber, the application monitor will notify the subscriber about the state change. The application monitor also checks the shared process state and, if the process has exited, the application monitor will notify all the subscribers about the state change, unsubscribe all the subscribers, and terminate itself, since there is nothing left to monitor.
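Read together with the class model of FIG. 5, this description suggests an interface along the following lines. Only stream_subscriber_itf is named in the text; every other name and the method signatures are assumptions for illustration.

```cpp
#include <cstddef>

// Sketch of the subscriber-facing interface implied by the description above.
// stream_subscriber_itf appears in the class model; everything else is assumed.
struct stream_subscriber_itf {
    virtual void on_frame(const void* bitmap_bytes, std::size_t size) = 0;  // one sample
    virtual void on_state_change(bool subscribed) = 0;  // e.g., shared process exited
    virtual ~stream_subscriber_itf() = default;
};

struct application_sharing_itf {
    // Both methods take a reference to a subscriber and the ID of the
    // process to share, per the description above.
    virtual void subscribe(stream_subscriber_itf& s, unsigned long process_id) = 0;
    virtual void unsubscribe(stream_subscriber_itf& s, unsigned long process_id) = 0;
    virtual ~application_sharing_itf() = default;
};
```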
  • Each application monitor has a thread that wakes up on a periodic basis and performs window scraping.
  • FIGS. 11A and 11B show an embodiment of an exemplary window scraping method.
  • The Scraper module determines the z-order of all the windows that are associated with a shared software process and arranges all the windows according to the z-order ( FIG. 11B , block 560 ).
  • The Scraper module determines a bounding rectangle that encompasses all the windows that are associated with the shared software process ( FIG. 11B , block 562 ).
  • FIG. 12 shows a screen layout 564 that includes a Microsoft® Word® main window 566 and a Save As dialog box 568 .
  • The Scraper module calculates the rectangle 570 , which is the smallest bounding box that encompasses both windows, including any off-screen content (e.g., the shaded area 571 that corresponds to the bottom of the main window 566 that is outside the visible desktop window 572 ).
  • The bounding rectangle 570 defines the size of the composite bitmap of all the windows to be shared.
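A sketch of this bounding-rectangle step follows, assuming the shared windows have already been collected into a list. GetWindowRect returns screen coordinates, so off-screen portions such as area 571 are included automatically.

```cpp
#include <windows.h>
#include <vector>

// Compute the smallest rectangle that encompasses every shared window,
// including any parts that lie outside the visible desktop window.
RECT BoundingRect(const std::vector<HWND>& sharedWindows) {
    RECT bounds;
    SetRectEmpty(&bounds);
    for (HWND hwnd : sharedWindows) {
        RECT r;
        if (GetWindowRect(hwnd, &r))          // screen coords, may be off-screen
            UnionRect(&bounds, &bounds, &r);  // grow the running union
    }
    return bounds;  // width/height give the size of the composite bitmap
}
```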
  • For each of the windows that are associated with the shared software process, the Scraper module calls a scrape function ( FIG. 11B , block 576 ). If the window is not a layered window ( FIG. 11B , block 578 ), the scrape function sets the WS_EX_LAYERED bit that is associated with the window via the SetWindowLong API call with GWL_EXSTYLE ( FIG. 11B , block 580 ). The scrape function then scrapes the layered window ( FIG. 11B , block 582 ). In this process, the Scraper module retrieves images of the windows, including the parts that are obscured by other windows or are outside the visible desktop window (e.g., the shaded area 571 shown in FIG. 12 ).
  • After the Scraper module has estimated the bounding rectangle, it captures a screen image of the shared windows.
  • The Scraper module makes a GetDC( ) function call to GDI (the graphics device interface) that retrieves a handle to a display device context (DC) for the window.
  • The Scraper module performs an iterative capturing of each window.
  • BitBlt( ) is called for each window being shared and the image data of each window is superimposed in the target DC according to the relative positions (in the x, y, and z axes) of the windows in the screen display.
  • The BitBlt function performs a bit-block transfer of the image data corresponding to a block of pixels from the specified source device context into a destination device context.
  • The screen image constitutes the initial composite image. Performing iterative BitBlt( ) requests automatically filters out screen data from unwanted windows and from other overlapping screen content (a sketch of this capture loop follows).
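The iterative capture can be sketched as below, assuming the windows are supplied back-to-front so that later BitBlt calls paint over earlier ones exactly as the z-order dictates. GetDC returns a client-area DC; capturing window frames or layered-window content exactly involves version-specific details that are elided here.

```cpp
#include <windows.h>
#include <vector>

// Superimpose each shared window's image into one target DC, offset by the
// window's position relative to the bounding rectangle computed earlier.
HBITMAP CaptureComposite(const std::vector<HWND>& windowsBackToFront,
                         const RECT& bounds) {
    const int width  = bounds.right - bounds.left;
    const int height = bounds.bottom - bounds.top;

    HDC screenDc  = GetDC(nullptr);                // desktop display DC
    HDC targetDc  = CreateCompatibleDC(screenDc);  // composite target DC
    HBITMAP frame = CreateCompatibleBitmap(screenDc, width, height);
    HGDIOBJ old   = SelectObject(targetDc, frame);

    for (HWND hwnd : windowsBackToFront) {         // bottom of z-order first
        RECT r;
        if (!GetWindowRect(hwnd, &r)) continue;
        HDC windowDc = GetDC(hwnd);                // per the GetDC( ) call above
        BitBlt(targetDc,
               r.left - bounds.left, r.top - bounds.top,  // position in composite
               r.right - r.left, r.bottom - r.top,
               windowDc, 0, 0, SRCCOPY);           // bit-block transfer
        ReleaseDC(hwnd, windowDc);
    }

    SelectObject(targetDc, old);
    DeleteDC(targetDc);
    ReleaseDC(nullptr, screenDc);
    return frame;  // converted to a device-independent bitmap next
}
```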
  • After completing the scraping process, the Scraper module creates a device-independent bitmap of the composite image ( FIG. 11B , block 584 ). The Scraper module then creates a sample from the bytes of the bitmap ( FIG. 11B , block 586 ).
  • The sample is a bitmap in a temporal sequence of bitmaps that are captured.
  • The sample is the current composite image.
  • The current sample depicts how the shared application is displayed at the current time. Successive samples give the viewers the perception that the shared application is not just a sequence of images, but rather is a live application.
  • The Application Sharing Service presents the sample to all the subscribing network nodes, one by one ( FIG. 11B , block 588 ).
  • Samples typically are transmitted to the subscribing network nodes only if there has been a change in the composite image since the last sample was transmitted.
  • The sample typically is compressed before being transmitted to the subscribing network nodes.
  • The Scraper module then goes to sleep until the next scheduled time to scrape another image ( FIG. 11B , block 590 ).
  • The Application Sharing Service contemporaneously (i.e., during the same period of time, e.g., during the period between scraping cycles) transmits the sample of the composite image to each of the subscribing remote network nodes.
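The periodic scrape-and-send cycle might look like the loop below; the capture interval and the two helper calls are assumptions standing in for blocks 560-588 of FIGS. 11A and 11B.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical stand-ins for the scrape/composite and transmit steps.
static void ScrapeAndComposite() {}   // blocks 560-586
static void SendSampleIfChanged() {}  // block 588: skip unchanged frames

// Schematic scraper-thread loop: wake, scrape, transmit, sleep (block 590).
void ScraperLoop(std::atomic<bool>& running) {
    using namespace std::chrono;
    const milliseconds interval(200);  // assumed capture period, not specified
    while (running.load()) {
        const auto next = steady_clock::now() + interval;
        ScrapeAndComposite();
        SendSampleIfChanged();
        std::this_thread::sleep_until(next);  // sleep until the next scheduled scrape
    }
}
```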
  • FIG. 13 shows an embodiment of a method by which the Scraper module determines the current z-order of the windows that are associated with a shared software process ( FIG. 11B , block 560 ).
  • The Scraper module starts with an initial list of all top-level windows that are associated with the shared software process ( FIG. 13 , block 600 ).
  • The initial list may be obtained from a previous iteration of the scraping process shown in FIGS. 11A and 11B .
  • Alternatively, the initial list may be obtained directly by querying the application environment.
  • The Scraper module is implemented in a Microsoft® Windows® application environment.
  • Each software process thread has an entry point function called WinMain, which registers the window class for the main window by calling the RegisterClass function and creates the main window by calling the CreateWindowEx function.
  • After creating a window, the creation function returns a window handle that has the HWND data type and uniquely identifies the window; the window handle is used to direct software process actions to the window.
  • The Scraper module obtains a handle to each of the top-level windows associated with the shared software process thread by calling the EnumWindows function, which enumerates all the windows on the desktop, and then queries the process to which each window belongs by calling GetWindowThreadProcessId.
  • The Scraper module determines a z-order of all the windows that currently are associated with the shared software process ( FIG. 13 , block 602 ).
  • The Scraper module obtains a handle to each of the windows on the desktop window by calling the EnumChildWindows function, which enumerates the child windows that belong to the desktop window in z-order by passing the handle to each child window, in turn, to a callback function of the Scraper module.
  • The Scraper module recursively iterates through all the windows on the desktop and identifies those windows that belong to the shared software application. In this process, the Scraper module identifies the children of the desktop that correspond to the top-level windows of the shared software application. For each of the matching top-level windows, the Scraper module calls a function that recursively iterates through the children of the matching top-level window.
  • The Scraper module sorts the initial list of top-level windows that are associated with the software process according to the determined z-order ( FIG. 13 , block 604 ).
  • The Scraper module appends to the sorted list any of the windows in the initial list of top-level windows that are not included in the sorted list ( FIG. 13 , block 606 ). This step is used to cover any windows (e.g., a window that was deleted after the sorting process started) that might have been missed.
  • The Scraper module replaces the initial list with the sorted list of top-level windows ( FIG. 13 , block 608 ).
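A compressed sketch of this z-order pass: EnumWindows walks top-level windows in top-to-bottom z-order, and GetWindowThreadProcessId filters for the shared process. The recursion into child windows described above is elided.

```cpp
#include <windows.h>
#include <vector>

struct ZOrderQuery {
    DWORD sharedPid;                 // process ID of the shared software process
    std::vector<HWND> topToBottom;   // z-ordered window list being built
};

static BOOL CALLBACK CollectWindow(HWND hwnd, LPARAM lparam) {
    auto* q = reinterpret_cast<ZOrderQuery*>(lparam);
    DWORD pid = 0;
    GetWindowThreadProcessId(hwnd, &pid);  // which process owns this window?
    if (pid == q->sharedPid && IsWindowVisible(hwnd))
        q->topToBottom.push_back(hwnd);    // enumeration order is z-order
    return TRUE;                           // continue enumeration
}

std::vector<HWND> SharedWindowsInZOrder(DWORD sharedPid) {
    ZOrderQuery q{sharedPid, {}};
    EnumWindows(CollectWindow, reinterpret_cast<LPARAM>(&q));
    return q.topToBottom;  // reverse this for back-to-front compositing
}
```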
  • If the viewer has only view access, the Application Sharing Service on the sharer network node only transmits the composite images of the shared window content (in the form of samples) to the subscribing ones of the viewer network nodes.
  • In that case, the viewers on the viewer network nodes can only passively view the composite images of the shared windows on the sharer's display.
  • If the viewer has control access, the Application Sharing Service on the sharer network node likewise transmits the composite images of the shared window content (in the form of samples) to the subscribing ones of the viewer network nodes.
  • In addition, the Scraper module combines commands that are received from the viewer network nodes with the commands that are generated by the sharer on the sharer network node, and passes the combined set of commands to the shared application. This allows the viewers to control, edit, and manipulate the shared application on the sharer network node.
  • The commands typically are derived from events that are generated by one or more user input devices (e.g., a keyboard, a computer mouse, a touchpad, and a touch screen) on the viewer and sharer network nodes.
  • FIG. 14 shows an embodiment 610 of the first network node 12 and an embodiment 612 of the second network node 14 .
  • The first network node 610 (which is referred to as the “sharer network node 610 ”) shares window content 614 that is associated with a shared process 616 with the second network node 612 (which is referred to as the “viewer network node 612 ”).
  • The sharer network node 610 includes a display 618 on which the window content 614 is presented, a display process 620 , an embodiment 622 of the communications application 26 , and a network layer 624 .
  • The viewer network node 612 includes a display 626 , a display process 628 , an embodiment 630 of the communications application 26 , and a network layer 632 .
  • The display processes 620 , 628 provide the display facilities of the sharer network node 610 and the viewer network node 612 , respectively.
  • The display facilities control the writing of visual contents on the sharer and viewer displays 618 , 626 .
  • The display facilities include a graphic device interface (e.g., the GDI available in Microsoft® Windows® application environments) that provides functions that can be called by software processes in order to present visual content on the displays 618 , 626 .
  • The network layers 624 , 632 provide the networking facilities of the sharer network node 610 and the viewer network node 612 , respectively.
  • The networking facilities include, for example, networking communications protocol stacks and networking hardware that perform processes associated with sending and receiving information over the network 18 .
  • The communications applications 622 , 630 respectively provide various communications facilities (including application sharing facilities) to the sharer network node 610 and the viewer network node 612 .
  • The communications application 622 on the sharer network node 610 generates a composite image 634 of the shared window content 614 on the sharer's display, transmits the composite image 634 over the network 18 to the viewer network node 612 for presentation on the viewer's display 626 , and grants the viewer remote control access to the shared window content 614 .
  • The communications application 630 on the viewer network node 612 controls the presentation of the composite image 634 on the display 626 , transforms user input into commands, and transmits the commands to the sharer network node 610 .
  • FIG. 15 shows an embodiment of an application sharing method that is implemented by the sharer network node 610 in the application sharing context shown in FIG. 14 .
  • The communications application 622 on the sharer network node 610 captures a composite image of the windows that are associated with the shared software process ( FIG. 15 , block 637 ).
  • The window scraping process described above is used to capture the composite image.
  • The communications application 622 transmits the composite image to the viewer network node 612 ( FIG. 15 , block 639 ).
  • The communications application passes a sample of the composite image to the network layer 624 , which converts the sample into a network format and transmits the converted sample to the viewer network node 612 over the network 18 .
  • The communications application 622 receives commands that are derived from local input device events generated on the sharer network node 610 ( FIG. 15 , block 640 ). Contemporaneously, the communications application 622 receives commands that are derived from remote input device events that are generated on the viewer network node 612 ( FIG. 15 , block 642 ).
  • The commands are directives or instructions to perform tasks, where the directives or instructions are derived from interpretations of input device events that are initiated by user actions (or inputs) on one or more input devices (e.g., clicking one or more buttons of a computer mouse or keys of a computer keyboard).
  • Some pointer input device events, such as computer mouse, touchpad, and touch screen events, are tied to the position of a pointer (or cursor) in a graphical user interface that is presented on a display.
  • These types of input device events typically are converted into commands that include both an input type parameter value that describes the type of input (e.g., left-click, right-click, left button double-click, scroll wheel, etc.) and a location parameter value that describes where the input was entered with respect to a context-dependent coordinate system.
  • The operating systems on the sharer and viewer network nodes 610 , 612 typically convert the pointer input device events into user commands, where the location parameter values are defined with respect to the coordinate system of the main window of the graphical user interface (e.g., the desktop window in a Microsoft® Windows® application environment).
  • The sharer's input commands that are received by the communications application 622 are the operating-system-generated user commands; these commands also are used to control the operation of user mode software processes (e.g., the shared process 616 ) on the sharer network node 610 .
  • the communications application 630 on the viewer network node 612 remaps the location parameter values in the operating system generated commands from the coordinate system of the main window of the graphical user interface on the display 626 to a coordinate system of the composite image 634 .
  • the location parameter values are remapped to a coordinate system that has an origin (0,0) at the upper left corner of the composite image 634 and has x and y axes that extend along the top and left edges of the composite image 634 , respectively.
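  • A minimal sketch of this remapping follows, assuming the composite image's position on the viewer's desktop is known; the Command struct and its field names are illustrative, not the patent's data model.

```cpp
// Translate a pointer command from the desktop (main window) coordinate
// system into composite-image coordinates with origin (0,0) at the image's
// upper-left corner.
struct Command {
    int inputType;  // e.g., left-click, right-click, scroll wheel
    int x, y;       // location parameter values
};

Command RemapToCompositeCoords(Command cmd, int imageLeft, int imageTop) {
    cmd.x -= imageLeft;  // offset of the composite image on the viewer's desktop
    cmd.y -= imageTop;
    return cmd;
}
```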
  • the communications application 622 processes the received commands into a command sequence ( FIG. 15 , block 644 ).
  • the communications application 622 typically remaps the location parameter values in the viewer's commands from the coordinate system of the composite image 634 to the coordinate system of the main window of the graphical user interface on the display 618 .
  • the communications application then arranges the received commands into a sequence in order of time of receipt, and stores the resulting sequence of commands into a memory buffer.
  • the communications application 622 passes the command sequence to the shared process 616 ( FIG. 15 , block 646 ).
  • the communications application typically invokes Win32 API functions that allow it to specify the window to which Windows will send each command.
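  • The following C++ sketch shows one plausible shape of this step: the sharer-local and viewer-remote command streams are merged in order of time of receipt and each command is delivered to a designated window with the Win32 PostMessage function. The TimedCommand struct is an assumption; the patent does not specify the implementation at this level of detail.

```cpp
#include <windows.h>
#include <algorithm>
#include <vector>

struct TimedCommand {
    DWORD receiptTime;  // time of receipt on the sharer node
    UINT message;       // e.g., WM_LBUTTONDOWN
    int x, y;           // coordinates after remapping to the sharer's main window
};

void PassCommandSequence(HWND sharedWindow, std::vector<TimedCommand> local,
                         const std::vector<TimedCommand>& remote) {
    // Arrange all received commands into a sequence in order of time of receipt.
    local.insert(local.end(), remote.begin(), remote.end());
    std::sort(local.begin(), local.end(),
              [](const TimedCommand& a, const TimedCommand& b) {
                  return a.receiptTime < b.receiptTime;
              });

    // Specify the window that Windows will send each command to.
    for (const TimedCommand& c : local) {
        PostMessage(sharedWindow, c.message, 0, MAKELPARAM(c.x, c.y));
    }
}
```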
  • the shared process 616 calls one or more graphical device interface functions that are provided by the display process 620 to present the windows that are associated with the shared software process 616 on the sharer display 618 in accordance with the received command sequence ( FIG. 15 , block 648 ).
  • FIG. 16 shows an embodiment of an exemplary network communications environment 10 that includes the first network node 12 (referred to as the “first client network node”), the second network node 14 (referred to as the “second client network node”), and a virtual environment creator 16 that are interconnected by a network 18 .
  • the first client network node 12 and the second client network node 14 are configured as described above in connection with FIG. 1 .
  • the virtual environment creator 16 includes at least one server network node 28 that provides a network infrastructure service environment 30 .
  • the communications application 26 and the network infrastructure service environment 30 together provide a platform (referred to herein as “the platform”) for creating a spatial virtual communication environment (also referred to herein simply as a “virtual environment”).
  • the network infrastructure service environment 30 manages sessions of the first and second client nodes 12 , 14 in a virtual area 32 in accordance with a virtual area application 34 .
  • the virtual area application 34 is hosted by the virtual area 32 and includes a description of the virtual area 32 .
  • the communications applications 26 operating on the first and second client network nodes 12 , 14 present respective views of the virtual area 32 in accordance with data received from the network infrastructure service environment 30 and provide respective interfaces for receiving commands from the communicants.
  • the communicants typically are represented in the virtual area 32 by respective avatars, which move about the virtual area 32 in response to commands that are input by the communicants at their respective network nodes.
  • Each communicant's view of the virtual area 32 typically is presented from the perspective of the communicant's avatar, which increases the level of immersion experienced by the communicant.
  • Each communicant typically is able to view any part of the virtual area 32 around his or her avatar.
  • the communications applications 26 establish realtime data stream connections between the first and second client network nodes 12 , 14 and other network nodes sharing the virtual area 32 based on the positions of the communicants' avatars in the virtual area 32 .
  • the network infrastructure service environment 30 also maintains a relationship database 36 that contains records 38 of interactions between communicants. Each interaction record 38 describes the context of an interaction between a pair of communicants.
  • the network 18 may include any of a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN) (e.g., the internet).
  • the network 18 typically includes a number of different computing platforms and transport facilities that support the transmission of a wide variety of different media types (e.g., text, voice, audio, and video) between network nodes.
  • the communications application 26 typically operates on a client network node that includes software and hardware resources which, together with administrative policies, user preferences (including preferences regarding the exportation of the user's presence and the connection of the user to areas and other users), and other settings, define a local configuration that influences the administration of realtime connections with other network nodes.
  • the network connections between network nodes may be arranged in a variety of different stream handling topologies, including a peer-to-peer architecture, a server-mediated architecture, and hybrid architectures that combine aspects of peer-to-peer and server-mediated architectures. Exemplary topologies of these types are described in U.S. application Ser. Nos. 11/923,629 and 11/923,634, both of which were filed on Oct. 24, 2007.
  • the network infrastructure service environment 30 typically includes one or more network infrastructure services that cooperate with the communications applications 26 in the process of establishing and administering network connections between the client nodes 12 , 14 and other network nodes (see FIG. 13 ).
  • the network infrastructure services may run on a single network node or may be distributed across multiple network nodes.
  • the network infrastructure services typically run on one or more dedicated network nodes (e.g., a server computer or a network device that performs one or more edge services, such as routing and switching). In some embodiments, however, one or more of the network infrastructure services run on at least one of the communicants' network nodes.
  • the network infrastructure services that are included in the exemplary embodiment of the network infrastructure service environment 30 are an account service, a security service, an area service, a rendezvous service, and an interaction service.
  • the account service manages communicant accounts for the virtual environment.
  • the account service also manages the creation and issuance of authentication tokens that can be used by client network nodes to authenticate themselves to any of the network infrastructure services.
  • the security service controls communicants' access to the assets and other resources of the virtual environment.
  • the access control method implemented by the security service typically is based on one or more of capabilities (where access is granted to entities having proper capabilities or permissions) and an access control list (where access is granted to entities having identities that are on the list). After a particular communicant has been granted access to a resource, that communicant typically uses the functionality provided by the other network infrastructure services to interact in the network communications environment 10 .
  • the area service administers virtual areas.
  • the area service remotely configures the communications applications 26 operating on the first and second client network nodes 12 , 14 in accordance with the virtual area application 34 subject to a set of constraints 47 (see FIG. 13 ).
  • the constraints 47 typically include controls on access to the virtual area.
  • the access controls typically are based on one or more of capabilities (where access is granted to communicants or client nodes having proper capabilities or permissions) and an access control list (where access is granted to communicants or client nodes having identities that are on the list).
  • the area service also manages network connections that are associated with the virtual area subject to the capabilities of the requesting entities, maintains global state information for the virtual area, and serves as a data server for the client network nodes participating in a shared communication session in a context defined by the virtual area 32 .
  • the global state information includes a list of all the objects that are in the virtual area and their respective locations in the virtual area.
  • the area service sends instructions that configure the client network nodes.
  • the area service also registers and transmits initialization information to other client network nodes that request to join the communication session. In this process, the area service may transmit to each joining client network node a list of components (e.g., plugins) that are needed to render the virtual area 32 on the client network node in accordance with the virtual area application 34 .
  • the area service also ensures that the client network nodes can synchronize to a global state if a communications fault occurs.
  • the area service typically manages communicant interactions with virtual areas via governance rules that are associated with the virtual areas.
  • the rendezvous service manages the collection, storage, and distribution of presence information and provides mechanisms for network nodes to communicate with one another (e.g., by managing the distribution of connection handles) subject to the capabilities of the requesting entities.
  • the rendezvous service typically stores the presence information in a presence database.
  • the rendezvous service typically manages communicant interactions with each other via communicant privacy preferences.
  • the interaction service maintains the relationship database 36 that contains the records 38 of interactions between communicants. For every interaction between communicants, one or more services of the network infrastructure service environment 30 (e.g., the area service) transmit interaction data to the interaction service. In response, the interaction service generates one or more respective interaction records and stores them in the relationship database. Each interaction record describes the context of an interaction between a pair of communicants.
  • an interaction record contains an identifier for each of the communicants, an identifier for the place of interaction (e.g., a virtual area instance), a description of the hierarchy of the interaction place (e.g., a description of how the interaction room relates to a larger area), start and end times of the interaction, and a list of all files and other data streams that are shared or recorded during the interaction.
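  • An illustrative in-memory layout for such an interaction record (field names and types are assumptions, not the patent's schema):

```cpp
#include <ctime>
#include <string>
#include <vector>

struct InteractionRecord {
    std::string communicantA;    // identifier for each of the communicants
    std::string communicantB;
    std::string placeId;         // place of interaction, e.g., a virtual area instance
    std::string placeHierarchy;  // how the interaction room relates to a larger area
    std::time_t startTime;       // start and end times of the interaction
    std::time_t endTime;
    std::vector<std::string> sharedItems;  // files and data streams shared or recorded
};
```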
  • for each interaction, the interaction service tracks when it occurred, where it occurred, and what happened during the interaction in terms of the communicants involved (e.g., entering and exiting), the objects that were activated and deactivated, and the files that were shared.
  • the interaction service also supports queries on the relationship database 36 subject to the capabilities of the requesting entities.
  • the interaction service presents the results of queries on the interaction database records in a sorted order (e.g., most frequent or most recent) based on virtual area.
  • the query results can be used to drive a frequency sort of the contacts whom a communicant has met in each virtual area, as well as sorts of the people the communicant has met regardless of virtual area and sorts of the virtual areas the communicant frequents most often.
  • the query results also may be used by application developers as part of a heuristic system that automates certain tasks based on relationships.
  • exemplary heuristics of this type include a heuristic that permits communicants who have visited a particular virtual area more than five times to enter without knocking by default, and a heuristic that allows communicants who were present in an area at a particular time to modify and delete files created by another communicant who was present in the same area at the same time.
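  • As a minimal sketch, the first heuristic might reduce to a predicate over a visit count obtained by querying the relationship database 36 ; the function name and threshold handling below are illustrative assumptions.

```cpp
// Permit entry without knocking by default once the communicant's visit
// count for the virtual area (from a relationship-database query) exceeds five.
bool MayEnterWithoutKnocking(int visitCount) {
    return visitCount > 5;
}
```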
  • Queries on the relationship database 36 can be combined with other searches. For example, queries on the relationship database may be combined with queries on contact history data generated for interactions with contacts using a communication system (e.g., Skype, Facebook, and Flickr) that is outside the domain of the network infrastructure service environment 30 .
  • the communications application 26 and the network infrastructure service environment 30 typically administer the realtime connections with network nodes in a communication context that is defined by an instance of a virtual area.
  • the virtual area instance may correspond to an abstract (non-geometric) virtual space that is defined with respect to abstract coordinates.
  • the virtual area instance may correspond to a visual virtual space that is defined with respect to one-, two- or three-dimensional geometric coordinates that are associated with a particular visualization.
  • Abstract virtual areas may or may not be associated with respective visualizations, whereas visual virtual areas are associated with respective visualizations.
  • communicants typically are represented by respective avatars in a virtual area that has an associated visualization.
  • the avatars move about the virtual area in response to commands that are input by the communicants at their respective network nodes.
  • the communicant's view of a virtual area instance typically is presented from the perspective of the communicant's avatar, and each communicant typically is able to view any part of the visual virtual area around his or her avatar, increasing the level of immersion that is experienced by the communicant.
  • FIG. 17 shows an embodiment of an exemplary network node that is implemented by a computer system 48 .
  • the computer system 48 includes a display monitor 50 , a computer mouse 52 , a keyboard 54 , speakers 56 , 58 , and a microphone 60 .
  • the display monitor 50 displays a graphical user interface 62 .
  • the graphical user interface 62 is a windows-based graphical user interface that can include multiple windows, icons, and a pointer 64 .
  • the graphical user interface 62 presents a two-dimensional depiction of a shared virtual area 66 that is associated with a three-dimensional visualization representing an art gallery.
  • Communicants are represented in the virtual area 66 by respective avatars 68 , 70 , 72 , each of which may have a respective role (e.g., a curator, an artist, and a visitor) in the context of the virtual area 66 .
  • the virtual area 66 includes zones 74 , 76 , 78 , 80 , 82 that are associated with respective rules that govern the switching of realtime data streams between the network nodes that are represented by the avatars 68 - 72 in the virtual area 66 .
  • the switching rules dictate how local connection processes executing on each of the network nodes establish communications with the other network nodes based on the locations of the communicants' avatars 68 - 72 in the zones 74 - 82 of the virtual area 66 .
  • a virtual area is defined by a specification that includes a description of geometric elements of the virtual area and one or more rules, including switching rules and governance rules.
  • the switching rules govern realtime stream connections between the network nodes.
  • the governance rules control a communicant's access to resources, such as the virtual area itself, regions within the virtual area, and objects within the virtual area.
  • the geometric elements of the virtual area are described in accordance with the COLLADA-Digital Asset Schema Release 1.4.1 April 2006 specification (available from http://www.khronos.org/collada/), and the switching rules are described using an extensible markup language (XML) text format (referred to herein as a virtual space description format (VSDL)) in accordance with the COLLADA Streams Reference specification described in U.S. application Ser. Nos. 11/923,629 and 11/923,634.
  • the geometric elements of the virtual area typically include physical geometry and collision geometry of the virtual area.
  • the physical geometry describes the shape of the virtual area.
  • the physical geometry typically is formed from surfaces of triangles, quadrilaterals, or polygons. Colors and textures are mapped onto the physical geometry to create a more realistic appearance for the virtual area. Lighting effects may be provided, for example, by painting lights onto the visual geometry and modifying the texture, color, or intensity near the lights.
  • the collision geometry describes invisible surfaces that determine the ways in which objects can move in the virtual area.
  • the collision geometry may coincide with the visual geometry, correspond to a simpler approximation of the visual geometry, or relate to application-specific requirements of a virtual area designer.
  • the switching rules typically include a description of conditions for connecting sources and sinks of realtime data streams in terms of positions in the virtual area.
  • Each rule typically includes attributes that define the realtime data stream type to which the rule applies and the location or locations in the virtual area where the rule applies.
  • each of the rules optionally may include one or more attributes that specify a required role of the source, a required role of the sink, a priority level of the stream, and a requested stream handling topology.
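  • A possible in-memory representation of a switching rule carrying the attributes just described (the names below are illustrative; the actual rules are expressed in the VSDL text format):

```cpp
#include <optional>
#include <string>

struct SwitchingRule {
    std::string streamType;                 // realtime data stream type the rule applies to
    std::string location;                   // location(s) in the virtual area where it applies
    std::optional<std::string> sourceRole;  // required role of the source
    std::optional<std::string> sinkRole;    // required role of the sink
    std::optional<int> priority;            // priority level of the stream
    std::optional<std::string> topology;    // requested stream handling topology
};
```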
  • in the absence of an explicit switching rule for a particular part of the virtual area, one or more implicit or default switching rules may apply to that part of the virtual area.
  • One exemplary default switching rule is a rule that connects every source to every compatible sink within an area, subject to policy rules.
  • Policy rules may apply globally to all connections between the client nodes or only to respective connections with individual client nodes.
  • An example of a policy rule is a proximity policy rule that only allows connections of sources with compatible sinks that are associated with respective objects that are within a prescribed distance (or radius) of each other in the virtual area.
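  • A hedged sketch of such a proximity check, assuming object positions are simple two-dimensional points:

```cpp
#include <cmath>

struct Position { double x, y; };

// Allow a source-sink connection only if the associated objects are within
// the prescribed distance (radius) of each other in the virtual area.
bool ProximityRuleAllows(const Position& source, const Position& sink,
                         double prescribedRadius) {
    return std::hypot(source.x - sink.x, source.y - sink.y) <= prescribedRadius;
}
```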
  • governance rules are associated with a virtual area to control who has access to the virtual area, who has access to its contents, what is the scope of that access to the contents of the virtual area (e.g., what can a user do with the contents), and what are the follow-on consequences of accessing those contents (e.g., record keeping, such as audit logs, and payment requirements).
  • an entire virtual area or a zone of the virtual area is associated with a “governance mesh.”
  • a governance mesh is implemented in a way that is analogous to the implementation of the zone mesh described in U.S. application Ser. Nos. 11/923,629 and 11/923,634.
  • a governance mesh enables a software application developer to associate governance rules with a virtual area or a zone of a virtual area. This avoids the need for the creation of individual permissions for every file in a virtual area and avoids the need to deal with the complexity that potentially could arise when there is a need to treat the same document differently depending on the context.
  • a virtual area is associated with a governance mesh that associates one or more zones of the virtual area with a digital rights management (DRM) function.
  • the DRM function controls access to one or more of the virtual area or one or more zones within the virtual area or objects within the virtual area.
  • the DRM function is triggered every time a communicant crosses a governance mesh boundary within the virtual area.
  • the DRM function determines whether the triggering action is permitted and, if so, what is the scope of the permitted action, whether payment is needed, and whether audit records need to be generated.
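  • One hypothetical shape for that determination, with all types and inputs invented for illustration under a capability-based access model:

```cpp
struct DrmDecision {
    bool permitted;            // is the triggering action allowed at all?
    bool paymentRequired;      // does the permitted action require payment?
    bool auditRecordRequired;  // must an audit record be generated?
};

// Evaluate a governance-mesh boundary crossing.
DrmDecision OnGovernanceBoundaryCrossed(bool hasCapability,
                                        bool zoneRequiresPayment,
                                        bool zoneIsAudited) {
    DrmDecision d{};
    d.permitted = hasCapability;
    d.paymentRequired = d.permitted && zoneRequiresPayment;
    d.auditRecordRequired = d.permitted && zoneIsAudited;
    return d;
}
```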
  • the associated governance mesh is configured such that if a communicant is able to enter the virtual area he or she is able to perform actions on all the documents that are associated with the virtual area, including manipulating the documents, viewing the documents, downloading the documents, deleting the documents, modifying the documents and re-uploading the documents.
  • the virtual area can become a repository for information that was shared and discussed in the context defined by the virtual area.
  • the communications application 26 includes:
  • a. local Human Interface Devices (HIDs) and audio playback devices;
  • b. a So3D engine; and
  • c. a system database and storage facility.
  • the local HIDs enable a communicant to input commands and other signals into the client network node while participating in a virtual area communications session.
  • exemplary HIDs include a computer keyboard, a computer mouse, a touch screen display, and a microphone.
  • the audio playback devices enable a communicant to playback audio signals that are received during a virtual area communications session.
  • Exemplary audio playback devices include audio processing hardware (e.g., a sound card) for manipulating (e.g., mixing and applying special effects) audio signals, and speakers for outputting sounds.
  • the So3D engine is a three-dimensional visualization engine that controls the presentation of a respective view of a virtual area and objects in the virtual area on a display monitor.
  • the So3D engine typically interfaces with a graphical user interface driver and the HID devices to present the views of the virtual area and to allow the communicant to control the operation of the communications application 26 .
  • the So3D engine receives graphics rendering instructions from the area service.
  • the So3D engine also may read a local communicant avatar database that contains images needed for rendering the communicant's avatar in the virtual area. Based on this information, the So3D engine generates a visual representation (i.e., an image) of the virtual area and the objects in the virtual area from the point of view (position and orientation) of the communicant's avatar in the virtual area.
  • the visual representation typically is passed to the graphics rendering components of the operating system, which drive the graphics rendering hardware to render the visual representation of the virtual area on the client network node.
  • the communicant can control the presented view of the virtual area by inputting view control commands via a HID device (e.g., a computer mouse).
  • the So3D engine updates the view of the virtual area in accordance with the view control commands.
  • the So3D engine also updates the graphic representation of the virtual area on the display monitor in accordance with updated object position information received from the area service.
  • the system database and storage facility stores various kinds of information that is used by the platform.
  • Exemplary information that typically is stored by the storage facility includes the presence database, the relationship database, an avatar database, a real user id (RUID) database, an art cache database, and an area application database. This information may be stored on a single network node or it may be distributed across multiple network nodes.
  • a communicant typically connects to the network 18 from a client network node.
  • the client network node typically is implemented by a general-purpose computer system or a dedicated communications computer system (or “console”, such as a network-enabled video game console).
  • the client network node executes communications processes that establish realtime data stream connections with other network nodes and typically executes visualization rendering processes that present a view of each virtual area entered by the communicant.
  • FIG. 18 shows an embodiment of a client network node that is implemented by a computer system 120 .
  • the computer system 120 includes a processing unit 122 , a system memory 124 , and a system bus 126 that couples the processing unit 122 to the various components of the computer system 120 .
  • the processing unit 122 may include one or more data processors, each of which may be in the form of any one of various commercially available computer processors.
  • the system memory 124 includes one or more computer-readable media that typically are associated with a software application addressing space that defines the addresses that are available to software applications.
  • the system memory 124 may include a read only memory (ROM) that stores a basic input/output system (BIOS) that contains start-up routines for the computer system 120 , and a random access memory (RAM).
  • the system bus 126 may be a memory bus, a peripheral bus or a local bus, and may be compatible with any of a variety of bus protocols, including PCI, VESA, Microchannel, ISA, and EISA.
  • the computer system 120 also includes a persistent storage memory 128 (e.g., a hard drive, a floppy drive, a CD ROM drive, magnetic tape drives, flash memory devices, and digital video disks) that is connected to the system bus 126 and contains one or more computer-readable media disks that provide non-volatile or persistent storage for data, data structures and computer-executable instructions.
  • a communicant may interact (e.g., input commands or data) with the computer system 120 using one or more input devices 130 (e.g., one or more keyboards, computer mice, microphones, cameras, joysticks, physical motion sensors such as Wii input devices, and touch pads). Information may be presented through a graphical user interface (GUI) that is presented to the communicant on a display monitor 132 , which is controlled by a display controller 134 .
  • the computer system 120 also may include other input/output hardware (e.g., peripheral output devices, such as speakers and a printer).
  • the computer system 120 connects to other network nodes through a network adapter 136 (also referred to as a “network interface card” or NIC).
  • a number of program modules may be stored in the system memory 124 , including application programming interfaces 138 (APIs), an operating system (OS) 140 (e.g., the Windows XP® operating system available from Microsoft Corporation of Redmond, Wash. U.S.A.), the communications application 26 , drivers 142 (e.g., a GUI driver), network transport protocols 144 , and data 146 (e.g., input data, output data, program data, a registry, and configuration settings).
  • the one or more server network nodes of the virtual environment creator 16 are implemented by respective general-purpose computer systems of the same type as the client network node 120 , except that each server network node typically includes one or more server software applications.
  • the one or more server network nodes of the virtual environment creator 16 are implemented by respective network devices that perform edge services (e.g., routing and switching).
  • each of the client network nodes generates a respective set of realtime data streams (e.g., motion data streams, audio data streams, chat data streams, file transfer data streams, and video data streams).
  • each communicant manipulates one or more input devices (e.g., the computer mouse 52 and the keyboard 54 ) that generate motion data streams, which control the movement of his or her avatar in the virtual area 66 .
  • the communicant's voice and other sounds that are generated locally in the vicinity of the computer system 48 are captured by the microphone 60 .
  • the microphone 60 generates audio signals that are converted into realtime audio streams. Respective copies of the audio streams are transmitted to the other network nodes that are represented by avatars in the virtual area 66 .
  • the sounds that are generated locally at these other network nodes are converted into real-time audio signals and transmitted to the computer system 48 .
  • the computer system 48 converts the audio streams generated by the other network nodes into audio signals that are rendered by the speakers 56 , 58 .
  • the motion data streams and audio streams may be transmitted from each of the communicant nodes to the other client network nodes either directly or indirectly.
  • each of the client network nodes receives copies of the realtime data streams that are transmitted by the other client network nodes.
  • one or more of the client network nodes receives one or more stream mixes that are derived from realtime data streams that are sourced (or originated) from other ones of the network nodes.
  • the area service maintains global state information that includes a current specification of the virtual area, a current register of the objects that are in the virtual area, and a list of any stream mixes that currently are being generated by the network node hosting the area service.
  • the objects register typically includes for each object in the virtual area a respective object identifier (e.g., a label that uniquely identifies the object), a connection handle (e.g., a URI, such as an IP address) that enables a network connection to be established with a network node that is associated with the object, and interface data that identifies the realtime data sources and sinks that are associated with the object (e.g., the sources and sinks of the network node that is associated with the object).
  • the objects register also typically includes one or more optional role identifiers for each object; the role identifiers may be assigned explicitly to the objects by either the communicants or the area service, or may be inferred from other attributes of the objects or the user.
  • the objects register also includes the current position of each of the objects in the virtual area as determined by the area service from an analysis of the realtime motion data streams received from the network nodes associated with objects in the virtual area.
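  • Gathering the fields above, an objects-register entry might look like the following sketch (names and types are assumptions):

```cpp
#include <string>
#include <vector>

struct ObjectsRegisterEntry {
    std::string objectId;              // label that uniquely identifies the object
    std::string connectionHandle;      // e.g., a URI such as an IP address
    std::vector<std::string> sources;  // realtime data sources associated with the object
    std::vector<std::string> sinks;    // realtime data sinks associated with the object
    std::vector<std::string> roles;    // optional role identifiers
    double x, y;                       // current position in the virtual area
};
```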
  • the area service receives realtime motion data streams from the network nodes associated with objects in the virtual area and, based on the motion data, tracks the communicants' avatars and other objects that enter, leave, and move around in the virtual area.
  • the area service updates the objects register in accordance with the current locations of the tracked objects.
  • the area service maintains for each of the client network nodes a set of configuration data, including interface data, a zone list and the positions of the objects that currently are in the virtual area.
  • the interface data includes for each object associated with each of the client network nodes a respective list of all the sources and sinks of realtime data stream types that are associated with the object.
  • the zone list is a register of all the zones in the virtual area that currently are occupied by the avatar associated with the corresponding client network node.
  • the communications application 26 also includes a graphical navigation and interaction interface (referred to herein as a “seeker interface”) that interfaces the user with the spatial virtual communication environment.
  • the seeker interface includes navigation controls that enable the user to navigate the virtual environment and interaction controls that enable the user to control his or her interactions with other communicants in the virtual communication environment.
  • the navigation and interaction controls typically are responsive to user selections that are made using any type of input device, including a computer mouse, a touch pad, a touch screen display, a keyboard, and a video game controller.
  • the seeker interface is an application that operates on each client network node.
  • the seeker interface is a small, lightweight interface that a user can keep up and running all the time on his or her desktop.
  • the seeker interface allows the user to launch virtual area applications and provides the user with immediate access to realtime contacts and realtime collaborative places (or areas).
  • the seeker interface is integrated with realtime communications applications and/or realtime communications components of the underlying operating system such that the seeker interface can initiate and receive realtime communications with other network nodes.
  • a virtual area is integrated with the user's desktop through the seeker interface such that the user can upload files into the virtual environment created by the virtual environment creator 16 , use files stored in association with the virtual area using the native client software applications independently of the virtual environment while still present in a virtual area, and more generally treat presence and position within a virtual area as an aspect of the operating environment analogous to other operating system functions rather than as just one of several applications.
  • the spatial virtual communication environment typically can be modeled as a spatial hierarchy of places (also referred to herein as “locations”) and objects.
  • the spatial hierarchy includes an ordered sequence of levels ranging from a top level to a bottom level.
  • Each of the places in a successive one of the levels of the spatial hierarchy is contained in a respective one of the places in a preceding one of the levels.
  • Each of the objects in the spatial hierarchy is contained in a respective one of the places.
  • the levels of the spatial hierarchy typically are associated with respective visualizations that are consistent with a geographical, architectural, or urban metaphor, and are labeled accordingly.
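  • As an illustrative model of this containment structure (the types are invented for the sketch), each place holds the places of the next level down plus the objects located in it:

```cpp
#include <memory>
#include <string>
#include <vector>

struct Place {
    std::string label;                           // e.g., a building, floor, or room
    std::vector<std::unique_ptr<Place>> places;  // places at the next lower level
    std::vector<std::string> objects;            // objects contained in this place
};
```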
  • each virtual area is defined by respective meshes, some of which define elements of a physical environment (e.g., spaces, such as rooms and courtyards, that are associated with a building) that may contain objects (e.g., avatars and props, such as view screen objects and conferencing objects).
  • the navigational controls of the seeker interface allow the user to traverse a path through the virtual environment in accordance with a navigational model that is tied to the underlying spatial hierarchy of places and objects.
  • the network infrastructure service environment 30 records the path traversed by the user.
  • the network infrastructure service environment 30 records a history that includes a temporally ordered list of views of the virtual area that are presented to the user as the user navigates through the virtual area. Each view typically corresponds to a view of a respective renderable zone of the virtual area.
  • the navigation controls enable the user to move to selected ones of the zones in the history.
  • the navigation controls also include a graphical representation of a depth path that shows the location in the spatial hierarchy that corresponds to the user's current view of the virtual area.
  • the graphical representation of the depth path includes a respective user-selectable link to a respective view of each of the preceding levels in the spatial hierarchical model of the virtual area above the current view.
  • the interaction controls of the seeker interface allow the user to manage interactions with other communicants.
  • the interaction options that are available to the user typically depend on the zones in which the user has a presence. In some embodiments, the interaction options that are available to communicants who have presence in a particular zone are different from the options that are available to other communicants who do not have presence in that zone.
  • the level of detail and interactivity available to the user typically depend on whether or not the user has a presence in the particular zone.
  • if the user is outside the virtual area, the user is provided with a minimal level of detail of the interactions occurring within the virtual area (e.g., the user can see an outline of the floorplan, background textures, and plants of the area, but cannot see where other communicants are present in the area); if the user is within the virtual area but outside a particular zone of the area, the user is provided with a medium level of detail of the interactions occurring within the particular zone (e.g., the user can see where other communicants are present in the area, see a visualization of their current states (talking, typing a chat message, whether or not their headphones and microphones are turned on), and see whether any of the view screens are active); and if the user is within the particular zone of the area, the user is provided with a full level of detail of the interactions occurring within the particular zone (e.g., the user can see a thumbnail of the file being shared on a view screen, hear and speak with other communicants in the area, and see elements of a log of chat messages).
  • FIG. 19 shows an embodiment 160 of the seeker interface that is displayed in a window 162 and includes one or more tabs 164 , each of which has a browsing area 166 and a toolbar 168 .
  • Each of the tabs 164 typically is associated with a respective view of the virtual environment.
  • the view presented in the tab 164 (labeled “My Areas”) is associated with a respective set of virtual areas, which may be a default set of virtual areas in the virtual environment or it may be a set of virtual areas that is identified by a respective filter on the interaction database.
  • the tab 164 is associated with a set of three virtual areas (i.e., Acme, Sococo Help Area, and Personal Space), which may be a default set of areas that are associated with the user or may be identified by a filter that identifies all of the areas that are associated with the user (e.g., all of the areas in which the user has interacted). Additional tabs may be created by selecting the “+” button 170 .
  • the browsing area 166 of each tab shows graphic representations of the elements of the virtual environment that are associated with the tab.
  • the browsing area 166 shows top-level views 172 , 174 , 176 of the virtual areas that are associated with the tab 164 .
  • the user may navigate to the next lower level in the spatial hierarchical model of any of the virtual areas by selecting the corresponding graphic representation of the virtual area.
  • the toolbar 168 includes an adaptive set of navigational and interaction tools that automatically are selected by the seeker interface based on the current location of the user in the virtual environment.
  • the toolbar 168 includes a back button 178 , a forward button 180 , a placemarks button 182 , and a home button 184 .
  • the back button 178 corresponds to a backward control that enables the user to incrementally move backward to preceding ones of the zones in the history of the zones that were traversed by the user.
  • the forward button 180 corresponds to a forward control that enables the user to incrementally move forward to successive ones of the zones in the history of the zones that were traversed by the user.
  • the placemarks button 182 provides a placemarking control for storing links to zones and a placemark navigation control for viewing a list of links to previously placemarked zones.
  • a placemark is created by storing an image of the location shown in the current view in association with a hyperlink to the corresponding location in the virtual area.
  • when the placemarks button 182 is selected, a placemarks window is presented to the user.
  • the placemarks window includes live visualizations of all locations that have been placemarked by the user.
  • Each of the images in the placemarks window is associated with a respective user-selectable hyperlink.
  • when one of the hyperlinks in the placemarks window is selected, a view of the virtual area corresponding to the location associated with the selected hyperlink is automatically displayed in the browsing area 166 of the seeker interface window 162 .
  • the home button 184 corresponds to a control that returns the user to a view of a designated “home” location in the virtual environment (e.g., the view shown in FIG. 19 ).
  • in response to a user selection of the graphic representation 172 of the Acme virtual area shown in FIG. 19 , the platform moves the user into a default zone of the virtual area, automatically establishes a presence for the user in the default zone, and automatically establishes a network connection between the user and each of the other communicants occupying the selected zone.
  • based on the switching rules established by the area designer, the platform multiplexes the specified realtime data streams (e.g., streams from the microphones and speakers) of all the communicants in the default zone such that they can both see each other's sprites or avatars and communicate with (e.g., speak to and hear) each other.
  • the seeker interface shows a top or floorplan view of the Acme virtual area in the browsing area 166 of the tab 164 and provides the user with a default set of interaction options.
  • a presence automatically is established in a courtyard zone 190 of the virtual area, and the user's microphone and default speakers (e.g., headphones) are turned on.
  • in the floorplan view shown in FIG. 20 , the user is represented by a circular sprite 192 ; the other users in the Acme virtual area also are represented by respective circular sprites 194 - 202 .
  • the state of the user's speakers is depicted by the presence or absence of a headphones graphic 203 on the user's sprite 192 : when the speakers are on, the headphones graphic 203 is present and, when the speakers are off, the headphones graphic 203 is absent.
  • the state of the user's microphone is depicted by the presence or absence of a microphone graphic 206 on the user's sprite 192 and a series of concentric circles 204 around the user's sprite 192 : when the microphone is on, the microphone graphic 206 and the concentric circles 204 are present and, when the microphone is off, the microphone graphic 206 and the concentric circles 204 are absent.
  • the headphones graphic 203 , the concentric circles 204 , and the microphone graphic 206 serve as visual reminders of the states of the user's sound playback and microphone devices.
  • the toolbar 168 also includes a series of one or more breadcrumb buttons 207 that originate from and include the home button 184 .
  • the breadcrumb buttons 207 correspond to a hierarchical sequence of successive, user-selectable links. Each of the successive links corresponds to a view of a respective level in the hierarchical model of the virtual area in which each successive level is contained by preceding ones of the levels.
  • the breadcrumb buttons 207 include the home button 184 and an Acme button 208 that corresponds to the current view of the Acme virtual area shown in FIG. 20 .
  • the breadcrumb buttons 207 provide the user with single button selection access to respective views of different levels of the virtual environment.
  • the toolbar 168 additionally includes a button 210 and a settings button 212 .
  • when an area is selected or in focus, the button 210 appears as an iconographic representation of two people, is labeled “members,” and allows members and moderators to see the list of people associated with the area. When an audio or chat zone is in focus, the button 210 has a different image (e.g., an image of an arrow pointing downward onto a plane to represent the operation of getting) and is labeled “get”.
  • when the members button 210 is selected, a list of all the members of the Acme virtual area 166 is displayed in a user interface. The user may select any of the communicants in the list and click a get button that is presented in the user interface; in response, the platform transmits an invitation to the selected communicant to join the user in a designated one of the zones.
  • the settings button 212 provides the user with access to a set of controls for specifying default settings that are associated with the current area.
  • the user may navigate from the view of the Acme virtual area shown in FIG. 20 back to the view shown in FIG. 19 in a variety of different ways.
  • the user may select any of the back button 178 , the home button 184 , or any part of the zone 211 that is outside the boundary of the graphic representation of the Acme virtual area shown in FIG. 20 .
  • the user may navigate to any of the zones of the Acme virtual area.
  • in order to move to a zone, the user transmits a command to execute one of the zone objects displayed on the monitor (e.g., by selecting the zone and then clicking the Enter button in the toolbar or, as a shortcut, double-clicking the zone) and, in response, the platform depicts the user's avatar in the zone corresponding to the zone object.
  • the seeker interface outlines the zone (indicating to the user that it is selected) and updates the breadcrumb buttons 207 to show the selected zone location in the hierarchy. Toolbar buttons specific to the selection will also appear to the right of the breadcrumb buttons 207 .
  • the user also may interact with any objects (e.g., a screen, table, or file) that are present in a zone.
  • the user transmits a command to execute one of the objects displayed on the monitor (e.g., by selecting the object and then clicking the View button in the toolbar or, as a shortcut, double-clicking the object) and, in response, the platform performs an operation with respect to the object (e.g., presents a zoomed-in view of the object or opens an interaction interface window).
  • the seeker interface outlines or otherwise highlights the prop (indicating to the user that it is selected) and updates the breadcrumb buttons 207 to show the selected object location in the hierarchy. Toolbar buttons specific to the selection will also appear to the right of the breadcrumb buttons 207 .
  • the platform in response to the user entering a Main space 213 , the platform automatically establishes a network connection between the user and each of the other communicants occupying the selected zone.
  • the user also may enter (and thereby establish a presence in) a space by selecting the space and clicking the Enter button; this causes the platform to move the user's sprite from its current location (i.e., Courtyard) to the selected space (i.e., Main).
  • the settings of user's speakers and microphone typically do not change when the user is moved from one location to another.
  • FIG. 22 shows a situation in which the user has double-clicked a wall object 290 in the view of the Main space 213 shown in FIG. 21 .
  • the seeker interface presents in the browsing area 166 of the tab 164 a 2.5-dimensional view of the contents of the wall object 290 and areas of the Main space 213 surrounding the wall object 290 .
  • the selected wall object 290 corresponds to a North Wall of the Main space 213 .
  • the North Wall contains a pair of view screen objects 289 , 291 (labeled “2” and “3”, respectively) that are presented on the North Wall object 290 .
  • the view screen objects 289 , 291 can be used to present the contents of data files that are associated with the North Wall of the Main space 213 .
  • the 2.5-dimensional view also shows a West Wall object 293 and an East Wall object 295 that are located on the left and right sides of the North Wall object 290 , respectively.
  • Each of the West Wall object 293 and the East Wall object 295 includes a respective view screen object 293 , 295 (labeled “1” and “4”, respectively) that can be used to present the contents of respective data files.
  • the interface also shows in the minimap 256 a view of the Main space 213 and areas of the Acme space surrounding the Main space 213 .
  • the minimap 256 also shows a highlighted view 292 of the selected North Wall object 290 in the Main space 213 .
  • the breadcrumb buttons 207 shown in the toolbar 168 of the tab 164 include a North Wall button 294 that corresponds to the current level in the hierarchical spatial model of the virtual area.
  • the toolbar 168 includes a rotate left button 296 and a rotate right button 298 that allow the user to rotate the current view left and right by ninety degrees (90°) so that the user can view the contents of different walls of the Main space in the central viewing area of the 2.5-dimensional view of the Main space 213 .
  • the user also can double-click a different one of the walls that are shown in the minimap 256 in order to change the contents that are presented in the central viewing area of the 2.5-dimensional view of the Main space 213 .
  • FIG. 23 shows the seeker interface after the user has selected the view screen object 291 (labeled “3”) on the North Wall in the view of the Main space 213 shown in FIG. 22 .
  • the user may have executed the view screen object 291 either by double-clicking any part of the view screen object 291 shown in the central viewing area of the 2.5-dimensional view shown in the browsing area 166 of the tab 164 or by double-clicking the corresponding view screen object in the minimap 256 shown in FIG. 22 .
  • the browsing area 166 of the tab 164 shows a zoomed-in view of the view screen object 291 and areas of the North Wall object 290 surrounding the view screen object 291 , as shown in FIG. 23 .
  • the user may double-click on any of the areas of the North Wall object 290 surrounding the view screen object 291 in the browsing area 166 shown in FIG. 23 in order to return to the browsing area and minimap views of the Main space 213 that are shown in FIG. 22 .
  • the minimap 256 shows a 2.5-dimensional view of the contents of the wall object 290 along with areas of the Main space surrounding the wall object 290 ; this view corresponds to the preceding level in the hierarchical spatial model of the virtual area.
  • the breadcrumb buttons 207 include a Screen 3 button 302 that corresponds to the current level in the hierarchical spatial model of the virtual area.
  • the toolbar 168 includes a Share button 304 that allows a user to designate a shared data file whose contents will be presented on the view screen object 291 (i.e., Screen 3 ) and thereby allow all communicants in the Main space 213 to simultaneously share the data file.
  • the view screen object 291 shown in the browsing area 166 includes a Share link 306 that also allows the user to designate the shared data file.
  • the seeker interface opens a separate Select Source interface window 310 that allows the user to designate a data file whose contents will be shared on the view screen object 291 as described above in section IV.
  • the Select Source interface includes a text box 312 for receiving a data file identifier (e.g., a local data file storage path name or a Uniform Resource Identifier (URI)) and a Browse button 314 that enables the user to browse different locations for the data file identifier.
  • the data file identifier may be located on the client node 12 or another network node.
  • the Select Source interface window 310 also includes a favorites button 316 that allows the user to browse a list of previously bookmarked files, applications, or URIs for the data file identifier.
  • after the user has selected a data file identifier in the Select Source interface, the communications application 26 generates a thumbnail image of the selected data file:
  • the Scraper module is invoked to capture the thumbnail image of the selected data file.
  • the thumbnail image may be a sample of the main window of the shared application that is associated with the selected data file.
  • the thumbnail image is displayed on the view screen object 291 in both the browsing area 166 and the minimap 256 .
  • the designated data file corresponds to a PowerPoint® data file that contains a slide of a pie chart. The user can terminate the presentation of the data file on the view screen object 291 by selecting the clear icon 318 .
  • One or more viewers can subscribe to the windows of the shared application showing the contents of the selected data file by clicking (or double-clicking) on the thumbnail image shown on the viewscreen object 291 .
  • Each viewer can view, control, edit, and manipulate the shared window content presented on the viewscreen object 291 subject to any governance rules that are associated with the selected data file or the zone containing the viewscreen object 291 .
  • the viewer can input commands to the shared process executing on the sharer network node by using one or more input devices on the viewer's network node as described above in section IV. Assuming that realtime performance can be achieved over the respective network connections between the sharer network node and the viewer network nodes, the edits and other manipulations of the shared data file typically will appear to each of the collaborators as if they were made on the same network node.
  • the embodiments that are described herein enable application sharing with high fidelity, realtime performance, viewer immersion, and privacy protection. Some embodiments also enable multichannel application sharing in which two or more communicants share applications and screen content with each other at the same time. These embodiments typically include an interface that allows each receiver to distinguish one shared window from another.

Abstract

Ones of the windows associated with a software process are identified in a screen layout on a local display of a sharer network node. On the sharer network node, an image of the identified windows as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout is generated. The image is transmitted from the sharer network node to a viewer network node. Also, commands derived from input device events on sharer and viewer network nodes are received. The received commands are processed into a command sequence, which is passed to a shared process executing on the sharer network node. Windows associated with the shared process are presented on the sharer network node in accordance with the received command sequence. An image of the presented windows is generated and transmitted from the sharer network node to the viewer network node.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is an application for reissue of U.S. Pat. No. 8,407,605 and relates to the following co-pending patent applications, the entirety of each of which is incorporated herein by reference:
U.S. patent application Ser. No. 12/354,709, filed Jan. 15, 2009;
U.S. patent application Ser. No. 61/120,372, filed Dec. 5, 2008;
U.S. patent application Ser. No. 61/042,714, filed Apr. 5, 2008;
U.S. patent application Ser. No. 11/923,629, filed Oct. 24, 2007; and
U.S. patent application Ser. No. 11/923,634, filed Oct. 24, 2007.
BACKGROUND OF THE INVENTION
When face-to-face communications are not practical, people often rely on one or more technological solutions to meet their communications needs. These solutions typically are designed to simulate one or more aspects of face-to-face communications. Traditional telephony systems enable voice communications between callers. Instant messaging (also referred to as “chat”) communications systems enable users to communicate text messages in real time through instant message computer clients that are interconnected by an instant message server. Some instant messaging systems additionally allow users to be represented in a virtual environment by user-controllable graphic objects (referred to as “avatars”). Interactive virtual reality communication systems enable users in remote locations to communicate over multiple real-time channels and to interact with each other by manipulating their respective avatars in three-dimensional virtual spaces. Each of these modes of communications typically can handle some form of data sharing between the communicants.
A common form of data sharing is application sharing, which involves transmitting application data from one node (referred to as the “sharer node”) to one or more other nodes (referred to as “viewer nodes”). Application sharing has a variety of useful applications, including providing remote technical support, remote collaboration, and remote presentation of demonstrations, documents, and images. In some proposed systems, an application sharing program on the sharer node periodically collects drawing commands (e.g., GDI calls for drawing lines and curves, rendering fonts and handling palettes) from a chained display driver process on the sharer node, packages the drawing commands into an order packet, and sends the order packet to a respective counterpart application sharing program on each of the viewer nodes that accurately constructs the shared view of the sharer's display. Such an application sharing approach, however, requires each viewer node to render its own version of the shared application by passing the drawing commands in the order packet to a display process (e.g., the GDI interface provided by a Microsoft® Windows® operating system).
What are needed are improved application sharing apparatus and methods.
BRIEF SUMMARY OF THE INVENTION
In one aspect, the invention features a method in accordance with which ones of the windows that are associated with a software process are identified in a screen layout on a local display of a sharer network node. On the sharer network node, a composite image of the identified windows as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout is generated. The composite image is transmitted from the sharer network node to a viewer network node.
In one aspect, the invention features a method in accordance with which locally-generated commands that are derived from local input device events on a sharer network node are received. Remotely-generated commands that are derived from remote input device events on a remote viewer network node also are received. The received commands are processed into a command sequence. The command sequence is passed to a shared process executing on the sharer network node. In a screen layout on a local display of the sharer network node, one or more windows that are associated with the shared process are presented in accordance with the received command sequence. An image of the one or more windows as they are presented in the screen layout is generated. The image is transmitted from the sharer network node to the viewer network node.
The invention also features apparatus operable to implement the inventive methods described above and computer-readable media storing computer-readable instructions causing a computer to implement the inventive methods described above.
Other features and advantages of the invention will become apparent from the following description, including the drawings and the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagrammatic view of an embodiment of a network communication environment that includes a first network node and a second network node.
FIG. 2 is a flow diagram of an embodiment of an application sharing method.
FIG. 3 is a diagrammatic view of an embodiment of a screen layout on a display of a sharer network node.
FIG. 4A is a diagrammatic view of an embodiment of a screen layout on a display of a viewer network node.
FIG. 4B is a diagrammatic view of an embodiment of a screen layout on a display of a viewer network node.
FIG. 5 is a diagrammatic view of an embodiment of an application sharing class model.
FIG. 6 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
FIG. 7 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
FIG. 8 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
FIG. 9 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
FIG. 10 is a diagrammatic view of an embodiment of application sharing components implementing a method in an embodiment of an application sharing process.
FIGS. 11A and 11B are flow diagrams of an embodiment of an application sharing method.
FIG. 12 is a diagrammatic view of an embodiment of a screen layout on a display of a sharer network node.
FIG. 13 is a flow diagram of an embodiment of a method of determining the layering order of the windows associated with a shared software process.
FIG. 14 is a block diagram of an embodiment of a sharer network node implementing an embodiment of an application sharing process in connection with an embodiment of a viewer network node.
FIG. 15 is a flow diagram of an embodiment of an application sharing method.
FIG. 16 is a diagrammatic view of an embodiment of a network communication environment that includes the first network node of FIG. 1, the second network node of FIG. 1, and a virtual environment creator.
FIG. 17 is a diagrammatic view of an embodiment of a network node that includes a graphical user interface presenting a depiction of a virtual area.
FIG. 18 is a block diagram of the network communication environment of FIG. 16 that shows components of an embodiment of a client network node.
FIG. 19 is a diagrammatic view of an embodiment of a graphical user interface.
FIG. 20 is a diagrammatic view of an embodiment of a graphical user interface.
FIG. 21 is a diagrammatic view of an embodiment of a graphical user interface.
FIG. 22 is a diagrammatic view of an embodiment of a graphical user interface.
FIG. 23 is a diagrammatic view of an embodiment of a graphical user interface.
FIG. 24 is a diagrammatic view of an embodiment of a graphical user interface.
FIG. 25 is a diagrammatic view of an embodiment of a graphical user interface.
DETAILED DESCRIPTION OF THE INVENTION
In the following description, like reference numbers are used to identify like elements. Furthermore, the drawings are intended to illustrate major features of exemplary embodiments in a diagrammatic manner. The drawings are not intended to depict every feature of actual embodiments nor relative dimensions of the depicted elements, and are not drawn to scale.
I. DEFINITION OF TERMS
A “window” is a visual area of a display that typically includes a user interface. A window typically displays the output of a software process and typically enables a user to input commands or data for the software process. A window that has a parent is called a “child window.” A window that has no parent, or whose parent is the desktop window, is called a “top-level window.” A “desktop” is a system-defined window that paints the background of a graphical user interface (GUI) and serves as the base for all windows displayed by all software processes.
The term “window scraping” refers to a process of extracting data from the display output of another software process. The extraction process is performed by a “scraper” software process.
A “layering order” (also referred to as the “z-order”) is an ordering of overlapping two-dimensional objects, such as windows in a graphical user interface (GUI), along an axis (typically referred to as the “z-axis”) that is perpendicular to a display on which the GUI is presented.
“Compositing” is the combining of visual elements from separate sources into a single composite image (also referred to herein as a “frame”).
The term “obscuration” means the act or process of concealing or hiding by or as if by covering.
A “communicant” is a person who communicates or otherwise interacts with other persons over one or more network connections, where the communication or interaction may or may not occur in the context of a virtual area. A “user” is a communicant who is operating a particular network node that defines a particular perspective for descriptive purposes. A “sharer” is a communicant who is operating a sharer network node. A “viewer” is a communicant who is operating a viewer network node.
A “realtime contact” of a user is a communicant or other person who has communicated with the user via a realtime communications platform.
A “computer” is any machine, device, or apparatus that processes data according to computer-readable instructions that are stored on a computer-readable medium either temporarily or permanently. An “operating system” is a software component of a computer system that manages and coordinates the performance of tasks and the sharing of computing and hardware resources. A “software process” (also referred to as software, an application, computer software, a computer application, a program, and a computer program) is a set of instructions that a computer can interpret and execute to perform one or more specific tasks. A software process may have one or more “threads” of execution. A “shared software process” is a software process whose output is shared with a viewer network node. A “computer data file” is a block of information that durably stores data for use by a software application.
A “database” is an organized collection of records that are presented in a standardized format that can be searched by computers. A database may be stored on a single computer-readable data storage medium on a single computer or it may be distributed across multiple computer-readable data storage media on one or more computers.
A “data sink” (referred to herein simply as a “sink”) is any of a device (e.g., a computer), part of a device, or software that receives data.
A “data source” (referred to herein simply as a “source”) is any of a device (e.g., a computer), part of a device, or software that originates data.
A “network node” (also referred to simply as a “node”) is a junction or connection point in a communications network. Exemplary network nodes include, but are not limited to, a terminal, a computer, and a network switch. A “server” network node is a host computer on a network that responds to requests for information or service. A “client” network node is a computer on a network that requests information or service from a server. A “network connection” is a link between two communicating network nodes. The term “local network node” refers to a network node that currently is the primary subject of discussion. The term “remote network node” refers to a network node that is connected to a local network node by a network communications link. A “connection handle” is a pointer or identifier (e.g., a uniform resource identifier (URI)) that can be used to establish a network connection with a communicant, resource, or service on a network node. A “sharer network node” is a network node that is sharing content with another network node, which is referred to as a “viewer network node.” A “network communication” can include any type of information (e.g., text, voice, audio, video, electronic mail message, data file, motion data stream, and data packet) that is transmitted or otherwise conveyed from one network node to another network node over a network connection.
A “communicant interaction” is any type of direct or indirect action or influence between a communicant and another network entity, which may include for example another communicant, a virtual area, or a network service. Exemplary types of communicant interactions include communicants communicating with each other in realtime, a communicant entering a virtual area, and a communicant requesting access to a resource from a network service.
“Presence” refers to the ability and willingness of a networked entity (e.g., a communicant, service, or device) to communicate, where such willingness affects the ability to detect and obtain information about the state of the entity on a network and the ability to connect to the entity.
A “realtime data stream” is data that is structured and processed in a continuous flow and is designed to be received with no delay or only imperceptible delay. Realtime data streams include digital representations of voice, video, user movements, facial expressions and other physical phenomena, as well as data within the computing environment that may benefit from rapid transmission, rapid execution, or both rapid transmission and rapid execution, including for example, avatar movement instructions, text chat, realtime data feeds (e.g., sensor data, machine control instructions, transaction streams and stock quote information feeds), and file transfers.
A “virtual area” (also referred to as an “area” or a “place”) is a representation of a computer-managed space or scene. Virtual areas typically are one-dimensional, two-dimensional, or three-dimensional representations; although in some embodiments a virtual area may correspond to a single point. Oftentimes, a virtual area is designed to simulate a physical, real-world space. For example, using a traditional computer monitor, a virtual area may be visualized as a two-dimensional graphic of a three-dimensional computer-generated space. However, virtual areas do not require an associated visualization to implement switching rules. A virtual area typically refers to an instance of a virtual area schema, where the schema defines the structure and contents of a virtual area in terms of variables and the instance defines the structure and contents of a virtual area in terms of values that have been resolved from a particular context.
A “virtual area application” (also referred to as a “virtual area specification”) is a description of a virtual area that is used in creating a virtual environment. The virtual area application typically includes definitions of geometry, physics, and realtime switching rules that are associated with one or more zones of the virtual area.
A “virtual environment” is a representation of a computer-managed space that includes at least one virtual area and supports realtime communications between communicants.
A “zone” is a region of a virtual area that is associated with at least one switching rule or governance rule. A “switching rule” is an instruction that specifies a connection or disconnection of one or more realtime data sources and one or more realtime data sinks subject to one or more conditions precedent. A switching rule controls switching (e.g., routing, connecting, and disconnecting) of realtime data streams between network nodes communicating in the context of a virtual area. A governance rule controls a communicant's access to a resource (e.g., an area, a region of an area, or the contents of that area or region), the scope of that access, and follow-on consequences of that access (e.g., a requirement that audit records relating to that access must be recorded). A “renderable zone” is a zone that is associated with a respective visualization.
A “position” in a virtual area refers to a location of a point or an area or a volume in the virtual area. A point typically is represented by a single set of one-dimensional, two-dimensional, or three-dimensional coordinates (e.g., Cartesian coordinates, polar coordinates, or spherical coordinates) that define a spot in the virtual area. A coordinate can be defined as any single or plurality of numbers that establish location. An area typically is represented by the three-dimensional coordinates of three or more coplanar vertices that define a boundary of a closed two-dimensional shape in the virtual area. A volume typically is represented by the three-dimensional coordinates of four or more non-coplanar vertices that define a closed boundary of a three-dimensional shape in the virtual area.
A “spatial state” is an attribute that describes where a user has presence in a virtual area. The spatial state attribute typically has a respective value (e.g., a zone_ID value) for each of the zones in which the user has presence.
A “placemark” is a stored reference (e.g., a hyperlink) to a location in a virtual area. A placemark typically can be selected to present a view of the associated location in the virtual area to a user. The verb “placemark” means the act or operation of creating a placemark.
In the context of a virtual area, an “object” is any type of discrete element in a virtual area that may be usefully treated separately from the geometry of the virtual area. Exemplary objects include doors, portals, windows, view screens, and speakerphones. An object typically has attributes or properties that are separate and distinct from the attributes and properties of the virtual area. An “avatar” is an object that represents a communicant in a virtual area.
The term “double-click” refers to the act or operation of entering or inputting an execution command (e.g., double-clicking the left computer mouse button or by single-clicking a user interface button associated with an execute command, e.g., enter zone or view object). The term “shift-click” refers to the act or operation of entering or inputting a selection command (e.g., clicking the left computer mouse button) while the Shift key of an alphanumeric input device is activated. The term “shift-double-click” refers to the act or operation of entering or inputting an execution command while the Shift key of an alphanumeric input device is activated.
As used herein, the term “includes” means includes but is not limited to, and the term “including” means including but not limited to. The term “based on” means based at least in part on.
II. INTRODUCTION
The embodiments that are described herein enable application sharing with high fidelity, realtime performance, viewer immersion, and privacy protection. In some embodiments, the screen content that is associated with each thread of a software process can be determined and composited into a respective composite image (or frame) that is free of other window content. The contents of the windows that are associated with one or more software application threads on a network node can be broadcast to other network nodes without risk of obscuration by the screen content (e.g., windows containing application content, messages, or dialog boxes) that might be generated by other software processes, thereby preventing corruption of shared window content by overlapping screen content that sometimes is generated by processes that are outside of the user's immediate control. This feature avoids the need for sharers to interrupt a presentation in order to remove obscuring screen content, thereby creating a more immersive collaboration experience for viewers of the shared window content. In addition, sharers do not have to worry that private information will inadvertently be shared along with the intended screen content, thereby maintaining the sharer's privacy.
Some embodiments also enable multichannel application sharing in which two or more communicants share applications and screen content with each other at the same time. These embodiments typically include an interface that allows each receiver to distinguish one shared window from another.
III. APPLICATION SHARING
A. Introduction
FIG. 1 shows an embodiment of an exemplary network communications environment 10 that includes a first network node 12 and a second network node 14 that are interconnected by a network 18. The first network node 12 includes a computer-readable memory 20, a processor 22, and input/output (I/O) hardware 24 (including a display). The processor 22 executes at least one communications application 26 that is stored in the memory 20. The second network node 14 typically is configured in substantially the same way as the first network node 12. In operation, the communications applications 26 typically provide one or more modes of communication between the first and second network nodes 12, 14 (e.g., text, voice, audio, and video). In addition, the communications applications 26 enable one-way or two-way application sharing between the first and second network nodes 12, 14.
Embodiments of the communications application may implement one or more of the following application sharing modes:
    • Share all windows created by a given process;
    • Share only one window and do not share any other windows;
    • Share a given window and all child windows, where the child windows may belong to the same process that created the given window or to a different process; and
    • Share multiple applications, composing windows created by different processes into a single frame instead of sharing each of the applications independently.
FIG. 2 shows an embodiment of an application sharing method that is implemented by the communications application 26 operating on one or both of the first and second network nodes 12, 14. This process typically is performed in response to a request from a viewer network node to share one or more documents relating to a software process operating on a sharer network node.
In accordance with the method of FIG. 2, the communications application 26 identifies ones of the windows that are associated with a software process in a screen layout on a display of the sharer network node (FIG. 2, block 500). In some embodiments, the communications application 26 may identify all of the windows in the screen layout that are associated with the software process or it may identify ones of the windows in the screen layout that match a handle that is assigned to a thread of the software process. In some cases, this process involves identifying a parent window and at least one associated child window that are created by the software process thread. In some embodiments, the communications application 26 identifies all of the windows created by a designated group of software processes.
The communications application 26 generates a composite image of the identified windows as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout (FIG. 2, block 502). In some embodiments, this process involves: determining a layering order of the identified windows in relation to each other that corresponds to the relative layering order of the identified windows in the screen layout; for each of the identified windows, retrieving a respective image of the window; and compositing the retrieved images into the composite image in accordance with the determined layering order. In some embodiments, the process of determining the layering order involves: for each of the windows in the screen layout, generating a z-order list that associates a respective z-order value with a respective window handle of the window; and deriving the layering order of the identified windows from the z-order list. The process of deriving the layering order typically involves: for each of the z-order values in the z-order list, matching the associated window handle with the window handle of a respective one of the identified windows; and ordering the identified windows in the layering order in accordance with the z-order values that are associated with the matching window handles.
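This matching-and-ordering step can be pictured with a short sketch. The following C++ fragment is a minimal illustration only, not code from the patent: it assumes the z-order list has already been captured as a map from window handles to z-order values (a smaller value meaning nearer the top), and that every identified window has an entry in that list; the function name is hypothetical.

```cpp
#include <windows.h>

#include <algorithm>
#include <unordered_map>
#include <vector>

// Sort the identified windows bottom-to-top so that painting them in
// sequence leaves the topmost identified window on top of the composite.
std::vector<HWND> OrderForCompositing(
        std::vector<HWND> identified,
        const std::unordered_map<HWND, int>& zOrderList) {
    std::sort(identified.begin(), identified.end(),
              [&](HWND a, HWND b) {
                  // Larger z-order value = deeper in the layering order.
                  return zOrderList.at(a) > zOrderList.at(b);
              });
    return identified;
}
```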
In addition to preventing obscuration by other windows, some embodiments also prevent obscuration of the selected windows as a result of the windows being partially or completely off-screen (e.g., outside the visible desktop window that contains the screen layout). For example, a respective image of each of the identified windows is stored in a respective memory buffer, and the process of generating the composite image involves retrieving each of the images from the respective memory buffers and compositing the retrieved images into the composite image. In some exemplary embodiments, each of the windows is a layered window whose screen data is stored in a respective memory buffer through a programmatic call to the Microsoft® Win32 application programming interface (API), which is available in Microsoft® Windows® operating systems versions 2000 and later. These operating systems provide an extended window style that is invoked by setting the WS_EX_LAYERED window style bit. The WS_EX_LAYERED style bit that is associated with a particular window may be set by the shared software process at window creation time (e.g., via the CreateWindowEx API call) or it may be set by the communications application 26 after creation time (e.g., via the SetWindowLong API call with GWL_EXSTYLE). With the WS_EX_LAYERED window style bit set for a window, the operating system redirects the drawing of the window into an off-screen bitmap buffer, which can then be accessed by the communications application 26 for generating the composite image. Similar layered windows functionality is available from other operating systems (e.g., the X Window System on UNIX-based operating systems).
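As a concrete illustration of the layered-window mechanism, the following C++ sketch sets the WS_EX_LAYERED style bit on an existing window after creation time, as described above; the helper name EnsureLayered is hypothetical, and the opacity call simply keeps the window's on-screen appearance unchanged.

```cpp
#include <windows.h>

// Make a window a layered window so that the operating system redirects
// its drawing into an off-screen buffer that a scraper can read.
void EnsureLayered(HWND hwnd) {
    LONG exStyle = GetWindowLong(hwnd, GWL_EXSTYLE);
    if ((exStyle & WS_EX_LAYERED) == 0) {
        SetWindowLong(hwnd, GWL_EXSTYLE, exStyle | WS_EX_LAYERED);
        // Fully opaque, so the window still renders normally on screen.
        SetLayeredWindowAttributes(hwnd, 0, 255, LWA_ALPHA);
    }
}
```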
After the composite image has been generated, the communications application 26 transmits the composite image to the viewer one of the first and second network nodes (i.e., the one of the first and second network nodes that receives the composite image) 12, 14 (FIG. 2, block 504). In some embodiments, this process involves transmitting the composite image to each of the remote viewer network nodes that requested (or subscribed) to view the shared content on the sharer network node. The composite image (and subsequent updates) is transmitted to the subscribing viewer network nodes over respective realtime data stream connections that are established between the sharer network node and the viewer network nodes.
FIG. 3 shows an embodiment of a screen layout 506 on a display 508 of the first network node 12, which is acting as a sharer network node. The screen layout 506 includes a first window 510, a second window 512 that is the parent of a child window 514, a third window 516, and a fourth window 518. The first window 510, the second window 512 and its child window 514 are created by respective threads of a first software process (e.g., the Microsoft® Word® document processing software process), the third window 516 is created by a thread of a second software process (e.g., the Microsoft® Excel® spreadsheet software process), and the fourth window 518 is created by a thread of a third software process (e.g., the Microsoft® Outlook® personal information manager software process). The first and second windows 510, 512 are the main windows that are created by their respective software process threads (e.g., created by the WinMain entry point function, optionally with the WS_EX_LAYERED bit set in a Microsoft® Windows® application environment). In a Microsoft® Windows® application environment, aspects of the appearance of the child window 514 typically are affected by its parent window 512. For example, the parent window 512 typically defines the coordinate system that is used to position the child window 514 on the display 508. In addition, in some implementations, the child window 514 may be clipped so that no part of the child window 514 appears outside the borders of its parent window 512.
FIG. 4A shows an embodiment of a first screen layout 520 on a display 522 of the second network node 14, which is acting as a viewer network node in accordance with a first application sharing example. In this example, the communicant on the first network node 12 is acting as a sharer who has elected to share all of the windows of the first software process (App1) with the second network node 14. Thus, the screen layout 520 consists of a composite image 524 that includes respective regions 526, 528, 530 that show the first window 510, the second window 512, and the child window 514, which are created by respective threads of a first software process operating on the first network node 12. The regions 526, 528, 530 show the first window 510, the second window 512, and the child window 514 as they are arranged in the screen layout 506 on the first network node 12. The regions 526, 528, 530 also are free of obscuration by any other windows in the screen layout 506 on the first network node 12. For example, the third window 516 and the fourth window 518 (which obscured underlying portions of the first window 510, the second window 512, and the child window 514 in the screen layout 506) are omitted from the composite image 524; in addition, the portions of the first window 510, the second window 512, and the child window 514 that are obscured in the screen layout 506 have been replaced by the appropriate contents of the first window 510, the second window 512, and the child window 514 in the composite image 524.
FIG. 4B shows an embodiment of a second screen layout 532 on the display 522 of the second network node 14, which is acting as a viewer network node in accordance with a second application sharing example. In this example, the communicant on the first network node 12 is acting as a sharer who has elected to share all of the windows that are associated with only one of the threads (i.e., the thread responsible for creating the second window 512 and its child window 514) of the first software process (App1). Thus, the screen layout 532 consists of a composite image 534 that includes respective regions 536, 538 that show the second window 512 and the child window 514. The regions 536, 538 show the second window 512 and the child window 514 as they are arranged in the screen layout 506 on the first network node 12. The regions 536, 538 also are free of obscuration by any other windows in the screen layout 506 on the first network node 12. For example, the first window 510, the third window 516, and the fourth window 518 are omitted from the composite image 534; in addition, the portions of the second window 512 and the child window 514 that are obscured in the screen layout 506 have been replaced by the appropriate contents of the second window 512 and the child window 514 in the composite image 534.
B. Application Sharing Embodiments
1. Introduction
In some embodiments, application sharing is initiated after a sharer network node has published one or more applications or documents that are available to share and at least one viewer has subscribed to at least one of the published applications or documents. In some embodiments, the sharer can publish a shared application or document to a viewscreen object that is associated with a virtual area and the viewer can subscribe to the shared content by activating the viewscreen object in the virtual area (e.g., by double-clicking on the viewscreen object with a user input device).
The viewer typically is granted one of two types of access to the shared content: view access, which allows the viewer to only passively view the shared content; and control access, which allows the viewer to view, control, edit, and manipulate the shared content. The type of access that is granted to the viewer can be set by the sharer or by one or more governance rules that are associated with the context in which the sharing takes place (e.g., a governance rule that is associated with a zone of a virtual area, as described below in section IV).
The shared content typically is streamed from the sharer network node to the viewer network node in the form of streaming bitmaps of the windows on the sharer's display that are associated with the shared application or document. The bitmap of each window can be streamed separately or already composited. The bitmaps typically are compressed before being streamed. If the viewer has only view access, then the viewer can only passively view images of the shared windows on the sharer's display. If the viewer has control access, then the viewer network node can transmit remote control commands that are generated by a user input device (e.g., a keyboard, a computer mouse, a touchpad, and a touch screen) to the sharer network node for controlling, editing, and manipulating the shared content on the sharer's network node.
2. Application Sharing Service
In some embodiments, the application sharing functionality of the communications application 26 is provided by a Scraper module, which is a plug-in for an Application Sharing Service that implements a platform-specific part of the application sharing. This embodiment implements the application sharing mode in which all windows created by a shared application automatically are shared with subscribing network nodes. This section describes an exemplary embodiment of the Scraper module and the Application Sharing Service that are implemented in a Microsoft® Windows® application environment that provides layered windows functionality.
a. Classes
FIG. 5 shows an embodiment of an application sharing class model. In this model, the blocks 540, 542, 544, 546, and 548 define the interface between the Application Sharing Service and the Scraper module, the blocks 550, 552, 554 are the classes that implement the Scraper module, and the blocks 556, 558 are the classes in the Application Sharing Service that use the interfaces.
b. Methods
(i) The Methods Start and Stop
Before the Application Sharing Service calls any method on the Scraper module, it calls the start method and, during shutdown, it calls the stop method. In the illustrated embodiment, no calls can be made before the start method is called or after the stop method is called.
In the start method, the Scraper module starts the thread that listens for WinEvents. The Scraper module listens for WinEvents to get notifications when windows and menus are created/destroyed and shown/hidden.
In the stop method, the Scraper module stops all the application monitors and then shuts down the thread that is listening for WinEvents.
When a WinEvent notification is received, the Scraper module gets the thread identifier (ID) and the process ID for the window. The Scraper module then looks up a monitor by the process ID and notifies the application monitor about the event. In response to a notification that a window was destroyed, the Scraper module notifies all of the application monitors of the event since the process and thread IDs are not available.
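One plausible realization of this WinEvent listener is sketched below in C++. It is an assumption-laden illustration rather than the patent's code: the NotifyMonitors hand-off is a hypothetical stand-in for the Scraper module's monitor lookup, and an out-of-context hook is assumed so the callback runs on the listening thread.

```cpp
#include <windows.h>

// Hypothetical hand-off into the Scraper module, which looks up an
// application monitor by process ID and forwards the event to it.
static void NotifyMonitors(DWORD processId, DWORD threadId,
                           DWORD event, HWND hwnd) {
    (void)processId; (void)threadId; (void)event; (void)hwnd;  // elided
}

static void CALLBACK OnWinEvent(HWINEVENTHOOK, DWORD event, HWND hwnd,
                                LONG idObject, LONG, DWORD, DWORD) {
    if (idObject != OBJID_WINDOW || hwnd == nullptr)
        return;  // only whole-window create/destroy/show/hide events
    DWORD processId = 0;
    DWORD threadId = GetWindowThreadProcessId(hwnd, &processId);
    NotifyMonitors(processId, threadId, event, hwnd);
}

// Listen system-wide for window create/destroy and show/hide
// notifications (EVENT_OBJECT_CREATE..EVENT_OBJECT_HIDE is contiguous).
HWINEVENTHOOK StartListening() {
    return SetWinEventHook(EVENT_OBJECT_CREATE, EVENT_OBJECT_HIDE,
                           nullptr, OnWinEvent,
                           0 /* all processes */, 0 /* all threads */,
                           WINEVENT_OUTOFCONTEXT);
}
```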
(ii) The Method Get_Sharable_Application
As shown in FIG. 6, the Application Sharing Service can request the Scraper module to provide a list of applications that can be shared. In response, the Scraper module builds up the list of IDs and titles of the topmost overlapped windows on the desktop that is presented on the display of the sharer network node. In the illustrated embodiments, the list is built from the processes that currently are running on the sharer network node. In other embodiments, the list also includes the processes that “can be running” on the node. This might include, for instance, all Microsoft® Word® documents that the sharer might want to share but that currently are not open, or another application (e.g., a calculator application) that the sharer might want to share but that currently is not running.
(iii) The Method Start_Share_Application
Referring to FIG. 7, when the Application Sharing service starts sharing an application, the Scraper module creates an instance of the application monitor that will be responsible for scraping the application windows. The application monitor enumerates all topmost windows on the desktop. The application monitor uses the result of this enumeration as an initial list of the windows. The application monitor determines which windows in the initial list belong to the application being shared. The application monitor then spawns a scraper thread. The scraper thread is responsible for scraping content of the application windows, composing all the application's windows to the final bitmap, and sending the bitmap frame to all the subscribing network nodes.
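The monitor-plus-scraper-thread arrangement might be skeletonized as follows. This is a hedged C++ sketch (the class name and the 100 ms period are illustrative assumptions) showing only the thread ownership and graceful-stop mechanics, with the capture work elided.

```cpp
#include <windows.h>

#include <atomic>
#include <thread>

// Hypothetical application monitor: owns the scraper thread that
// periodically captures, composites, and sends frames to subscribers.
class ApplicationMonitor {
public:
    explicit ApplicationMonitor(DWORD sharedProcessId)
        : pid_(sharedProcessId), running_(true),
          scraper_([this] { Run(); }) {}

    ~ApplicationMonitor() {
        running_ = false;  // graceful stop: no more samples are produced
        scraper_.join();
    }

private:
    void Run() {
        while (running_) {
            // Scrape the windows of pid_, compose the frame, and send
            // samples to subscribers (see the scraping sketches below).
            Sleep(100);  // sleep until the next scheduled scrape
        }
    }

    DWORD pid_;
    std::atomic<bool> running_;
    std::thread scraper_;
};
```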
(iv) The Method Stop_Share_Application
Referring to FIG. 8, stopping application sharing gracefully stops the application monitor. In this process, the application monitor shuts down the scraping thread, which ensures that no more samples are produced. After that, it notifies all the subscribing network nodes that they are unsubscribed. Once all the references to the application monitor are gone, the monitor is destroyed.
(v) The Method Get_Shared_Applications
This method returns a list of applications being shared.
(vi) The Method Subscribe
Referring to FIG. 9, the subscribe method expects as parameters a reference to a subscriber and an ID of the process to share. The subscriber is a class that implements stream_subscriber_itf, as shown in the class model of FIG. 5. The Scraper module finds the monitor by the application ID and adds the subscriber to the monitor. The next time the application monitor generates a frame, it will deliver the frame to this subscriber as well.
(vii) The Method Unsubscribe
Referring to FIG. 10, the unsubscribe method expects as parameters a reference to a subscriber and an ID of the process being shared. This method finds the shared application by ID and tells it to remove the referenced subscriber. On removing the referenced subscriber, the application monitor notifies the subscriber about the state change. The application monitor also checks the shared process state and, if the process has exited, the application monitor notifies all the subscribers about the state change, unsubscribes all the subscribers, and terminates itself since there is nothing left to monitor.
c. Window Scraping
Each application monitor has a thread that wakes up on a periodic basis and performs window scraping.
FIGS. 11A and 11B show an embodiment of an exemplary window scraping method.
In accordance with the method of FIGS. 11A and 11B, the Scraper module determines the z-order of all the windows that are associated with a shared software process and arranges all the windows according to the z-order (FIG. 11B, block 560).
The Scraper module determines a bounding rectangle that encompasses all the windows that are associated with the shared software process (FIG. 11B, block 562). For example, FIG. 12 shows a screen layout 564 that includes a Microsoft® Word® main window 566 and a Save As dialog box 568. The Scraper module calculates the rectangle 570, which is the smallest bounding box that encompasses both windows, including any off-screen content (e.g., the shaded area 571 that corresponds to the bottom of the main window 566 that is outside the visible desktop window 572). The bounding rectangle 570 defines the size of the composite bitmap of all the windows to be shared.
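A minimal C++ sketch of this bounding-box computation, using stock Win32 calls; GetWindowRect reports screen coordinates even for portions lying outside the visible desktop, which is what lets the rectangle include off-screen content.

```cpp
#include <windows.h>

#include <vector>

// Smallest rectangle encompassing every shared window, including any
// off-screen parts (cf. rectangle 570 in FIG. 12).
RECT BoundingRect(const std::vector<HWND>& windows) {
    RECT bounds;
    SetRectEmpty(&bounds);
    for (HWND hwnd : windows) {
        RECT r;
        if (GetWindowRect(hwnd, &r))
            UnionRect(&bounds, &bounds, &r);  // grow to include this window
    }
    return bounds;
}
```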
For each of the windows that are associated with the shared software process, the Scraper module calls a scrape function (FIG. 11B, block 576). If the window is not a layered window (FIG. 11B, block 578), the scrape function sets the WS_EX_LAYERED bit that is associated with the window via the SetWindowLong API call with GWL_EXSTYLE (FIG. 11B, block 580). The scrape function then scrapes the layered window (FIG. 11B, block 582). In this process, the Scraper module retrieves images of the windows, including the parts that are obscured by other windows or are outside the visible desktop window (e.g., the shaded area 571 shown in FIG. 12), and paints the retrieved images into the composite image. The windows are scraped according to the z-order so that the topmost window is painted on top of all the other window images in the final composite image. After the Scraper module has estimated the bounding rectangle, the Scraper module captures a screen image of the shared windows. In a Microsoft® Windows® application environment, for each of the windows the Scraper module makes a GetDC() function call to GDI (the graphics device interface), which retrieves a handle to a display device context (DC) for the window. The Scraper module performs an iterative capturing of each window. In this process, BitBlt() is called for each window being shared and the image data of each window is superimposed in the target DC according to the relative positions (in the x, y, and z axes) of the windows in the screen display. The BitBlt function performs a bit-block transfer of the image data corresponding to a block of pixels from the specified source device context into a destination device context. The screen image constitutes the initial composite image. Performing iterative BitBlt() requests automatically filters out screen data from unwanted windows and allows individual window compressed bitmaps to be sent, instead of just the compressed composited bitmap.
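The iterative capture might look like the following C++ sketch. It is an approximation, not the patent's code: error handling is omitted, GetWindowDC is used in place of the GetDC call named above so that the arithmetic lines up with GetWindowRect, and the window list is assumed to be ordered bottom-to-top so the topmost window is painted last.

```cpp
#include <windows.h>

#include <vector>

// Paint each shared window into a single composite bitmap sized to the
// bounding rectangle; the caller frees the bitmap with DeleteObject.
HBITMAP CompositeFrame(const std::vector<HWND>& bottomToTop, RECT bounds) {
    int width = bounds.right - bounds.left;
    int height = bounds.bottom - bounds.top;
    HDC screen = GetDC(nullptr);
    HDC target = CreateCompatibleDC(screen);
    HBITMAP frame = CreateCompatibleBitmap(screen, width, height);
    HGDIOBJ previous = SelectObject(target, frame);
    for (HWND hwnd : bottomToTop) {
        RECT r;
        GetWindowRect(hwnd, &r);
        HDC source = GetWindowDC(hwnd);  // layered windows buffer off-screen
        BitBlt(target, r.left - bounds.left, r.top - bounds.top,
               r.right - r.left, r.bottom - r.top, source, 0, 0, SRCCOPY);
        ReleaseDC(hwnd, source);
    }
    SelectObject(target, previous);
    DeleteDC(target);
    ReleaseDC(nullptr, screen);
    return frame;
}
```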
After completing the scraping process, the Scraper module creates a device-independent bitmap of the composite image (FIG. 11B, block 584). The Scraper module then creates a sample from the bytes of the bitmap (FIG. 11B, block 586). The sample is a bitmap in a temporal sequence of bitmaps that are captured. In some embodiments, the sample is the current composite image. The current sample depicts how the shared application is displayed at the current time. Successive samples give the viewers the perception that the shared application is not just a sequence of images, but rather is a live application. Once a sample is created, the Application Sharing Service presents the sample to all the subscribing network nodes, one by one (FIG. 11B, block 588). Samples typically are transmitted to the subscribing network nodes only if there has been a change in the composite image since the last sample was transmitted. The sample typically is compressed before being transmitted to the subscribing network nodes. The Scraper module then goes to sleep until the next scheduled time to scrape another image (FIG. 11B, block 590). The Application Sharing Service contemporaneously (i.e., during the same period of time, e.g., during the period between scraping cycles) transmits the sample of the composite image to each of the subscribing remote network nodes.
FIG. 13 shows an embodiment of a method by which the Scraper module determines the current z-order of the windows that are associated with a shared software process (FIG. 11B, block 560).
The Scraper module starts with an initial list of all top-level windows that are associated with the shared software process (FIG. 13, block 600). The initial list may be obtained from a previous iteration of the scraping process shown in FIGS. 11A and 11B. Alternatively, the initial list may be obtained directly by querying the application environment. For example, in some embodiments, the Scraper module is implemented in a Microsoft® Windows® application environment. In these embodiments, each software process thread has an entry point function called WinMain, which registers the window class for the main window by calling the RegisterClass function, and creates the main window by calling the CreateWindowEx function. After creating a window, the creation function returns a window handle that has the HWND data type and uniquely identifies the window; the window handle is used to direct software process actions to the window. In these embodiments, the Scraper module obtains a handle to each of the top-level windows associated with the shared software process thread by calling the EnumWindows function, which enumerates all the windows on the desktop, and then queries the process to which each window belongs by calling GetWindowThreadProcessId.
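For illustration, that enumeration step might be written as the following C++ sketch (the struct and function names are hypothetical; only EnumWindows and GetWindowThreadProcessId are taken from the text).

```cpp
#include <windows.h>

#include <vector>

struct EnumState {
    DWORD sharedPid;
    std::vector<HWND> topLevel;
};

// EnumWindows callback: keep top-level windows owned by the shared process.
static BOOL CALLBACK CollectIfShared(HWND hwnd, LPARAM lParam) {
    EnumState* state = reinterpret_cast<EnumState*>(lParam);
    DWORD pid = 0;
    GetWindowThreadProcessId(hwnd, &pid);
    if (pid == state->sharedPid)
        state->topLevel.push_back(hwnd);
    return TRUE;  // continue enumerating
}

std::vector<HWND> TopLevelWindowsOf(DWORD sharedPid) {
    EnumState state{sharedPid, {}};
    EnumWindows(CollectIfShared, reinterpret_cast<LPARAM>(&state));
    return state.topLevel;
}
```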
The Scraper module determines a z-order of all the windows that currently are associated with the shared software process (FIG. 13, block 602). When implemented in a Microsoft® Windows® application environment, the Scraper module obtains a handle to each of the windows on the desktop window by calling the EnumChildWindows function, which enumerates the child windows that belong to the desktop window in z-order by passing the handle to each child window, in turn, to a callback function of the Scraper module. The Scraper module recursively iterates through all the windows on the desktop and identifies those windows that belong to the shared software application. In this process, the Scraper module identifies the children of the desktop that correspond to the top-level windows of the shared software application. For each of the matching top-level windows, the Scraper module calls a function that iterates through the children of the matching top-level window; the function calls itself recursively.
The Scraper module sorts the initial list of top-level windows that are associated with the software process according to the determined z-order (FIG. 13, block 604).
The Scraper module appends to the sorted list any of the windows in the initial list of top-level windows that are not included in the sorted list (FIG. 13, block 606). This step is used to cover any windows (e.g., a window that was deleted after the sorting process started) that might have been missed.
The Scraper module replaces the initial list with the sorted list of top-level windows (FIG. 13, block 608).
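The sort-and-append steps (blocks 602-608) could also be realized with a direct walk of the desktop's child list, which Windows maintains in z-order. The C++ sketch below uses GetTopWindow/GetWindow rather than the callback-based EnumChildWindows described above; it is an alternative illustration under that assumption, not the patented procedure itself.

```cpp
#include <windows.h>

#include <algorithm>
#include <vector>

// Return the shared windows sorted topmost-first, appending any windows
// the walk misses (e.g., destroyed mid-enumeration), per block 606.
std::vector<HWND> SortByZOrder(const std::vector<HWND>& initialList) {
    std::vector<HWND> sorted;
    for (HWND hwnd = GetTopWindow(nullptr); hwnd != nullptr;
         hwnd = GetWindow(hwnd, GW_HWNDNEXT)) {
        if (std::find(initialList.begin(), initialList.end(), hwnd) !=
            initialList.end())
            sorted.push_back(hwnd);
    }
    for (HWND hwnd : initialList)
        if (std::find(sorted.begin(), sorted.end(), hwnd) == sorted.end())
            sorted.push_back(hwnd);
    return sorted;  // reverse before painting bottom-to-top
}
```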
d. Remote Access
If the viewers have only view access, then the Application Sharing Service on the sharer network node only transmits the composite images of the shared window content (in the form of samples) to the subscribing ones of the viewer network nodes. The viewers on the viewer network nodes can only passively view the composite images of the shared windows on the sharer's display.
If the viewers have control access, on the other hand, then the Application Sharing Service on the sharer network node transmits the composite images of the shared window content (in the form of samples) to the subscribing ones of the viewer network nodes. In addition, the Scraper module combines commands that are received from the viewer network nodes with the commands that are generated by the sharer on the sharer network node, and passes the combined set of commands to the shared application. This allows the viewers to control, edit, and manipulate the shared application on the sharer network node. The commands typically are derived from events that are generated by one or more user input devices (e.g., a keyboard, a computer mouse, a touchpad, and a touch screen) on the viewer and sharer network nodes.
FIG. 14 shows an embodiment 610 of the first network node 12 and an embodiment 612 of the second network node 14. In the illustrated embodiment, the first network node 610 (which is referred to as the “sharer network node 610”) shares window content 614 that is associated with a shared process 616 with the second network node 612 (which is referred to as the “viewer network node 612”). The sharer network node 610 includes a display 618 on which the window content 614 is presented, a display process 620, an embodiment 622 of the communications application 26, and a network layer 624. Similarly, the viewer network node 612 includes a display 626, a display process 628, an embodiment 630 of the communications application 26, and a network layer 632.
The display processes 620, 628 provide the display facilities of the sharer network node 610 and the viewer network node 612, respectively. The display facilities control the writing of visual contents on the sharer and viewer displays 618, 626. In some embodiments, the display facilities include a graphic device interface (e.g., the GDI available in Microsoft® Windows® application environments) that provides functions that can be called by software processes in order to present visual content on the displays 618, 626.
The network layers 624, 632 provide the networking facilities of the sharer network node 610 and the viewer network node 612, respectively. The network facilities include, for example, networking communications protocol stacks and networking hardware that perform processes associated with sending and receiving information over the network 18.
The communications applications 622, 630 respectively provide various communications facilities (including application sharing facilities) to the sharer network node 610 and the viewer network node 612. In the illustrated embodiment, the communications application 622 on the sharer network node 610 generates a composite image 634 of the shared window content 614 on the sharer's display, transmits the composite image 634 over the network 18 to the viewer network node 612 for presentation on the viewer's display 626, and grants the viewer remote control access to the shared window content 614. The communications application 630 on the viewer network node 612 controls the presentation of the composite image 634 on the display 626, transforms user input into commands, and transmits the commands to the sharer network node 610.
FIG. 15 shows an embodiment of an application sharing method that is implemented by the sharer network node 610 in the application sharing context shown in FIG. 14.
In accordance with the method of FIG. 15, the communications application 622 on the sharer network node 610 captures a composite image of the windows that are associated with the shared software process (FIG. 15, block 637). In some embodiments, the window scraping process described above is used to capture the composite image. The communications application 622 transmits the composite image to the viewer network node 612 (FIG. 15, block 639). In this process, the communications application passes a sample of the composite image to the network layer 624, which converts the sample into a network format and transmits the converted sample to the viewer network node 612 over the network 18.
The communications application 622 receives commands that are derived from local input device events generated on the sharer network node 610 (FIG. 15, block 640). Contemporaneously, the communications application 622 receives commands that are derived from remote input device events that are generated on the viewer network node 612 (FIG. 15, block 642). In this context, the commands are directives or instructions to perform tasks, where the directives or instructions are derived from interpretations of input device events that are initiated by user actions (or inputs) on one or more input devices (e.g., clicking one or more buttons of a computer mouse or keys of a computer keyboard). Some pointer input device events, such as computer mouse, touch pad, and touch screen events, are tied to the position of a pointer (or cursor) in a graphical user interface that is presented on a display. These types of input device events typically are converted into commands that include both an input type parameter value that describes the type of input (e.g., left-click, right-click, left button double-click, scroll wheel, etc.) and a location parameter value that describes where the input was entered with respect to a context-dependent coordinate system.
The operating systems on the sharer and viewer network nodes 610, 612 typically convert the pointer input device events into user commands, where the location parameter values are defined with respect to the coordinate system of the main window of the graphical user interface (e.g., the desktop window in a Microsoft® Windows® application environment). The sharer's input commands that are received by the communications application 622 (FIG. 15, block 640) are the operating system generated user commands; these commands also are used to control the operation of user mode software processes (e.g., the shared process 616) on the sharer network node 610. The viewer's input commands that are received by the communications application 622 (FIG. 15, block 642), on the other hand, typically are versions of the operating system generated user input commands. In particular, before transmitting the viewer commands to the sharer network node 610, the communications application 630 on the viewer network node 612 remaps the location parameter values in the operating system generated commands from the coordinate system of the main window of the graphical user interface on the display 626 to a coordinate system of the composite image 634. For example, in the illustrated embodiment, the location parameter values are remapped to a coordinate system that has an origin (0,0) at the upper left corner of the composite image 634 and has x and y axes that extend along the top and left edges, respectively, of the composite image 634.
The communications application 622 processes the received commands into a command sequence (FIG. 15, block 644). In this process, the communications application 622 typically remaps the location parameter values in the viewer's commands from the coordinate system of the composite image 634 to the coordinate system of the main window of the graphical user interface on the display 618. The communications application then arranges the received commands into a sequence in order of time of receipt, and stores the resulting sequence of commands into a memory buffer.
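Both remappings are simple translations between coordinate systems. A minimal sketch, assuming the composite image's top-left corner sits at a known desktop position on each side (the function and parameter names are illustrative):

```cpp
#include <windows.h>

// Viewer side: desktop coordinates -> composite-image coordinates.
POINT ToCompositeCoords(POINT desktopPt, POINT compositeTopLeftOnViewer) {
    return POINT{desktopPt.x - compositeTopLeftOnViewer.x,
                 desktopPt.y - compositeTopLeftOnViewer.y};
}

// Sharer side: composite-image coordinates -> sharer desktop coordinates,
// where boundsTopLeft is the top-left of the shared bounding rectangle.
POINT ToSharerDesktop(POINT compositePt, POINT boundsTopLeft) {
    return POINT{compositePt.x + boundsTopLeft.x,
                 compositePt.y + boundsTopLeft.y};
}
```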
The communications application 622 passes the command sequence to the shared process 616 (FIG. 15, block 646). In this process, for each command, the communications application typically invokes Win32 API functions that allow it to specify the window to which Windows will send the command.
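The patent does not name the specific Win32 calls used here; one plausible sketch delivers a remapped left-click from the merged command sequence to a specific window with PostMessage (an assumption, since SendInput-style injection would target the foreground window rather than a chosen one).

```cpp
#include <windows.h>

// Post a left-click to the shared window; clientX/clientY are in the
// target window's client coordinates after remapping.
void DeliverLeftClick(HWND target, int clientX, int clientY) {
    LPARAM pos = MAKELPARAM(clientX, clientY);
    PostMessage(target, WM_LBUTTONDOWN, MK_LBUTTON, pos);
    PostMessage(target, WM_LBUTTONUP, 0, pos);
}
```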
The shared process 616 calls one or more graphical device interface functions that are provided by the display process 620 to present the windows that are associated with the shared software process 616 on the sharer display 618 in accordance with the received command sequence (FIG. 15, block 648).
The process (FIG. 15, blocks 640-648) is repeated in accordance with a specified update schedule.
IV. AN EXEMPLARY OPERATING ENVIRONMENT
A. System Architecture
1. Introduction
FIG. 16 shows an embodiment of an exemplary network communications environment 10 that includes the first network node 12 (referred to as the “first client network node”), the second network node 14 (referred to as the “second client network node”), and a virtual environment creator 16 that are interconnected by a network 18. The first client network node 12 and the second client network node 14 are configured as described above in connection with FIG. 1. The virtual environment creator 16 includes at least one server network node 28 that provides a network infrastructure service environment 30. The communications application 26 and the network infrastructure service environment 30 together provide a platform (referred to herein as “the platform”) for creating a spatial virtual communication environment (also referred to herein simply as a “virtual environment”).
In some embodiments, the network infrastructure service environment 30 manages sessions of the first and second client nodes 12, 14 in a virtual area 32 in accordance with a virtual area application 34. The virtual area application 34 is hosted by the virtual area 32 and includes a description of the virtual area 32. The communications applications 26 operating on the first and second client network nodes 12, 14 present respective views of the virtual area 32 in accordance with data received from the network infrastructure service environment 30 and provide respective interfaces for receiving commands from the communicants. The communicants typically are represented in the virtual area 32 by respective avatars, which move about the virtual area 32 in response to commands that are input by the communicants at their respective network nodes. Each communicant's view of the virtual area 32 typically is presented from the perspective of the communicant's avatar, which increases the level of immersion experienced by the communicant. Each communicant typically is able to view any part of the virtual area 32 around his or her avatar. In some embodiments, the communications applications 26 establish realtime data stream connections between the first and second client network nodes 12, 14 and other network nodes sharing the virtual area 32 based on the positions of the communicants' avatars in the virtual area 32.
The network infrastructure service environment 30 also maintains a relationship database 36 that contains records 38 of interactions between communicants. Each interaction record 38 describes the context of an interaction between a pair of communicants.
2. Network Environment
The network 18 may include any of a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN) (e.g., the internet). The network 18 typically includes a number of different computing platforms and transport facilities that support the transmission of a wide variety of different media types (e.g., text, voice, audio, and video) between network nodes.
The communications application 26 (see FIG. 13) typically operates on a client network node that includes software and hardware resources which, together with administrative policies, user preferences (including preferences regarding the exportation of the user's presence and the connection of the user to areas and other users), and other settings, define a local configuration that influences the administration of realtime connections with other network nodes. The network connections between network nodes may be arranged in a variety of different stream handling topologies, including a peer-to-peer architecture, a server-mediated architecture, and hybrid architectures that combine aspects of peer-to-peer and server-mediated architectures. Exemplary topologies of these types are described in U.S. application Ser. Nos. 11/923,629 and 11/923,634, both of which were filed on Oct. 24, 2007.
3. Network Infrastructure Services
The network infrastructure service environment 30 typically includes one or more network infrastructure services that cooperate with the communications applications 26 in the process of establishing and administering network connections between the client nodes 12, 14 and other network nodes (see FIG. 13). The network infrastructure services may run on a single network node or may be distributed across multiple network nodes. The network infrastructure services typically run on one or more dedicated network nodes (e.g., a server computer or a network device that performs one or more edge services, such as routing and switching). In some embodiments, however, one or more of the network infrastructure services run on at least one of the communicants' network nodes. Among the network infrastructure services that are included in the exemplary embodiment of the network infrastructure service environment 30 are an account service, a security service, an area service, a rendezvous service, and an interaction service.
Account Service
The account service manages communicant accounts for the virtual environment. The account service also manages the creation and issuance of authentication tokens that can be used by client network nodes to authenticate themselves to any of the network infrastructure services.
Security Service
The security service controls communicants' access to the assets and other resources of the virtual environment. The access control method implemented by the security service typically is based on one or more of capabilities (where access is granted to entities having proper capabilities or permissions) and an access control list (where access is granted to entities having identities that are on the list). After a particular communicant has been granted access to a resource, that communicant typically uses the functionality provided by the other network infrastructure services to interact in the network communications environment 10.
Area Service
The area service administers virtual areas. In some embodiments, the area service remotely configures the communications applications 26 operating on the first and second client network nodes 12, 14 in accordance with the virtual area application 34 subject to a set of constraints 47 (see FIG. 13). The constraints 47 typically include controls on access to the virtual area. The access controls typically are based on one or more of capabilities (where access is granted to communicants or client nodes having proper capabilities or permissions) and an access control list (where access is granted to communicants or client nodes having identities that are on the list).
The area service also manages network connections that are associated with the virtual area subject to the capabilities of the requesting entities, maintains global state information for the virtual area, and serves as a data server for the client network nodes participating in a shared communication session in a context defined by the virtual area 32. The global state information includes a list of all the objects that are in the virtual area and their respective locations in the virtual area. The area service sends instructions that configure the client network nodes. The area service also registers and transmits initialization information to other client network nodes that request to join the communication session. In this process, the area service may transmit to each joining client network node a list of components (e.g., plugins) that are needed to render the virtual area 32 on the client network node in accordance with the virtual area application 34. The area service also ensures that the client network nodes can synchronize to a global state if a communications fault occurs. The area service typically manages communicant interactions with virtual areas via governance rules that are associated with the virtual areas.
Rendezvous Service
The rendezvous service manages the collection, storage, and distribution of presence information and provides mechanisms for network nodes to communicate with one another (e.g., by managing the distribution of connection handles) subject to the capabilities of the requesting entities. The rendezvous service typically stores the presence information in a presence database. The rendezvous service typically manages communicant interactions with each other via communicant privacy preferences.
Interaction Service
The interaction service maintains the relationship database 36 that contains the records 38 of interactions between communicants. For every interaction between communicants, one or more services of the network infrastructure service environment 30 (e.g., the area service) transmit interaction data to the interaction service. In response, the interaction service generates one or more respective interaction records and stores them in the relationship database. Each interaction record describes the context of an interaction between a pair of communicants. For example, in some embodiments, an interaction record contains an identifier for each of the communicants, an identifier for the place of interaction (e.g., a virtual area instance), a description of the hierarchy of the interaction place (e.g., a description of how the interaction room relates to a larger area), start and end times of the interaction, and a list of all files and other data streams that are shared or recorded during the interaction. Thus, for each realtime interaction, the interaction service tracks when it occurred, where it occurred, and what happened during the interaction in terms of communicants involved (e.g., entering and exiting), objects that are activated/deactivated, and the files that were shared.
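The patent does not prescribe a concrete schema for these records. The following C++ sketch merely restates the fields enumerated above; all names are illustrative assumptions, not the patent's terminology:

    #include <ctime>
    #include <string>
    #include <vector>

    // Hypothetical interaction record; field names are assumptions.
    struct InteractionRecord {
        std::string communicantA;                 // identifier of one communicant
        std::string communicantB;                 // identifier of the other communicant
        std::string placeId;                      // e.g., a virtual area instance
        std::vector<std::string> placeHierarchy;  // how the room relates to a larger area
        std::time_t startTime;                    // when the interaction began
        std::time_t endTime;                      // when it ended
        std::vector<std::string> sharedItems;     // files and data streams shared or recorded
    };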
The interaction service also supports queries on the relationship database 36 subject to the capabilities of the requesting entities. The interaction service presents the results of queries on the interaction database records in a sorted order (e.g., most frequent or most recent) based on virtual area. The query results can be used to drive a frequency sort of the contacts whom a communicant has met in each virtual area, as well as sorts of whom the communicant has met regardless of virtual area and sorts of the virtual areas the communicant frequents most often. The query results also may be used by application developers as part of a heuristic system that automates certain tasks based on relationships. Examples of heuristics of this type include a heuristic that permits communicants who have visited a particular virtual area more than five times to enter without knocking by default, and a heuristic that allows communicants who were present in an area at a particular time to modify and delete files created by another communicant who was present in the same area at the same time. Queries on the relationship database 36 can be combined with other searches. For example, queries on the relationship database may be combined with queries on contact history data generated for interactions with contacts using a communication system (e.g., Skype, Facebook, and Flickr) that is outside the domain of the network infrastructure service environment 30.
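Continuing the hypothetical InteractionRecord sketch above, a frequency sort of the contacts a communicant has met in a given virtual area could be derived along these lines (an illustrative sketch, not the patented query mechanism):

    #include <algorithm>
    #include <map>
    #include <utility>

    // Count interactions with each contact in one virtual area, most frequent first.
    std::vector<std::pair<std::string, int>> contactsByFrequency(
            const std::vector<InteractionRecord>& records,
            const std::string& self, const std::string& areaId) {
        std::map<std::string, int> counts;
        for (const InteractionRecord& r : records) {
            if (r.placeId != areaId) continue;
            if (r.communicantA == self) ++counts[r.communicantB];
            else if (r.communicantB == self) ++counts[r.communicantA];
        }
        std::vector<std::pair<std::string, int>> result(counts.begin(), counts.end());
        std::sort(result.begin(), result.end(),
                  [](const auto& a, const auto& b) { return a.second > b.second; });
        return result;
    }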
4. Virtual Areas
The communications application 26 and the network infrastructure service environment 30 typically administer the realtime connections with network nodes in a communication context that is defined by an instance of a virtual area. The virtual area instance may correspond to an abstract (non-geometric) virtual space that is defined with respect to abstract coordinates. Alternatively, the virtual area instance may correspond to a visual virtual space that is defined with respect to one-, two- or three-dimensional geometric coordinates that are associated with a particular visualization. Abstract virtual areas may or may not be associated with respective visualizations, whereas visual virtual areas are associated with respective visualizations.
As explained above, communicants typically are represented by respective avatars in a virtual area that has an associated visualization. The avatars move about the virtual area in response to commands that are input by the communicants at their respective network nodes. In some embodiments, the communicant's view of a virtual area instance typically is presented from the perspective of the communicant's avatar, and each communicant typically is able to view any part of the visual virtual area around his or her avatar, increasing the level of immersion that is experienced by the communicant.
FIG. 17 shows an embodiment of an exemplary network node that is implemented by a computer system 48. The computer system 48 includes a display monitor 50, a computer mouse 52, a keyboard 54, speakers 56, 58, and a microphone 60. The display monitor 50 displays a graphical user interface 62. The graphical user interface 62 is a windows-based graphical user interface that can include multiple windows, icons, and a pointer 64. In the illustrated embodiment, the graphical user interface 62 presents a two-dimensional depiction of a shared virtual area 66 that is associated with a three-dimensional visualization representing an art gallery. Communicants are represented in the virtual area 66 by respective avatars 68, 70, 72, each of which may have a respective role (e.g., a curator, an artist, and a visitor) in the context of the virtual area 66.
As explained in detail below, the virtual area 66 includes zones 74, 76, 78, 80, 82 that are associated with respective rules that govern the switching of realtime data streams between the network nodes that are represented by the avatars 68-72 in the virtual area 66. (During a typical communication session, the dashed lines demarcating the zones 74-82 in FIG. 17 are not visible to the communicants, although there may be visual cues associated with such zone boundaries.) The switching rules dictate how the local connection processes executing on each of the network nodes establish communications with the other network nodes based on the locations of the communicants' avatars 68-72 in the zones 74-82 of the virtual area 66.
A virtual area is defined by a specification that includes a description of geometric elements of the virtual area and one or more rules, including switching rules and governance rules. The switching rules govern realtime stream connections between the network nodes. The governance rules control a communicant's access to resources, such as the virtual area itself, regions within the virtual area, and objects within the virtual area. In some embodiments, the geometric elements of the virtual area are described in accordance with the COLLADA-Digital Asset Schema Release 1.4.1 April 2006 specification (available from http://www.khronos.org/collada/), and the switching rules are described using an extensible markup language (XML) text format (referred to herein as a virtual space description language (VSDL)) in accordance with the COLLADA Streams Reference specification described in U.S. application Ser. Nos. 11/923,629 and 11/923,634.
The geometric elements of the virtual area typically include physical geometry and collision geometry of the virtual area. The physical geometry describes the shape of the virtual area. The physical geometry typically is formed from surfaces of triangles, quadrilaterals, or polygons. Colors and textures are mapped onto the physical geometry to create a more realistic appearance for the virtual area. Lighting effects may be provided, for example, by painting lights onto the visual geometry and modifying the texture, color, or intensity near the lights. The collision geometry describes invisible surfaces that determine the ways in which objects can move in the virtual area. The collision geometry may coincide with the visual geometry, correspond to a simpler approximation of the visual geometry, or relate to application-specific requirements of a virtual area designer.
The switching rules typically include a description of conditions for connecting sources and sinks of realtime data streams in terms of positions in the virtual area. Each rule typically includes attributes that define the realtime data stream type to which the rule applies and the location or locations in the virtual area where the rule applies. In some embodiments, each of the rules optionally may include one or more attributes that specify a required role of the source, a required role of the sink, a priority level of the stream, and a requested stream handling topology. In some embodiments, if there are no explicit switching rules defined for a particular part of the virtual area, one or more implicit or default switching rules may apply to that part of the virtual area. One exemplary default switching rule is a rule that connects every source to every compatible sink within an area, subject to policy rules. Policy rules may apply globally to all connections between the client nodes or only to respective connections with individual client nodes. An example of a policy rule is a proximity policy rule that only allows connections of sources with compatible sinks that are associated with respective objects that are within a prescribed distance (or radius) of each other in the virtual area.
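As a rough illustration, the rule attributes and the proximity policy described above might be modeled as follows. This is a C++ sketch under assumed names; the patent's actual VSDL encoding is XML-based and is not reproduced here:

    #include <string>

    // Hypothetical switching-rule attributes distilled from the description above.
    struct SwitchingRule {
        std::string streamType;  // realtime data stream type the rule applies to
        std::string zoneId;      // location(s) in the virtual area where it applies
        std::string sourceRole;  // optional required role of the source
        std::string sinkRole;    // optional required role of the sink
        int priority = 0;        // optional priority level of the stream
    };

    struct Position { double x, y; };

    // Proximity policy rule: allow a connection only if the source and sink
    // objects are within a prescribed radius of each other in the virtual area.
    bool proximityAllows(const Position& source, const Position& sink, double radius) {
        const double dx = source.x - sink.x;
        const double dy = source.y - sink.y;
        return dx * dx + dy * dy <= radius * radius;
    }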
In some embodiments, governance rules are associated with a virtual area to control who has access to the virtual area, who has access to its contents, what is the scope of that access to the contents of the virtual area (e.g., what can a user do with the contents), and what are the follow-on consequences of accessing those contents (e.g., record keeping, such as audit logs, and payment requirements). In some embodiments, an entire virtual area or a zone of the virtual area is associated with a “governance mesh.” In some embodiments, a governance mesh is implemented in a way that is analogous to the implementation of the zone mesh described in U.S. application Ser. Nos. 11/923,629 and 11/923,634. A governance mesh enables a software application developer to associate governance rules with a virtual area or a zone of a virtual area. This avoids the need for the creation of individual permissions for every file in a virtual area and avoids the need to deal with the complexity that potentially could arise when there is a need to treat the same document differently depending on the context.
In some embodiments, a virtual area is associated with a governance mesh that associates one or more zones of the virtual area with a digital rights management (DRM) function. The DRM function controls access to one or more of the virtual area or one or more zones within the virtual area or objects within the virtual area. The DRM function is triggered every time a communicant crosses a governance mesh boundary within the virtual area. The DRM function determines whether the triggering action is permitted and, if so, what is the scope of the permitted action, whether payment is needed, and whether audit records need to be generated. In an exemplary implementation of a virtual area, the associated governance mesh is configured such that if a communicant is able to enter the virtual area he or she is able to perform actions on all the documents that are associated with the virtual area, including manipulating the documents, viewing the documents, downloading the documents, deleting the documents, modifying the documents and re-uploading the documents. In this way, the virtual area can become a repository for information that was shared and discussed in the context defined by the virtual area.
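The patent does not detail how the DRM function is implemented. One minimal way to sketch the boundary-crossing check (all names and the rule-table layout are assumptions) is:

    #include <map>
    #include <set>
    #include <string>

    enum class Action { Enter, View, Download, Modify, Delete, Upload };

    struct DrmDecision {
        bool permitted;      // is the triggering action allowed?
        bool paymentNeeded;  // must payment be collected?
        bool auditRequired;  // must an audit record be generated?
    };

    // Hypothetical per-zone governance rules; a real governance mesh would be richer.
    struct ZoneGovernance {
        std::set<Action> allowedActions;
        bool paymentNeeded = false;
        bool auditRequired = false;
    };

    DrmDecision onBoundaryCrossed(const std::map<std::string, ZoneGovernance>& mesh,
                                  const std::string& zoneId, Action action) {
        auto it = mesh.find(zoneId);
        if (it == mesh.end())
            return {false, false, false};  // no rules for this zone: deny by default
        const ZoneGovernance& g = it->second;
        return {g.allowedActions.count(action) > 0, g.paymentNeeded, g.auditRequired};
    }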
Additional details regarding the specification of a virtual area are described in U.S. application Ser. No. 61/042,714 (which was filed on Apr. 4, 2008), Ser. No. 11/923,629 (which was filed on Oct. 24, 2007), and Ser. No. 11/923,634 (which was filed on Oct. 24, 2007).
5. Communications Application
In some embodiments, the communications application 26 includes:
a. local Human Interface Devices (HIDs) and audio playback devices;
b. a So3D graphical display, avatar, and physics engine;
c. a system database and storage facility.
a. Local Human Interface Devices (HIDs) and Audio Playback Devices
The local HIDs enable a communicant to input commands and other signals into the client network node while participating in a virtual area communications session. Exemplary HIDs include a computer keyboard, a computer mouse, a touch screen display, and a microphone.
The audio playback devices enable a communicant to play back audio signals that are received during a virtual area communications session. Exemplary audio playback devices include audio processing hardware (e.g., a sound card) for manipulating (e.g., mixing and applying special effects) audio signals, and speakers for outputting sounds.
b. So3D Graphical Display, Avatar, and Physics Engine
The So3D engine is a three-dimensional visualization engine that controls the presentation of a respective view of a virtual area and objects in the virtual area on a display monitor. The So3D engine typically interfaces with a graphical user interface driver and the HID devices to present the views of the virtual area and to allow the communicant to control the operation of the communications application 26.
In some embodiments, the So3D engine receives graphics rendering instructions from the area service. The So3D engine also may read a local communicant avatar database that contains images needed for rendering the communicant's avatar in the virtual area. Based on this information, the So3D engine generates a visual representation (i.e., an image) of the virtual area and the objects in the virtual area from the point of view (position and orientation) of the communicant's avatar in the virtual area. The visual representation typically is passed to the graphics rendering components of the operating system, which drive the graphics rendering hardware to render the visual representation of the virtual area on the client network node.
The communicant can control the presented view of the virtual area by inputting view control commands via a HID device (e.g., a computer mouse). The So3D engine updates the view of the virtual area in accordance with the view control commands. The So3D engine also updates the graphic representation of the virtual area on the display monitor in accordance with updated object position information received from the area service.
c. System Database and Storage Facility
The system database and storage facility stores various kinds of information that is used by the platform. Exemplary information that typically is stored by the storage facility includes the presence database, the relationship database, an avatar database, a real user id (RUID) database, an art cache database, and an area application database. This information may be stored on a single network node or it may be distributed across multiple network nodes.
6. Client Node Architecture
A communicant typically connects to the network 18 from a client network node. The client network node typically is implemented by a general-purpose computer system or a dedicated communications computer system (or “console”, such as a network-enabled video game console). The client network node executes communications processes that establish realtime data stream connections with other network nodes and typically executes visualization rendering processes that present a view of each virtual area entered by the communicant.
FIG. 18 shows an embodiment of a client network node that is implemented by a computer system 120. The computer system 120 includes a processing unit 122, a system memory 124, and a system bus 126 that couples the processing unit 122 to the various components of the computer system 120. The processing unit 122 may include one or more data processors, each of which may be in the form of any one of various commercially available computer processors. The system memory 124 includes one or more computer-readable media that typically are associated with a software application addressing space that defines the addresses that are available to software applications. The system memory 124 may include a read only memory (ROM) that stores a basic input/output system (BIOS) that contains start-up routines for the computer system 120, and a random access memory (RAM). The system bus 126 may be a memory bus, a peripheral bus or a local bus, and may be compatible with any of a variety of bus protocols, including PCI, VESA, Microchannel, ISA, and EISA. The computer system 120 also includes a persistent storage memory 128 (e.g., a hard drive, a floppy drive, a CD ROM drive, magnetic tape drives, flash memory devices, and digital video disks) that is connected to the system bus 126 and contains one or more computer-readable media disks that provide non-volatile or persistent storage for data, data structures and computer-executable instructions.
A communicant may interact (e.g., input commands or data) with the computer system 120 using one or more input devices 130 (e.g., one or more keyboards, computer mice, microphones, cameras, joysticks, physical motion sensors such as Wii input devices, and touch pads). Information may be presented through a graphical user interface (GUI) that is presented to the communicant on a display monitor 132, which is controlled by a display controller 134. The computer system 120 also may include other input/output hardware (e.g., peripheral output devices, such as speakers and a printer). The computer system 120 connects to other network nodes through a network adapter 136 (also referred to as a “network interface card” or NIC).
A number of program modules may be stored in the system memory 124, including application programming interfaces 138 (APIs), an operating system (OS) 140 (e.g., the Windows XP® operating system available from Microsoft Corporation of Redmond, Wash. U.S.A.), the communications application 26, drivers 142 (e.g., a GUI driver), network transport protocols 144, and data 146 (e.g., input data, output data, program data, a registry, and configuration settings).
7. Server Node Architecture
In some embodiments, the one or more server network nodes of the virtual environment creator 16 are implemented by respective general-purpose computer systems of the same type as the client network node 120, except that each server network node typically includes one or more server software applications.
In other embodiments, the one or more server network nodes of the virtual environment creator 16 are implemented by respective network devices that perform edge services (e.g., routing and switching).
B. Exemplary Communication Session
Referring back to FIG. 17, during a communication session, each of the client network nodes generates a respective set of realtime data streams (e.g., motion data streams, audio data streams, chat data streams, file transfer data streams, and video data streams). For example, each communicant manipulates one or more input devices (e.g., the computer mouse 52 and the keyboard 54) that generate motion data streams, which control the movement of his or her avatar in the virtual area 66. In addition, the communicant's voice and other sounds that are generated locally in the vicinity of the computer system 48 are captured by the microphone 60. The microphone 60 generates audio signals that are converted into realtime audio streams. Respective copies of the audio streams are transmitted to the other network nodes that are represented by avatars in the virtual area 66. The sounds that are generated locally at these other network nodes are converted into realtime audio signals and transmitted to the computer system 48. The computer system 48 converts the audio streams generated by the other network nodes into audio signals that are rendered by the speakers 56, 58. The motion data streams and audio streams may be transmitted from each of the communicant nodes to the other client network nodes either directly or indirectly. In some stream handling topologies, each of the client network nodes receives copies of the realtime data streams that are transmitted by the other client network nodes. In other stream handling topologies, one or more of the client network nodes receives one or more stream mixes that are derived from realtime data streams that are sourced (or originated) from other ones of the network nodes.
In some embodiments, the area service maintains global state information that includes a current specification of the virtual area, a current register of the objects that are in the virtual area, and a list of any stream mixes that currently are being generated by the network node hosting the area service. The objects register typically includes for each object in the virtual area a respective object identifier (e.g., a label that uniquely identifies the object), a connection handle (e.g., a URI, such as an IP address) that enables a network connection to be established with a network node that is associated with the object, and interface data that identifies the realtime data sources and sinks that are associated with the object (e.g., the sources and sinks of the network node that is associated with the object). The objects register also typically includes one or more optional role identifiers for each object; the role identifiers may be assigned explicitly to the objects by either the communicants or the area service, or may be inferred from other attributes of the objects or the user. In some embodiments, the objects register also includes the current position of each of the objects in the virtual area as determined by the area service from an analysis of the realtime motion data streams received from the network nodes associated with objects in the virtual area. In this regard, the area service receives realtime motion data streams from the network nodes associated with objects in the virtual area and tracks the communicants' avatars and other objects that enter, leave, and move around in the virtual area based on the motion data. The area service updates the objects register in accordance with the current locations of the tracked objects.
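A minimal sketch of one entry in such an objects register, assuming a two-dimensional area and illustrative field names:

    #include <string>
    #include <vector>

    struct Position2D { double x, y; };

    // Hypothetical objects-register entry maintained by the area service.
    struct ObjectRegisterEntry {
        std::string objectId;              // label that uniquely identifies the object
        std::string connectionHandle;      // e.g., a URI such as an IP address
        std::vector<std::string> sources;  // realtime data sources of the associated node
        std::vector<std::string> sinks;    // realtime data sinks of the associated node
        std::vector<std::string> roles;    // optional role identifiers
        Position2D position;               // updated from the realtime motion data streams
    };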
In the process of administering realtime data stream connections with other network nodes, the area service maintains for each of the client network nodes a set of configuration data, including interface data, a zone list and the positions of the objects that currently are in the virtual area. The interface data includes for each object associated with each of the client network nodes a respective list of all the sources and sinks of realtime data stream types that are associated with the object. The zone list is a register of all the zones in the virtual area that currently are occupied by the avatar associated with the corresponding client network node. When a communicant first enters a virtual area, the area service typically initializes the current object positions database with position initialization information. Thereafter, the area service updates the current object positions database with the current positions of the objects in the virtual area as determined from an analysis of the realtime motion data streams received from the other client network nodes sharing the virtual area.
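The per-client configuration data enumerated above could be grouped as follows; this is again an illustrative sketch that reuses the Position2D type from the objects-register sketch above:

    #include <map>

    // Hypothetical configuration the area service keeps for one client network node.
    struct ClientConfiguration {
        // objectId -> lists of realtime stream sources and sinks for that object
        std::map<std::string, std::vector<std::string>> sourcesByObject;
        std::map<std::string, std::vector<std::string>> sinksByObject;
        std::vector<std::string> zoneList;            // zones occupied by the client's avatar
        std::map<std::string, Position2D> positions;  // current object positions
    };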
C. Interfacing with a Spatial Virtual Communication Environment
In addition to the local Human Interface Device (HID) and audio playback devices, the So3D graphical display, avatar, and physics engine, and the system database and storage facility, the communications application 26 also includes a graphical navigation and interaction interface (referred to herein as a “seeker interface”) that interfaces the user with the spatial virtual communication environment. The seeker interface includes navigation controls that enable the user to navigate the virtual environment and interaction controls that enable the user to control his or her interactions with other communicants in the virtual communication environment. The navigation and interaction controls typically are responsive to user selections that are made using any type of input device, including a computer mouse, a touch pad, a touch screen display, a keyboard, and a video game controller. The seeker interface is an application that operates on each client network node. The seeker interface is a small, lightweight interface that a user can keep up and running all the time on his or her desktop. The seeker interface allows the user to launch virtual area applications and provides the user with immediate access to realtime contacts and realtime collaborative places (or areas). The seeker interface is integrated with realtime communications applications and/or realtime communications components of the underlying operating system such that the seeker interface can initiate and receive realtime communications with other network nodes. A virtual area is integrated with the user's desktop through the seeker interface such that the user can upload files into the virtual environment created by the virtual environment creator 16, use files stored in association with the virtual area using the native client software applications independently of the virtual environment while still present in a virtual area, and, more generally, treat presence and position within a virtual area as an aspect of his or her operating environment analogous to other operating system functions rather than just one of several applications.
The spatial virtual communication environment typically can be modeled as a spatial hierarchy of places (also referred to herein as “locations”) and objects. The spatial hierarchy includes an ordered sequence of levels ranging from a top level to a bottom level. Each of the places in a successive one of the levels of the spatial hierarchy is contained in a respective one of the places in a preceding one of the levels. Each of the objects in the spatial hierarchy is contained in a respective one of the places. The levels of the spatial hierarchy typically are associated with respective visualizations that are consistent with a geographical, architectural, or urban metaphor, and are labeled accordingly. The zones of each virtual area are defined by respective meshes, some of which define elements of a physical environment (e.g., spaces, such as rooms and courtyards, that are associated with a building) that may contain objects (e.g., avatars and props, such as view screen objects and conferencing objects).
The navigational controls of the seeker interface allow the user to traverse a path through the virtual environment in accordance with a navigational model that is tied to the underlying spatial hierarchy of places and objects. The network infrastructure service environment 30 records the path traversed by the user. In some embodiments, the network infrastructure service environment 30 records a history that includes a temporally ordered list of views of the virtual area that are presented to the user as the user navigates through the virtual area. Each view typically corresponds to a view of a respective renderable zone of the virtual area. In these embodiments, the navigation controls enable the user to move to selected ones of the zones in the history. The navigation controls also include a graphical representation of a depth path that shows the location in the spatial hierarchy that corresponds to the user's current view of the virtual area. In some embodiments, the graphical representation of the depth path includes a respective user-selectable link to a respective view of each of the preceding levels in the spatial hierarchical model of the virtual area above the current view.
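The recorded history with back/forward navigation behaves like a cursor over a temporally ordered list of zone views. A minimal sketch (class and method names are assumptions):

    #include <cstddef>
    #include <string>
    #include <vector>

    // Minimal back/forward cursor over a temporally ordered list of zone views.
    class ViewHistory {
        std::vector<std::string> views_;
        std::size_t current_ = 0;
    public:
        void visit(const std::string& zoneView) {
            if (!views_.empty()) views_.resize(current_ + 1);  // discard forward entries
            views_.push_back(zoneView);
            current_ = views_.size() - 1;
        }
        const std::string* back() {     // move to the preceding zone, if any
            return current_ > 0 ? &views_[--current_] : nullptr;
        }
        const std::string* forward() {  // move to the next zone, if any
            return current_ + 1 < views_.size() ? &views_[++current_] : nullptr;
        }
    };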
The interaction controls of the seeker interface allow the user to manage interactions with other communicants. The interaction options that are available to the user typically depend on the zones in which the user has a presence. In some embodiments, the interaction options that are available to communicants who have presence in a particular zone are different from the options that are available to other communicants who do not have presence in that zone. The level of detail and interactivity of the user typically depend on whether or not the user has a presence in the particular zone. In one exemplary embodiment, if the user is outside the virtual area, the user is provided with a minimal level of detail of the interactions occurring within the virtual area (e.g., the user can see an outline of the floorplan, background textures, and plants of the area, but the user cannot see where other communicants are present in the area); if the user is within the virtual area but outside a particular zone of the area, the user is provided with a medium level of detail of the interactions occurring within the particular zone (e.g., the user can see where other communicants are present in the area, see a visualization of their current states—talking, typing a chat message, whether or not their headphones and microphones are turned on—and see whether any of the view screens are active); if the user is within the particular zone of the area, the user is provided with a full level of detail of the interactions occurring within the particular zone (e.g., the user can see a thumbnail of the file being shared on a view screen, hear and speak with other communicants in the area, and see elements of a log of chat messages that were generated by communicants in the zone). In some embodiments, the switching and governance rules that are associated with the zones of the virtual area control how the network infrastructure services distinguish those who have presence in the particular zone from those who do not.
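The three-tier scheme in this exemplary embodiment reduces to a simple mapping from presence to level of detail; a hypothetical sketch:

    // Level of detail rendered for a user, keyed to where the user has presence.
    enum class Detail { Minimal, Medium, Full };

    Detail detailFor(bool presentInArea, bool presentInZone) {
        if (!presentInArea) return Detail::Minimal;  // floorplan outline, textures, plants
        if (!presentInZone) return Detail::Medium;   // communicant positions and state cues
        return Detail::Full;                         // thumbnails, audio, chat log elements
    }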
FIG. 19 shows an embodiment 160 of the seeker interface that is displayed in a window 162 and includes one or more tabs 164, each of which has a browsing area 166 and a toolbar 168.
Each of the tabs 164 typically is associated with a respective view of the virtual environment. In the illustrated embodiment, the view presented in the tab 164 (labeled “My Areas”) is associated with a respective set of virtual areas, which may be a default set of virtual areas in the virtual environment or it may be a set of virtual areas that is identified by a respective filter on the interaction database. In particular, the tab 164 is associated with a set of three virtual areas (i.e., Acme, Sococo Help Area, and Personal Space), which may be a default set of areas that are associated with the user or may be identified by a filter that identifies all of the areas that are associated with the user (e.g., all of the areas in which the user has interacted). Additional tabs may be created by selecting the “+” button 170.
The browsing area 166 of each tab shows graphic representations of the elements of the virtual environment that are associated with the tab. For example, in the illustrated embodiment, the browsing area 166 shows top-level views 172, 174, 176 of the virtual areas that are associated with the tab 164. The user may navigate to the next lower level in the spatial hierarchical model of any of the virtual areas by selecting the corresponding graphic representation of the virtual area.
The toolbar 168 includes an adaptive set of navigational and interaction tools that automatically are selected by the seeker interface based on the current location of the user in the virtual environment. In the illustrated embodiment, the toolbar 168 includes a back button 178, a forward button 180, a placemarks button 182, and a home button 184. The back button 178 corresponds to a backward control that enables the user to incrementally move backward to preceding ones of the zones in the history of the zones that were traversed by the user. The forward button 180 corresponds to a forward control that enables the user to incrementally move forward to successive ones of the zones in the history of the zones that were traversed by the user. The placemarks button 182 provides a placemarking control for storing links to zones and a placemark navigation control for viewing a list of links to previously placemarked zones. In response to a user selection of the placemarking control, a placemark is created by storing an image of the location shown in the current view in association with a hyperlink to the corresponding location in the virtual area. In response to a user selection of the placemark navigation control, a placemarks window is presented to the user. The placemarks window includes live visualizations of all locations that have been placemarked by the user. Each of the images in the placemarks window is associated with a respective user-selectable hyperlink. In response to user selection of one of the hyperlinks in the placemarks window, a view of the virtual area corresponding to the location associated with the selected hyperlink is automatically displayed in the browsing area 166 of the seeker interface window 162. The home button 184 corresponds to a control that returns the user to a view of a designated “home” location in the virtual environment (e.g., the view shown in FIG. 19).
Referring to FIG. 20, in response to a user selection of the graphic representation 172 of the Acme virtual area shown in FIG. 19, the platform moves the user into a default zone of the virtual area, automatically establishes a presence for the user in the default zone, and automatically establishes a network connection between the user and each of the other communicants occupying the selected zone. Based on the switching rules established by the area designer, the platform multiplexes the specified realtime data streams (e.g., streams from the microphones and speakers) of all the communicants in the default zone such that they can both see each other's sprites or avatars and communicate with (e.g., speak to and hear) each other.
The seeker interface shows a top or floorplan view of the Acme virtual area in the browsing area 166 of the tab 164 and provides the user with a default set of interaction options. In the illustrated embodiment, a presence automatically is established in a courtyard zone 190 of the virtual area, and the user's microphone and default speakers (e.g., headphones) are turned on. In the floorplan view shown in FIG. 20, the user is represented by a circular sprite 192; the other users in the Acme virtual area also are represented by respective circular sprites 194-202. The state of the user's speakers is depicted by the presence or absence of a headphones graphic 203 on the user's sprite 192: when the speakers are on, the headphones graphic 203 is present and, when the speakers are off, the headphones graphic 203 is absent. The state of the user's microphone is depicted by the presence or absence of a microphone graphic 206 on the user's sprite 192 and a series of concentric circles 204 around the user's sprite 192: when the microphone is on, the microphone graphic 206 and the concentric circles 204 are present and, when the microphone is off, the microphone graphic 206 and the concentric circles 204 are absent. The headphones graphic 203, the concentric circles 204, and the microphone graphic 206 serve as visual reminders of the states of the user's sound playback and microphone devices.
In addition to the back button 178, the forward button 180, the placemarks button 182, and the home button 184, the toolbar 168 also includes a series of one or more breadcrumb buttons 207 that originate from and include the home button 184. The breadcrumb buttons 207 correspond to a hierarchical sequence of successive, user-selectable links. Each of the successive links corresponds to a view of a respective level in the hierarchical model of the virtual area in which each successive level is contained by preceding ones of the levels. In the illustrated embodiment, the breadcrumb buttons 207 include the home button 184 and an Acme button 208 that corresponds to the current view of the Acme virtual area shown in FIG. 20. The breadcrumb buttons 207 provide the user with single button selection access to respective views of different levels of the virtual environment. The toolbar 168 additionally includes a button 210 and a settings button 212.
When an area is selected or in focus, the button 210 appears as an iconographic representation of two people and is labeled “members”; it allows members and moderators to see the list of people associated with an area. When an audio or chat zone is in focus, the button 210 has a different image (e.g., an image of an arrow pointing downward onto a plane to represent the operation of getting) and is labeled “get”. In response to a user selection of the button 210, a list of all the members of the Acme virtual area is displayed in a user interface. The user may select any of the communicants in the list and click a get button that is presented in the user interface; in response, the platform transmits an invitation to the selected communicant to join the user in a designated one of the zones.
The settings button 212 provides the user with access to a set of controls for specifying default settings that are associated with the current area.
The user may navigate from the view of the Acme virtual area shown in FIG. 20 back to the view shown in FIG. 19 in a variety of different ways. For example, the user may select any of the back button 178, the home button 184, or any part of the zone 211 that is outside the boundary of the graphic representation of the Acme virtual area shown in FIG. 20.
The user may navigate to any of the zones of the Acme virtual area. In some embodiments, in order to move to a zone, the user transmits a command to execute one of the zones displayed on the monitor (e.g., by selecting the zone and then clicking the Enter button in the toolbar or, as a shortcut, double-clicking the zone) and, in response, the platform depicts the user's avatar in the zone corresponding to the zone object. In response to the zone execution command, the seeker interface outlines the zone (indicating to the user that it is selected) and updates the breadcrumb buttons 207 to show the selected zone location in the hierarchy. Toolbar buttons specific to the selection also appear to the right of the breadcrumb buttons 207.
The user also may interact with any objects (e.g., a screen, table, or file) that are present in a zone. In some embodiments, in order to interact with an object, the user transmits a command to execute one of the objects displayed on the monitor (e.g., by selecting the object and then clicking the View button in the toolbar or, as a shortcut, double-clicking the object) and, in response, the platform performs an operation with respect to the object (e.g., presenting a zoomed-in view of the object or opening an interaction interface window). In response to the object execution command, the seeker interface outlines or otherwise highlights the prop (indicating to the user that it is selected) and updates the breadcrumb buttons 207 to show the selected object location in the hierarchy. Toolbar buttons specific to the selection also appear to the right of the breadcrumb buttons 207.
Referring to FIG. 21, in some embodiments, in response to the user entering a Main space 213, the platform automatically establishes a network connection between the user and each of the other communicants occupying the selected zone. The user also may enter (and thereby establish a presence in) a space by selecting the space and clicking the Enter button; this causes the platform to move the user's sprite from its current location (i.e., Courtyard) to the selected space (i.e., Main). The settings of the user's speakers and microphone typically do not change when the user is moved from one location to another.
FIG. 22 shows a situation in which the user has double-clicked a wall object 290 in the view of the Main space 213 shown in FIG. 21.
In response to the user's command to execute the wall object 290, the seeker interface presents in the browsing area 166 of the tab 164 a 2.5-dimensional view of the contents of the wall object 290 and areas of the Main space 213 surrounding the wall object 290. In the embodiment shown in FIG. 22, the selected wall object 290 corresponds to a North Wall of the Main space 213. The North Wall contains a pair of view screen objects 289, 291 (labeled “2” and “3”, respectively) that are presented on the North Wall object 290. The view screen objects 289, 291 can be used to present the contents of data files that are associated with the North Wall of the Main space 213. The 2.5-dimensional view also shows a West Wall object 293 and an East Wall object 295 that are located on the left and right sides of the North Wall object 290, respectively. Each of the West Wall object 293 and the East Wall object 295 includes a respective view screen object (labeled “1” and “4”, respectively) that can be used to present the contents of respective data files.
The interface also shows in the minimap 256 a view of the Main space 213 and areas of the Acme space surrounding the Main space 213. The minimap 256 also shows a highlighted view 292 of the selected North Wall object 290 in the Main space 213.
The breadcrumb buttons 207 shown in the toolbar 168 of the tab 164 include a North Wall button 294 that corresponds to the current level in the hierarchical spatial model of the virtual area. The toolbar 168 includes a rotate left button 296 and a rotate right button 298 that allow the user to rotate the current view left and right by ninety degrees (90°) so that the user can view the contents of different walls of the Main space in the central viewing area of the 2.5-dimensional view of the Main space 213. The user also can double-click a different one of the walls that are shown in the minimap 256 in order to change the contents that are presented in the central viewing area of the 2.5-dimensional view of the Main space 213.
FIG. 23 shows the seeker interface after the user has selected the view screen object 291 (labeled “3”) on the North Wall in the view of the Main space 213 shown in FIG. 22. The user may have executed the view screen object 291 either by double-clicking any part of the view screen object 291 shown in the central viewing area of the 2.5-dimensional view shown in the browsing area 166 of the tab 164 or by double-clicking the corresponding view screen object in the minimap 256 shown in FIG. 22. In response to the user double-clicking on the view screen object 291 shown in FIG. 22, the browsing area 166 of the tab 164 shows a zoomed-in view of the view screen object 291 and areas of the North Wall object 290 surrounding the view screen object 291, as shown in FIG. 23. The user may double-click on any of the areas of the North Wall object 290 surrounding the view screen object 291 in the browsing area 166 shown in FIG. 23 in order to return to the browsing area and minimap views of the Main space 213 that are shown in FIG. 22. In the embodiment shown in FIG. 23, the minimap 256 shows a 2.5-dimensional view of the contents of the wall object 290 along with areas of the Main space surrounding the wall object 290; this view corresponds to the preceding level in the hierarchical spatial model of the virtual area. The breadcrumb buttons 207 include a Screen 3 button 302 that corresponds to the current level in the hierarchical spatial model of the virtual area. The toolbar 168 includes a Share button 304 that allows a user to designate a shared data file whose contents will be presented on the view screen object 291 (i.e., Screen 3) and thereby allow all communicants in the Main space 213 to simultaneously share the data file. The view screen object 291 shown in the browsing area 166 includes a Share link 306 that also allows the user to designate the shared data file.
Referring to FIG. 24, in response to a user selection of the Share button 304 or the Share link 306, the seeker interface opens a separate Select Source interface window 310 that allows the user to designate a data file whose contents will be shared on the view screen object 291 as described above in section IV. The Select Source interface includes a text box 312 for receiving a data file identifier (e.g., a local data file storage path name or a Uniform Resource Identifier (URI)) and a Browse button 314 that enables the user to browse different locations for the data file identifier. The data file identifier may be located on the client node 12 or another network node. The Select Source interface window 310 also includes a favorites button 316 that allows the user to browse a list of previously bookmarked files, applications, or URIs for the data file identifier.
Referring to FIG. 25, after the user has selected a data file identifier in the Select Source interface, the communications application 26 generates a thumbnail image of the selected data file. In some embodiments, the Scraper module is invoked to capture the thumbnail image of the selected data file. In some embodiments, the thumbnail image may be a sample of the main window of the shared application that is associated with the selected data file. The thumbnail image is displayed on the view screen object 291 in both the browsing area 166 and the minimap 256. In the illustrated embodiment, the designated data file corresponds to a PowerPoint® data file that contains a slide of a pie chart. The user can terminate the presentation of the data file on the view screen object 291 by selecting the clear icon 318.
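The patent does not disclose how the Scraper module captures the thumbnail. On a Windows client of the era (the document refers to the Windows XP® operating system), a window snapshot could be taken roughly as follows; this is a generic Win32 sketch, not the patented implementation:

    #include <windows.h>

    // Capture an image of a window into a GDI bitmap; hwnd is assumed to be the
    // main window of the shared application. The caller owns the returned bitmap.
    HBITMAP CaptureWindowImage(HWND hwnd) {
        RECT rc;
        GetWindowRect(hwnd, &rc);
        const int width = rc.right - rc.left;
        const int height = rc.bottom - rc.top;

        HDC screenDc = GetDC(NULL);
        HDC memDc = CreateCompatibleDC(screenDc);
        HBITMAP bitmap = CreateCompatibleBitmap(screenDc, width, height);
        HGDIOBJ previous = SelectObject(memDc, bitmap);

        // PrintWindow asks the window to paint itself into memDc even when the
        // window is partially obscured on screen.
        PrintWindow(hwnd, memDc, 0);

        SelectObject(memDc, previous);
        DeleteDC(memDc);
        ReleaseDC(NULL, screenDc);
        return bitmap;  // free with DeleteObject when no longer needed
    }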
One or more viewers can subscribe to the windows of the shared application showing the contents of the selected data file by clicking (or double-clicking) on the thumbnail image shown on the viewscreen object 291. Each viewer can view, control, edit, and manipulate the shared window content presented on the viewscreen object 291 subject to any governance rules that are associated with the selected data file or the zone containing the viewscreen object 291. If a viewer has control access to the shared window content, the viewer can input commands to the shared process executing on the sharer network node by using one or more input devices on the viewer's network node as described above in section IV. Assuming that realtime performance can be achieved over the respective network connections between the sharer network node and the viewer network nodes, the edits and other manipulations of the shared data file typically will appear to each of the collaborators as if they were made on the same network node.
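Forwarding a viewer's input to the sharer requires remapping pointer coordinates from the viewer's rendering of the shared image into the sharer's screen coordinates, as recited in the claims below. A minimal sketch of that remapping (structure and names are assumptions):

    // Map a pointer event from the viewer's copy of the shared image back into
    // the coordinate system of the shared window on the sharer's display.
    struct RectPx { int left, top, width, height; };
    struct PointPx { int x, y; };

    PointPx RemapToSharer(PointPx viewerPt, const RectPx& viewerImage,
                          const RectPx& sharerWindow) {
        const double scaleX = static_cast<double>(sharerWindow.width) / viewerImage.width;
        const double scaleY = static_cast<double>(sharerWindow.height) / viewerImage.height;
        PointPx out;
        out.x = sharerWindow.left + static_cast<int>((viewerPt.x - viewerImage.left) * scaleX);
        out.y = sharerWindow.top  + static_cast<int>((viewerPt.y - viewerImage.top) * scaleY);
        return out;
    }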
V. CONCLUSION
The embodiments that are described herein enable application sharing with high fidelity, realtime performance, viewer immersion, and privacy protection. Some embodiments also enable multichannel application sharing in which two or more communicants share applications and screen content with each other at the same time. These embodiments typically include an interface that allows each receiver to distinguish one shared window from another.
Other embodiments are within the scope of the claims.

Claims (42)

The invention claimed is:
1. A method, comprising, by a sharer network node:
on a local display of the sharer network node, displaying a representation of a virtual area, a representation of a first viewscreen object in the virtual area, a representation of a second viewscreen object in the virtual area, a representation of a user who is present in the virtual area and operating the sharer network node, and a respective representation of each of one or more other communicants who are present in the virtual area and respectively operating one or more other network nodes;
in a screen layout of windows on the local display, selecting multiple windows that are associated with a first software process based on user input in connection with the first viewscreen object, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
generating a first sequence of composite images of the selected windows as they are arranged in the screen layout at different respective times but free of obscuration in any area obscured by any of the unselected ones of the windows in the screen layout;
transmitting the first sequence of composite images from the sharer network node to a particular one of the one or more other network nodes in connection with the first viewscreen object; and
contemporaneously with the transmitting, communicating a second sequence of composite images of windows associated with a second software process between the sharer network node and the particular one of the one or more other network nodes in connection with the second viewscreen object.
2. The method of claim 1, wherein the selecting comprises selecting all of the windows in the screen layout that are associated with the first software process.
3. The method of claim 1, wherein the selecting comprises selecting ones of the windows in the screen layout that match a handle that is assigned to the first software process.
4. The method of claim 3, wherein the selecting comprises selecting a parent window and at least one associated child window that are created by the first software process.
5. The method of claim 1, wherein the generating comprises for each of the composite images in the first sequence:
determining a layering order of the selected windows in relation to each other that corresponds to the relative layering order of the selected windows in the screen layout;
for each of the selected windows, retrieving a respective image of the window; and
compositing the retrieved images into the composite image in accordance with the determined layering order.
6. The method of claim 5, wherein the determining comprises:
generating a z-order list that associates a respective z-order value with a respective window handle for each of the windows in the screen layout; and
deriving the layering order of the selected windows from the z-order list.
7. The method of claim 6, wherein the deriving comprises:
for each of the z-order values in the z-order list, matching the associated window handle with the window handle of a respective one of the selected windows; and
ordering the selected windows in the layering order in accordance with the respective z-order values in the z-order list associated with the ones of the z-order values determined to match the window handles of the selected windows.
8. The method of claim 5, wherein the generating further comprises determining two-dimensional locations of the selected windows in the screen layout for each of the composite images in the first sequence.
9. The method of claim 1, wherein:
the selecting comprises selecting ones of the windows that are associated with a designated group of software processes; and
the generating comprises, for each of the composite images in the first sequence:
determining a layering order of the selected windows in relation to each other that corresponds to the relative layering order of the selected windows in the screen layout;
for each of the selected windows, retrieving a respective image of the window; and
compositing the retrieved images into the composite image in accordance with the determined layering order.
10. The method of claim 1, wherein a respective image of each of the selected windows is stored in a respective memory buffer, and the generating comprises retrieving each of the images from the respective memory buffers and compositing the retrieved images into the composite image for each of the composite images in the first sequence.
11. The method of claim 1, further comprising by the sharer network node setting the windows that are associated with the first software process to a layered windows style such that a respective image of each of the selected windows is stored in a respective memory buffer.
12. The method of claim 1, wherein the transmitting is performed based on a request from the viewer network node to view screen data associated with the first viewscreen object.
13. The method of claim 12, further comprising transmitting the first sequence of composite images from the sharer network node to multiple of the other network nodes based on a respective request from each of the multiple other network nodes to view screen data associated with the first viewscreen object, wherein the first sequence of composite images is transmitted to the multiple other network nodes contemporaneously.
14. The method of claim 1, wherein the virtual area comprises a plurality of zones, each zone of the plurality of zones supporting establishment of communicant presence in the zone; and further comprising:
on the local display, displaying a visualization of the virtual area, wherein the visualization comprises a spatial layout of the zones;
on the local display, presenting navigation controls and interaction controls, wherein the navigation controls enable the user to specify where to establish a presence in the virtual area and the interaction controls enable the user to manage interactions with the one or more other communicants in the virtual area;
establishing a respective presence of the user in one or more of the zones of the virtual area in response to input received via the navigation controls; and
on the local display, depicting the representations of the user and the one or more other communicants in the zones in which the user and the one or more other communicants are present.
15. The method of claim 1, further comprising presenting the first sequence of composite images on the local display in connection with the first viewscreen object.
16. The method of claim 1, further comprising by the sharer network node:
receiving locally-generated commands that are derived from local input device events on the sharer network node;
receiving remotely-generated commands that are derived from remote input device events on a remote viewer network node;
processing the received locally-generated commands and the received remotely-generated commands into a command sequence;
passing the command sequence to the first software process; and
presenting the windows in the screen layout on the local display in accordance with the received command sequence.
17. The method of claim 16, wherein the processing comprises remapping screen location parameter values in the remotely-generated commands from a coordinate system of a shared application image received from the sharer network node to a coordinate system of a main window on the local display.
18. The method of claim 17, wherein the processing comprises arranging into the command sequence (i) the remotely-generated commands with the remapped screen location parameter values and (ii) the locally-generated commands.
19. The method of claim 18, wherein the arranging comprises ordering (i) the remotely-generated commands with the remapped screen location parameter values and (ii) the locally-generated commands based on respective times that the locally-generated commands are derived from the local input device events and respective times that the remotely-generated commands are received.
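Claims 16 through 19 describe a two-source command merge. A sketch, assuming each command carries a timestamp: local commands are stamped when they are derived from the input-device event, remote commands when they are received.

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Command:
        timestamp: float
        payload: dict = field(compare=False)  # e.g. event type, coordinates

    def merge_command_sequence(local: list[Command],
                               remote: list[Command]) -> list[Command]:
        # Arrange both streams into one time-ordered sequence; this is the
        # sequence that is then passed to the shared software process.
        return list(heapq.merge(sorted(local), sorted(remote)))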
20. The method of claim 17, further comprising remapping screen location parameter values in the commands derived from the remote input device events from a coordinate system of a main window on a remote display of the viewer network node to a coordinate system of the shared application image.
21. The method of claim 20, wherein the remapping is performed on the viewer network node, and further comprising transmitting the remotely-generated commands with the remapped screen location parameter values to the sharer network node.
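The remapping in claims 17, 20, and 21 is plain rectangle-to-rectangle coordinate scaling. A sketch under the assumption that the shared application image is drawn scaled into a viewport rectangle of the viewer's main window; all names here are illustrative:

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: int
        y: int
        width: int
        height: int

    def viewer_to_image(px: int, py: int, viewport: Rect,
                        image_w: int, image_h: int) -> tuple[int, int]:
        # Claim 20: viewer main-window coordinates -> shared-image coordinates.
        ix = (px - viewport.x) * image_w / viewport.width
        iy = (py - viewport.y) * image_h / viewport.height
        return round(ix), round(iy)

    def image_to_sharer(ix: int, iy: int,
                        layout_origin: tuple[int, int]) -> tuple[int, int]:
        # Claim 17: shared-image coordinates -> the sharer's main-window
        # coordinates, modeled here as a simple translation.
        return ix + layout_origin[0], iy + layout_origin[1]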
22. The method of claim 16, wherein the command sequence controls operation of the first software process.
23. The method of claim 1, wherein:
in the screen layout, at least one of the selected windows is obscured by a particular unselected one of the windows in the screen layout; and
the generating comprises retrieving a respective image of each of the selected windows from a respective memory buffer for each of respective ones of the composite images in the first sequence, and, from the retrieved images, generating the composite image of the selected windows as they are arranged in the screen layout but free of obscuration in any area obscured by the particular unselected window in the screen layout.
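What makes claim 23 work is that each frame is assembled from per-window memory buffers rather than from a screen grab, so a selected window contributes its full image even where an unselected window covers it on the physical display. A sketch reusing composite_frame from the claim-5 note; all_windows and selected_handles are illustrative:

    def frame_without_occlusion(all_windows, selected_handles, z_order, size):
        selected = [w for w in all_windows if w.handle in selected_handles]
        # Unselected windows are never painted, so any area they obscure on
        # screen comes out un-obscured in the composite.
        return composite_frame(selected, z_order, size)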
24. The method of claim 1, wherein the transmitting is based on a subscription of the particular one of the other network nodes to content associated with the first viewscreen object.
25. The method of claim 24, wherein
multiple of the communicants operating respective ones of the other network nodes are contemporaneously subscribed to content associated with the first viewscreen object; and
the transmitting comprises contemporaneously transmitting the sequence of composite images to the network nodes being operated by the communicants who are subscribed to the content associated with the first viewscreen object.
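Claims 24 and 25 describe subscription-driven fan-out keyed by viewscreen object. A minimal sketch, assuming an in-memory registry and a connection object whose send method stands in for the real transport:

    from collections import defaultdict

    class ViewscreenPublisher:
        def __init__(self):
            self._subscribers = defaultdict(set)  # viewscreen id -> connections

        def subscribe(self, viewscreen_id, connection):
            self._subscribers[viewscreen_id].add(connection)

        def publish(self, viewscreen_id, frame_bytes):
            # Every node currently subscribed to this viewscreen object gets
            # the same frame, so all subscribers receive the sequence
            # contemporaneously.
            for conn in self._subscribers[viewscreen_id]:
                conn.send(frame_bytes)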
26. The method of claim 24, wherein the virtual area is associated with a governance rule that controls access to the sequence of composite images.
27. The method of claim 26, wherein the governance rule defines one or more conditions on abilities of communicants in the virtual area to subscribe to the transmitted sequence of composite images.
28. The method of claim 26, wherein the governance rule defines one or more conditions on one or more types of access to the transmitted sequence of composite images.
29. The method of claim 28, wherein the governance rule defines a condition on abilities of communicants in the virtual area to view the transmitted sequence of composite images.
30. The method of claim 28, wherein the governance rule defines a condition on abilities of communicants in the virtual area to control the transmitted sequence of composite images.
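Claims 26 through 30 can be modeled as a rule object holding one predicate per access type (subscribe, view, control), consulted before the corresponding action is allowed. A sketch; the rule shape and names are assumptions of this note:

    from typing import Callable, Dict

    Predicate = Callable[[str], bool]  # communicant id -> allowed?

    class GovernanceRule:
        def __init__(self, conditions: Dict[str, Predicate]):
            self._conditions = conditions

        def permits(self, communicant_id: str, access_type: str) -> bool:
            check = self._conditions.get(access_type)
            return check(communicant_id) if check else True

    # Example: zone members may subscribe and view; nobody may take control.
    members = {"alice", "bob"}
    rule = GovernanceRule({
        "subscribe": members.__contains__,
        "view": members.__contains__,
        "control": lambda _: False,
    })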
31. The method of claim 1, further comprising selecting one or more windows that are associated with a second software process, and generating the first sequence of composite images from the selected windows of the first and second software processes as they are arranged in the screen layout at different respective times but free of obscuration in any area obscured by any unselected ones of the windows in the screen layout.
32. The method of claim 1, wherein the communicating comprises by the sharer network node receiving the second sequence of composite images from the particular one of the one or more other network nodes.
33. The method of claim 1, wherein the communicating comprises by the sharer network node transmitting the second sequence of composite images to the particular one of the one or more other network nodes.
34. A method, comprising:
in a screen layout of windows on a local display of a sharer network node, selecting ones of the windows that are associated with a software process, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
on the sharer network node, generating a composite image of the selected windows as they are arranged in the screen layout but free of obscuration in any area obscured by any of the unselected ones of the windows in the screen layout, wherein the generating comprises generating the composite image of the selected windows that includes any contents of the selected windows that are located outside a visible desktop window that contains the screen layout; and
transmitting the composite image from the sharer network node to a viewer network node.
35. A method, comprising:
on a local display of a sharer network node, displaying a visualization of a virtual area comprising a plurality of zones, wherein the visualization comprises a spatial layout of the plurality of zones, each zone of the plurality of zones supports establishment of communication sessions between network nodes associated with the zone, a sharer at the sharer network node and a viewer at a viewer network node both have presence in a particular one of the zones of the plurality that comprises viewscreen objects, and the displaying comprises displaying respective representations of the viewer, the sharer, and the viewscreen objects in the virtual area;
on the local display, presenting navigation controls and interaction controls, wherein the navigation controls enable the sharer to specify where to establish a presence in the virtual area and the interaction controls enable the sharer to manage interactions with the viewer in the virtual area;
in response to input received via the navigation controls, establishing a presence of the sharer in the particular zone of the virtual area;
in a screen layout of windows on the local display, selecting ones of the windows that are associated with a software process, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
on the sharer network node, generating a composite image of the selected windows as they are arranged in the screen layout but free of obscuration in any area obscured by any of the unselected ones of the windows in the screen layout;
transmitting the composite image from the sharer network node to the viewer network node;
performing the selecting, the generating, and the transmitting with respect to one or more other software processes executing on one or more other network nodes; and
on the local display, presenting the respective composite images transmitted from the one or more other network nodes in connection with different respective ones of the viewscreen objects in the particular zone of the virtual area at the same time.
36. The method of claim 35, further comprising for each of the viewscreen objects in connection with which a respective one of the composite images is presented, on the local display showing a graphical depiction of the respective sharer who is operating the network node from which the respective composite image is transmitted.
37. A method, comprising:
in a screen layout of windows on a local display of a sharer network node, selecting ones of the windows that are associated with a software process, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
on the sharer network node, generating a composite image of the selected windows as they are arranged in the screen layout but free of obscuration in any area obscured by any of the unselected ones of the windows in the screen layout;
transmitting the composite image from the sharer network node to a viewer network node;
sending a request to share screen data associated with a software process executing on the viewer network node; and
at the sharer network node, receiving a composite image of shared screen data from the viewer network node, wherein the receiving and the transmitting are performed contemporaneously.
38. At least one non-transitory, tangible computer-readable medium having computer-readable program code embodied therein, the computer-readable program code adapted to be executed by a computer to perform operations comprising:
on a local display of a sharer network node, displaying a representation of a virtual area, a representation of a first viewscreen object in the virtual area, a representation of a second viewscreen object in the virtual area, a representation of a user who is present in the virtual area and operating the sharer network node, and a respective representation of each of one or more other communicants who are present in the virtual area and respectively operating one or more other network nodes;
in a screen layout of windows on the local display, selecting multiple windows that are associated with a first software process based on user input in connection with the first viewscreen object, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
generating a first sequence of composite images of the selected windows as they are arranged in the screen layout at different respective times but free of obscuration in any area obscured by any of the unselected ones of the windows in the screen layout;
transmitting the first sequence of composite images from the sharer network node to a particular one of the one or more other network nodes in connection with the first viewscreen object; and
contemporaneously with the transmitting, communicating a second sequence of composite images of windows associated with a second software process between the sharer network node and the particular one of the one or more other network nodes in connection with the second viewscreen object.
39. An apparatus, comprising:
a memory storing computer-readable instructions; and
a data processing unit coupled to the memory, operable to execute the instructions, and based at least in part on the execution of the instructions operable to perform operations comprising
on a local display of a sharer network node, displaying a representation of a virtual area, a representation of a first viewscreen object in the virtual area, a representation of a second viewscreen object in the virtual area, a representation of a user who is present in the virtual area and operating the sharer network node, and a respective representation of each of one or more other communicants who are present in the virtual area and respectively operating one or more other network nodes;
in a screen layout of windows on the local display, selecting multiple windows that are associated with a first software process based on user input in connection with the first viewscreen object, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
generating a first sequence of composite images of the selected windows as they are arranged in the screen layout at different respective times but free of obscuration in any area obscured by any of the unselected ones of the windows in the screen layout;
transmitting the first sequence of composite images from the sharer network node to a particular one of the one or more other network nodes in connection with the first viewscreen object; and
contemporaneously with the transmitting, communicating a second sequence of composite images of windows associated with a second software process between the sharer network node and the particular one of the one or more other network nodes in connection with the second viewscreen object.
40. A method, comprising:
on a local display of a sharer network node, displaying a representation of a virtual area, a representation of a first viewscreen object in the virtual area, and a representation of a second viewscreen object in the virtual area;
in a screen layout of windows on the local display of the sharer network node, selecting ones of the windows that are associated with a software process, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
based on user input, associating the software process with the first viewscreen object;
on the sharer network node, generating a first sequence of successive composite images of the selected windows as they are arranged in the screen layout at different respective times but free of obscuration by any other windows in the screen layout;
transmitting the first sequence of composite images from the sharer network node to a first viewer network node based on a subscription of the first viewer network node to content associated with the first viewscreen object;
selecting a second set of one or more of the windows in the screen layout;
generating a second sequence of successive composite images of the selected windows in the second set as they are arranged in the screen layout at different respective times but free of obscuration in any area obscured by any of the windows in the screen layout outside the second set; and
based on a subscription of a second viewer network node to content associated with the second viewscreen object, transmitting the second sequence of composite images from the sharer network node to the second viewer network node, wherein the first sequence of composite images and the second sequence of composite images are respectively transmitted to the first and second viewer network nodes contemporaneously.
41. A method, comprising:
on a local display of a sharer network node, displaying a representation of a virtual area, a representation of a first viewscreen object in the virtual area, and a representation of a second viewscreen object in the virtual area;
in a screen layout of windows on the local display of the sharer network node, selecting ones of the windows that are associated with a first software process, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
based on user input, associating the first software process with the first viewscreen object;
on the sharer network node, generating a first sequence of successive composite images of the selected windows as they are arranged in the screen layout at different respective times but free of obscuration by any other windows in the screen layout;
transmitting the first sequence of composite images from the sharer network node to a first viewer network node based on a subscription of the first viewer network node to content associated with the first viewscreen object;
based on user input, associating a second software process with the second viewscreen object;
selecting a second set of one or more of the windows that are associated with the second software process;
generating a second sequence of successive composite images of the selected windows in the second set as they are arranged in the screen layout at different respective times but free of obscuration in any area obscured by any of the windows in the screen layout outside the second set; and
based on a subscription of a second viewer network node to content associated with the second viewscreen object, transmitting the second sequence of composite images from the sharer network node to the second viewer network node, wherein the first sequence of composite images and the second sequence of composite images are respectively transmitted to the first and second viewer network nodes contemporaneously.
42. A method, comprising:
on a local display of a first network node, displaying a representation of a virtual area, a representation of a first viewscreen object in the virtual area, and a representation of a second viewscreen object in the virtual area, wherein a first communicant is operating the first network node and a second communicant is operating a second network node, the first and second communicants are present in a particular zone of the virtual area comprising the first viewscreen object and the second viewscreen object, and the displaying comprises displaying in the virtual area a representation of the first communicant and a representation of the second communicant;
in a screen layout of windows on the local display of the first network node, selecting ones of the windows that are associated with a software process, wherein at least one of the selected windows is at least partially obscured by one or more unselected ones of the windows in the screen layout;
based on input from the first communicant, the first network node associating the software process with the first viewscreen object;
the first network node generating a first sequence of successive composite images of the selected windows as they are arranged in the screen layout at different respective times but free of obscuration by any other windows in the screen layout;
transmitting the first sequence of composite images from the first network node to the second network node based on a subscription of the second network node to content associated with the first viewscreen object;
based on input from the first communicant, the first network node subscribing to content associated with the second viewscreen object;
the first network node receiving from the second network node a second sequence of composite images of one or more windows associated with a software process executing on the second network node, wherein the transmitting of the first sequence of composite images from the first network node to the second network node is contemporaneous with the receiving of the second sequence of composite images by the first network node from the second network node; and
presenting the received second sequence of composite images on the local display in association with the second viewscreen object.
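Claims 40 through 42 all turn on driving two viewscreen objects at once: publishing two sequences to different subscribers, or publishing one sequence while consuming another, contemporaneously. A sketch using Python threads together with the ViewscreenPublisher from the claim-25 note; capture_frame and the commented-out connection wiring are illustrative stand-ins:

    import threading
    import time

    def share_loop(publisher, viewscreen_id, capture_frame, stop, fps=10):
        # Publish a sequence of composite images for one viewscreen object
        # until asked to stop.
        while not stop.is_set():
            publisher.publish(viewscreen_id, capture_frame())
            time.sleep(1 / fps)

    # Two loops on two viewscreen objects run concurrently, so a node can
    # transmit one sequence while receiving (or transmitting) another:
    stop = threading.Event()
    # threading.Thread(target=share_loop, args=(pub, "screen-1", grab1, stop)).start()
    # threading.Thread(target=share_loop, args=(pub, "screen-2", grab2, stop)).start()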
US14/666,298, Application sharing: priority date 2007-10-24, filed 2015-03-23, granted as USRE46309E1; Active, expires 2031-06-05.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US14/666,298 | 2007-10-24 | 2015-03-23 | Application sharing

Applications Claiming Priority (7)

Application Number | Priority Date | Filing Date | Title
US11/923,634 (US7844724B2) | 2007-10-24 | 2007-10-24 | Automated real-time data stream switching in a shared virtual area communication environment
US11/923,629 (US7769806B2) | 2007-10-24 | 2007-10-24 | Automated real-time data stream switching in a shared virtual area communication environment
US4271408P (provisional) | 2008-04-05 | 2008-04-05 |
US12037208P (provisional) | 2008-12-05 | 2008-12-05 |
US12/354,709 (US8397168B2) | 2008-04-05 | 2009-01-15 | Interfacing with a spatial virtual communication environment
US12/418,270 (US8407605B2) | 2009-04-03 | 2009-04-03 | Application sharing
US14/666,298 (USRE46309E1) | 2007-10-24 | 2015-03-23 | Application sharing

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US12/418,270 (reissued as the present patent; US8407605B2) | Application sharing | 2007-10-24 | 2009-04-03

Publications (1)

Publication Number | Publication Date
USRE46309E1 | 2017-02-14

Family ID: 42827176

Family Applications (2)

Application Number | Title | Priority Date | Filing Date | Status
US12/418,270 (US8407605B2) | Application sharing | 2007-10-24 | 2009-04-03 | Active, expires 2031-06-05
US14/666,298 (USRE46309E1) | Application sharing | 2007-10-24 | 2015-03-23 | Active, expires 2031-06-05

Country Status (6)

Country | Publication(s)
US (2) | US8407605B2; USRE46309E1
EP (2) | EP2645267A1
KR (1) | KR20110134940A
CN (1) | CN102804161A
IL (1) | IL215387A0
WO (1) | WO2010114724A2
US20080244458A1 (en) 2007-03-30 2008-10-02 Microsoft Corporation Remoting of Windows Presentation Framework based Applications in a Non-Composed Desktop
US20080250115A1 (en) 2007-04-04 2008-10-09 Vaidy Iyer Enterprise Notification System
US20080252637A1 (en) 2007-04-14 2008-10-16 Philipp Christian Berndt Virtual reality-based teleconferencing
US20080262911A1 (en) 2007-04-20 2008-10-23 Utbk, Inc. Methods and Systems to Search in Virtual Reality for Real Time Communications
US20080262910A1 (en) 2007-04-20 2008-10-23 Utbk, Inc. Methods and Systems to Connect People via Virtual Reality for Real Time Communications
US20080263460A1 (en) 2007-04-20 2008-10-23 Utbk, Inc. Methods and Systems to Connect People for Virtual Meeting in Virtual Reality
US7451181B2 (en) 1998-09-24 2008-11-11 Fujitsu Limited Apparatus for controlling a shared screen
US20080301557A1 (en) 2007-06-04 2008-12-04 Igor Kotlyar Systems, methods and software products for online dating
WO2009000028A1 (en) 2007-06-22 2008-12-31 Global Coordinate Software Limited Virtual 3d environments
US7474741B2 (en) 2003-01-20 2009-01-06 Avaya Inc. Messaging advise in presence-aware networks
US20090018912A1 (en) 2007-07-12 2009-01-15 Utbk, Inc. Systems and Methods to Provide Communication Connections via Partners
US20090079816A1 (en) 2007-09-24 2009-03-26 Fuji Xerox Co., Ltd. Method and system for modifying non-verbal behavior for social appropriateness in video conferencing and other computer mediated communications
US20090089364A1 (en) 2007-10-02 2009-04-02 Hamilton Ii Rick A Arrangements for interactivity between a virtual universe and the world wide web
US20090089685A1 (en) 2007-09-28 2009-04-02 Mordecai Nicole Y System and Method of Communicating Between A Virtual World and Real World
US20090096810A1 (en) * 2007-10-11 2009-04-16 Green Brian D Method for selectively remoting windows
US20090113314A1 (en) 2007-10-30 2009-04-30 Dawson Christopher J Location and placement of avatars in virtual worlds
US20090112997A1 (en) 2007-10-25 2009-04-30 Cisco Technology, Inc. Utilizing Presence Data Associated with Web Item
US20090113066A1 (en) 2007-10-24 2009-04-30 David Van Wie Automated real-time data stream switching in a shared virtual area communication environment
US7533346B2 (en) 2002-01-09 2009-05-12 Dolby Laboratories Licensing Corporation Interactive spatalized audiovisual system
US20090132943A1 (en) 2007-02-13 2009-05-21 Claudia Juliana Minsky Method and System for Creating a Multifunctional Collage Useable for Client/Server Communication
US20090128567A1 (en) 2007-11-15 2009-05-21 Brian Mark Shuster Multi-instance, multi-user animation with coordinated chat
US20090193077A1 (en) 2008-01-25 2009-07-30 Hiroshi Horii Communication control
CN101499080A (en) 2008-02-01 2009-08-05 网秦无限(北京)科技有限公司 Method and system for fast acquiring information service on mobile terminal
US20090199095A1 (en) 2008-02-01 2009-08-06 International Business Machines Corporation Avatar cloning in a virtual world
US20090222742A1 (en) 2008-03-03 2009-09-03 Cisco Technology, Inc. Context sensitive collaboration environment
US20090235180A1 (en) * 2008-03-17 2009-09-17 Jun Feng Liu Method and Apparatus for Restoring an Occluded Window in Application Sharing Software
US20090241037A1 (en) 2008-03-18 2009-09-24 Nortel Networks Limited Inclusion of Web Content in a Virtual Environment
US20090247196A1 (en) 2008-03-31 2009-10-01 Lg Electronic Inc. Mobile terminal for performing instant messaging service
US20090254840A1 (en) 2008-04-04 2009-10-08 Yahoo! Inc. Local map chat
US20090254843A1 (en) 2008-04-05 2009-10-08 Social Communications Company Shared virtual area communication environment based apparatus and methods
US20090251457A1 (en) 2008-04-03 2009-10-08 Cisco Technology, Inc. Reactive virtual environment
US20090259948A1 (en) 2008-04-15 2009-10-15 Hamilton Ii Rick A Surrogate avatar control in a virtual universe
US7616624B2 (en) 2006-07-20 2009-11-10 Avaya Inc. Determining user availability based on the expected duration of a new event
US20090286605A1 (en) 2008-05-19 2009-11-19 Hamilton Ii Rick A Event determination in a virtual universe
US20090288007A1 (en) 2008-04-05 2009-11-19 Social Communications Company Spatial interfaces for realtime networked communications
US20090300521A1 (en) 2008-05-30 2009-12-03 International Business Machines Corporation Apparatus for navigation and interaction in a virtual meeting place
US20090307620A1 (en) 2008-06-10 2009-12-10 Hamilton Ii Rick A System for concurrently managing multiple avatars
US7640300B2 (en) 2002-06-10 2009-12-29 Microsoft Corporation Presence and notification system for maintaining and communicating information
US20100058229A1 (en) * 2008-09-02 2010-03-04 Palm, Inc. Compositing Windowing System
US20100058202A1 (en) 2008-08-28 2010-03-04 Mohamed Rostom Method system and program product for providing enabling an interactive and social search engine
US7680098B2 (en) 2006-07-20 2010-03-16 Avaya Inc. Determining group availability on different communication media
US20100077034A1 (en) 2008-09-22 2010-03-25 International Business Machines Corporation Modifying environmental chat distance based on avatar population density in an area of a virtual world
US20100131868A1 (en) * 2008-11-26 2010-05-27 Cisco Technology, Inc. Limitedly sharing application windows in application sharing sessions
US7734692B1 (en) 2005-07-22 2010-06-08 Oracle America, Inc. Network collaboration system with private voice chat
US20100146118A1 (en) 2008-12-05 2010-06-10 Social Communications Company Managing interactions in a network communications environment
US20100146085A1 (en) 2008-12-05 2010-06-10 Social Communications Company Realtime kernel
US20100162121A1 (en) 2008-12-22 2010-06-24 Nortel Networks Limited Dynamic customization of a virtual world
US20100169801A1 (en) 2002-11-21 2010-07-01 Aol Llc Multiple avatar personalities
US20100169796A1 (en) 2008-12-28 2010-07-01 Nortel Networks Limited Visual Indication of Audio Context in a Computer-Generated Virtual Environment
US20100164956A1 (en) 2008-12-28 2010-07-01 Nortel Networks Limited Method and Apparatus for Monitoring User Attention with a Computer-Generated Virtual Environment
US20100169837A1 (en) 2008-12-29 2010-07-01 Nortel Networks Limited Providing Web Content in the Context of a Virtual Environment
US20100169799A1 (en) 2008-12-30 2010-07-01 Nortel Networks Limited Method and Apparatus for Enabling Presentations to Large Numbers of Users in a Virtual Environment
US7765259B2 (en) 2006-12-05 2010-07-27 Avaya Inc. System and method for aggregation of user conversations and visualizing personal communications map
US20100211880A1 (en) 2009-02-13 2010-08-19 International Business Machines Corporation Virtual world viewer
US20100221693A1 (en) 2006-03-31 2010-09-02 Rakesh Kumar Gupta Instant Messaging For A Virtual Learning Community
US20100228560A1 (en) 2009-03-04 2010-09-09 Avaya Inc. Predictive buddy list-reorganization based on call history information
US20100228547A1 (en) 2009-03-06 2010-09-09 At&T Intellectual Property I, Lp. Method and apparatus for analyzing discussion regarding media programs
US20100235501A1 (en) 2009-03-16 2010-09-16 Avaya Inc. Advanced Availability Detection
US20100241432A1 (en) 2009-03-17 2010-09-23 Avaya Inc. Providing descriptions of visually presented information to video teleconference participants who are not video-enabled
US20100246570A1 (en) 2009-03-24 2010-09-30 Avaya Inc. Communications session preparation method and apparatus
US20100251127A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for managing trusted relationships in communication sessions using a graphical metaphor
US20100257450A1 (en) 2009-04-03 2010-10-07 Social Communications Company Application sharing
US7813488B2 (en) 2003-09-29 2010-10-12 Siemens Enterprise Communications, Inc. System and method for providing information regarding an identity's media availability
US20100262550A1 (en) 2009-04-08 2010-10-14 Avaya Inc. Inter-corporate collaboration overlay solution for professional social networks
US20100274848A1 (en) 2008-12-05 2010-10-28 Social Communications Company Managing network communications between network nodes and stream transport protocol
US20100287274A1 (en) 2009-05-08 2010-11-11 Canon Kabushiki Kaisha Efficient network utilization using multiple physical interfaces
JP2010535363A (en) 2007-03-01 2010-11-18 ソニー コンピュータ エンタテインメント アメリカ リミテッド ライアビリテイ カンパニー Virtual world avatar control, interactivity and communication interactive messaging
US7840668B1 (en) 2007-05-24 2010-11-23 Avaya Inc. Method and apparatus for managing communication between participants in a virtual environment
US7844724B2 (en) 2007-10-24 2010-11-30 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US20100322395A1 (en) 2009-06-22 2010-12-23 Avaya Inc. Unified communications appliance
US7908663B2 (en) 2004-04-20 2011-03-15 Microsoft Corporation Abstractions and automation for enhanced sharing and collaboration
US20110106662A1 (en) 2009-10-30 2011-05-05 Matthew Stinchcomb System and method for performing interactive online shopping
US7958453B1 (en) 2006-09-29 2011-06-07 Len Bou Taing System and method for real-time, multi-user, interactive and collaborative environments on the web
US20110169927A1 (en) 2010-01-13 2011-07-14 Coco Studios Content Presentation in a Three Dimensional Environment
US20110196930A1 (en) 2004-09-20 2011-08-11 Jitendra Chawla Methods and apparatuses for reporting based on attention of a user during a collaboration session
US20110231781A1 (en) 2010-03-19 2011-09-22 International Business Machines Corporation System and method for virtual object sharing and management in virtual worlds
US20110274104A1 (en) 2007-10-24 2011-11-10 Social Communications Company Virtual area based telephony communications
US20110302509A1 (en) 2007-10-24 2011-12-08 Social Communications Company Promoting communicant interactions in a network communications environment
US20120066306A1 (en) 2010-09-11 2012-03-15 Social Communications Company Relationship based presence indicating in virtual area contexts
US20120124486A1 (en) * 2000-10-10 2012-05-17 Addnclick, Inc. Linking users into live social networking interactions based on the users' actions relative to similar content
US20120216131A1 (en) 2009-01-15 2012-08-23 Social Communications Company Persistent network resource and virtual area associations for realtime collaboration
US20120246582A1 (en) 2008-04-05 2012-09-27 Social Communications Company Interfacing with a spatial virtual communications environment
US20120254858A1 (en) 2009-01-15 2012-10-04 Social Communications Company Creating virtual areas for realtime communications
US20130073978A1 (en) 2011-09-16 2013-03-21 Social Communications Company Capabilities based management of virtual areas
US20140129644A1 (en) * 2002-04-22 2014-05-08 Rosebud LMS, Inc Method and software for enabling n-way collaborative work over a network of computers

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4271408A (en) 1978-10-17 1981-06-02 Stanley Electric Co., Ltd. Colored-light emitting display
US6785708B1 (en) * 1996-10-30 2004-08-31 Avaya Inc. Method and apparatus for synchronizing browse and chat functions on a computer network
US20050080666A1 (en) * 2003-10-09 2005-04-14 Laura Treibitz Doll history software

Patent Citations (297)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5995096A (en) * 1991-10-23 1999-11-30 Hitachi, Ltd. Conference display control method and apparatus for an electronic conference for displaying either shared or local data and transferring local data
US5938724A (en) 1993-03-19 1999-08-17 Ncr Corporation Remote collaboration system that stores annotations to the image at a separate location from the image
US6047314A (en) * 1993-03-19 2000-04-04 Ncr Corporation Remote collaboration system with multiple host computers using multiple applications
US6392760B1 (en) 1993-04-22 2002-05-21 Avaya Technology Corp. Multimedia communications network
US5471318A (en) 1993-04-22 1995-11-28 At&T Corp. Multimedia communications network
US5491743A (en) 1994-05-24 1996-02-13 International Business Machines Corporation Virtual conference system and terminal apparatus therefor
US5758110A (en) 1994-06-17 1998-05-26 Intel Corporation Apparatus and method for application sharing in a graphic user interface
US5727155A (en) * 1994-09-09 1998-03-10 Intel Corporation Method and apparatus for dynamically controlling a remote system's access to shared applications on a host system
US5627978A (en) 1994-12-16 1997-05-06 Lucent Technologies Inc. Graphical user interface for multimedia call set-up and call handling in a virtual conference on a desktop computer conferencing system
US6304283B1 (en) * 1995-01-31 2001-10-16 Canon Kabushiki Kaisha Conference apparatus and method for realistically reproducing image data and shared board data
US5956038A (en) 1995-07-12 1999-09-21 Sony Corporation Three-dimensional virtual reality space sharing method and system, an information recording medium and method, an information transmission medium and method, an information processing method, a client terminal, and a shared server terminal
US5737533A (en) 1995-10-26 1998-04-07 Wegener Internet Projects Bv System for generating a virtual reality scene in response to a database search
US6219045B1 (en) 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
US7181690B1 (en) 1995-11-13 2007-02-20 Worlds. Com Inc. System and method for enabling users to interact in a virtual space
US5793365A (en) 1996-01-02 1998-08-11 Sun Microsystems, Inc. System and method providing a computer user interface enabling access to distributed workgroup members
US6570587B1 (en) 1996-07-26 2003-05-27 Veon Ltd. System and method and linking information to a video
US6862625B1 (en) 1996-09-27 2005-03-01 Avaya Technology Corp. Method and apparatus for real time network communication
US5764916A (en) 1996-09-27 1998-06-09 Ichat, Inc. Method and apparatus for real time communication over a computer network
US6057856A (en) 1996-09-30 2000-05-02 Sony Corporation 3D virtual reality multi-user interaction with superimposed positional information display for each user
US6226655B1 (en) 1996-10-08 2001-05-01 Netjumper, Inc. Method and apparatus for retrieving data from a network using linked location identifiers
US7165213B1 (en) 1996-10-30 2007-01-16 Avaya Technology Corp. Method and system for coordinating media and messaging operations in an information processing system
US7263526B1 (en) 1996-10-30 2007-08-28 Avaya Technology Corp. Method and apparatus for embedding chat functions in a web page
US5949414A (en) * 1996-10-31 1999-09-07 Canon Kabushiki Kaisha Window control with side conversation and main conference layers
US5982372A (en) 1996-11-14 1999-11-09 International Business Machines Corp. Visual metaphor for shortcut navigation in a virtual world
US7145898B1 (en) 1996-11-18 2006-12-05 Mci Communications Corporation System, method and article of manufacture for selecting a gateway of a hybrid communication system architecture
US6335927B1 (en) 1996-11-18 2002-01-01 Mci Communications Corporation System and method for providing requested quality of service in a hybrid network
US6308199B1 (en) * 1997-08-11 2001-10-23 Fuji Xerox Co., Ltd. Cooperative work support system for managing a window display
US6567980B1 (en) 1997-08-14 2003-05-20 Virage, Inc. Video cataloger system with hyperlinked output
US6005578A (en) 1997-09-25 1999-12-21 Mindsphere, Inc. Method and apparatus for visual navigation of information objects
US7184037B2 (en) 1997-10-14 2007-02-27 Koninklijke Philips Electronics N.V. Virtual environment navigation aid
US6704784B2 (en) 1998-01-23 2004-03-09 Sony Corporation Information processing apparatus and method, information processing system and program providing medium
US6833863B1 (en) 1998-02-06 2004-12-21 Intel Corporation Method and apparatus for still image capture during video streaming operations of a tethered digital camera
US6396509B1 (en) 1998-02-21 2002-05-28 Koninklijke Philips Electronics N.V. Attention-based interaction in a virtual environment
US6426778B1 (en) 1998-04-03 2002-07-30 Avid Technology, Inc. System and method for providing interactive components in motion video
US6380952B1 (en) 1998-04-07 2002-04-30 International Business Machines Corporation System for continuous display and navigation in a virtual-reality world
US5999206A (en) 1998-05-22 1999-12-07 Futaba Denshi Kogyo Kabushiki Kaisha Device for expanding light-amount correction dynamic range
US5999208A (en) * 1998-07-15 1999-12-07 Lucent Technologies Inc. System for implementing multiple simultaneous meetings in a virtual reality mixed media meeting room
US6119147A (en) * 1998-07-28 2000-09-12 Fuji Xerox Co., Ltd. Method and system for computer-mediated, multi-modal, asynchronous meetings in a virtual space
US6731314B1 (en) 1998-08-17 2004-05-04 Muse Corporation Network-based three-dimensional multiple-user shared environment apparatus and method
US6215498B1 (en) 1998-09-10 2001-04-10 Lionhearth Technologies, Inc. Virtual command post
US7451181B2 (en) 1998-09-24 2008-11-11 Fujitsu Limited Apparatus for controlling a shared screen
US20020033844A1 (en) 1998-10-01 2002-03-21 Levy Kenneth L. Content sensitive connected content
US6473096B1 (en) 1998-10-16 2002-10-29 Fuji Xerox Co., Ltd. Device and method for generating scenario suitable for use as presentation materials
US20020169829A1 (en) 1998-10-30 2002-11-14 Brian Shuster Method, apparatus and system for directing access to content on a computer network
US6342906B1 (en) * 1999-02-02 2002-01-29 International Business Machines Corporation Annotation layer for synchronous collaboration
US20070101282A1 (en) 1999-03-24 2007-05-03 Microsoft Corporation Method and Structure for Implementing Layered Object Windows
US7168048B1 (en) * 1999-03-24 2007-01-23 Microsoft Corporation Method and structure for implementing a layered object windows
US7007235B1 (en) * 1999-04-02 2006-02-28 Massachusetts Institute Of Technology Collaborative agent interaction control and synchronization system
US6580441B2 (en) 1999-04-06 2003-06-17 Vergics Corporation Graph-based visual navigation through store environments
US6909443B1 (en) * 1999-04-06 2005-06-21 Microsoft Corporation Method and apparatus for providing a three-dimensional task gallery computer interface
US20020080195A1 (en) 1999-07-28 2002-06-27 Carlson Samuel Garrett System and method for navigating in a digital information environment
KR19990078775A (en) 1999-08-06 1999-11-05 신유진 Electronic commerce system in the virtual reality space and method thereof
US6480191B1 (en) 1999-09-28 2002-11-12 Ricoh Co., Ltd. Method and apparatus for recording and playback of multidimensional walkthrough narratives
US6735708B2 (en) 1999-10-08 2004-05-11 Dell Usa, L.P. Apparatus and method for a combination personal digital assistant and network portable device
US6772195B1 (en) 1999-10-29 2004-08-03 Electronic Arts, Inc. Chat clusters for a virtual world application
US7080096B1 (en) 1999-11-02 2006-07-18 Matsushita Electric Works, Ltd. Housing space-related commodity sale assisting system, housing space-related commodity sale assisting method, program for assisting housing space-related commodity sale, and computer-readable recorded medium on which program for assisting housing space-related commodity sale is recorded
US7086005B1 (en) 1999-11-29 2006-08-01 Sony Corporation Shared virtual space conversation support system using virtual telephones
US20060184886A1 (en) 1999-12-22 2006-08-17 Urbanpixel Inc. Spatial chat in a multiple browser environment
US6708172B1 (en) 1999-12-22 2004-03-16 Urbanpixel, Inc. Community-based shared multiple browser environment
US6721741B1 (en) 2000-01-24 2004-04-13 Friskit, Inc. Streaming media search system
US20010023450A1 (en) 2000-01-25 2001-09-20 Chu Chang-Nam Authoring apparatus and method for creating multimedia file
US20010034740A1 (en) 2000-02-14 2001-10-25 Andruid Kerne Weighted interactive grid presentation system and method for streaming a multimedia collage
US7240093B1 (en) 2000-02-29 2007-07-03 Microsoft Corporation Use of online messaging to facilitate selection of participants in game play
KR20000030491A (en) 2000-03-03 2000-06-05 천병년 Exhibition system in three dimensional virtual reality space and method thereof
US6572476B2 (en) 2000-04-10 2003-06-03 Konami Corporation Game system and computer readable storage medium
US20020033848A1 (en) 2000-04-21 2002-03-21 Sciammarella Eduardo Agusto System for managing data objects
KR20010100589A (en) 2000-05-04 2001-11-14 신윤철 System for and method of information exchange by recording and playing avatar action
US6784901B1 (en) 2000-05-09 2004-08-31 There Method, system and computer program product for the delivery of a chat message in a 3D multi-user environment
US6714222B1 (en) 2000-06-21 2004-03-30 E2 Home Ab Graphical user interface for communications
US20020026388A1 (en) 2000-08-01 2002-02-28 Chris Roebuck Method of distributing a product, providing incentives to a consumer, and collecting data on the activities of a consumer
US20020019833A1 (en) 2000-08-03 2002-02-14 Takashi Hanamoto Data editing apparatus and method
US20020049814A1 (en) 2000-08-31 2002-04-25 Yoo Hwan Soo System and method for book-marking a specific location in virtual space
US20020052918A1 (en) 2000-09-07 2002-05-02 Junichi Rekimoto Method and system for supporting virtual conversation, and storage medium
US7036082B1 (en) 2000-09-21 2006-04-25 Nortel Networks Limited Controlling communications through a virtual reality environment
US20120124486A1 (en) * 2000-10-10 2012-05-17 Addnclick, Inc. Linking users into live social networking interactions based on the users' actions relative to similar content
US20070198726A1 (en) 2000-10-12 2007-08-23 Peerapp Ltd. Method and system for accelerating receipt of data in a client to client network
JP2002123479A (en) 2000-10-17 2002-04-26 Hitachi Ltd Disk control device and method for controlling its cache
US20030195928A1 (en) 2000-10-17 2003-10-16 Satoru Kamijo System and method for providing reference information to allow chat users to easily select a chat room that fits in with his tastes
JP2002149580A (en) 2000-11-15 2002-05-24 Canon Inc Server equipment, user terminal equipment, virtual space system, method for distributing user information, method for displaying user information, method for distributing and displaying user information and storage medium
US20020112028A1 (en) 2000-11-17 2002-08-15 Colwill Ronald W. Virtual directory
US20020116458A1 (en) 2000-12-14 2002-08-22 Jonathan Bricklin Web-based dating service
US20060117264A1 (en) 2000-12-18 2006-06-01 Nortel Networks Limited Graphical user interface for a virtual team environment
US7516411B2 (en) 2000-12-18 2009-04-07 Nortel Networks Limited Graphical user interface for a virtual team environment
US20020097267A1 (en) 2000-12-26 2002-07-25 Numedeon, Inc. Graphical interactive interface for immersive online communities
JP2002224447A (en) 2001-01-31 2002-08-13 Square Co Ltd Video game device, method for controlling the same, video game program, and recording medium able to read computer in which the program is recorded
US20030009469A1 (en) 2001-03-09 2003-01-09 Microsoft Corporation Managing media objects in a database
US20060161624A1 (en) * 2001-04-13 2006-07-20 Elaine Montgomery Methods and apparatuses for dynamically sharing a portion of a display for application based screen sampling
US20020165922A1 (en) * 2001-04-13 2002-11-07 Songxiang Wei Application based screen sampling
US20020178072A1 (en) 2001-05-24 2002-11-28 International Business Machines Corporation Online shopping mall virtual association
US20020188959A1 (en) 2001-06-12 2002-12-12 Koninklijke Philips Electronics N.V. Parallel and synchronized display of augmented multimedia information
US7188317B1 (en) * 2001-06-13 2007-03-06 Microsoft Corporation Dynamic multiple window display having independently linked windows
US20030043200A1 (en) 2001-08-09 2003-03-06 Urbanpixel Inc Interactive multi-level mapping in a multiple browser environment
US20030037110A1 (en) 2001-08-14 2003-02-20 Fujitsu Limited Method for providing area chat rooms, method for processing area chats on terminal side, computer-readable medium for recording processing program to provide area chat rooms, apparatus for providing area chat rooms, and terminal-side apparatus for use in a system to provide area chat rooms
JP2003067317A (en) 2001-08-30 2003-03-07 Fuji Xerox Co Ltd Message exchange method, computer, management device and recording medium
US20030065558A1 (en) 2001-09-12 2003-04-03 Cameron Shaw Method and apparatus for multi-vendor powered business portal with intelligent service promotion and user profile gathering
US20030077561A1 (en) * 2001-10-18 2003-04-24 Abraham Alsop Web server for printable whiteboards
US7415502B2 (en) 2001-11-16 2008-08-19 Sbc Technology Resources, Inc. Method and system for intelligent routing based on presence detection
KR20030054874A (en) 2001-12-26 2003-07-02 Electronics and Telecommunications Research Institute Client system for embodying 3-dimension virtual reality and method for embodying virtual reality using same
US7533346B2 (en) 2002-01-09 2009-05-12 Dolby Laboratories Licensing Corporation Interactive spatialized audiovisual system
US7058896B2 (en) 2002-01-16 2006-06-06 Silicon Graphics, Inc. System, method and computer program product for intuitive interactive navigation control in virtual environments
US20030149731A1 (en) 2002-01-23 2003-08-07 Sony Corporation Information processing device and method, and program
US20030158953A1 (en) 2002-02-21 2003-08-21 Lal Amrish K. Protocol to fix broken links on the world wide web
US7336779B2 (en) 2002-03-15 2008-02-26 Avaya Technology Corp. Topical dynamic chat
US20040128350A1 (en) * 2002-03-25 2004-07-01 Lou Topfl Methods and systems for real-time virtual conferencing
US20030184579A1 (en) 2002-03-29 2003-10-02 Hong-Jiang Zhang System and method for producing a video skim
US20060136837A1 (en) 2002-04-03 2006-06-22 Microsoft Corporation Application sharing single document sharing
US7418664B2 (en) 2002-04-03 2008-08-26 Microsoft Corporation Application sharing single document sharing
US20030189599A1 (en) * 2002-04-05 2003-10-09 Microsoft Corporation Application sharing user interface improvements
US20030192049A1 (en) 2002-04-09 2003-10-09 Schneider Tina Fay Binding interactive multichannel digital document system
US20140129644A1 (en) * 2002-04-22 2014-05-08 Rosebud LMS, Inc Method and software for enabling n-way collaborative work over a network of computers
US20030197739A1 (en) * 2002-04-23 2003-10-23 Bauer Jason M. Distribution of application windows in a computing device display
US20030215779A1 (en) 2002-05-08 2003-11-20 Anne Dupont Telecommunications virtual simulator
US7293243B1 (en) * 2002-05-22 2007-11-06 Microsoft Corporation Application sharing viewer presentation
US7356563B1 (en) 2002-06-06 2008-04-08 Microsoft Corporation Methods of annotating a collaborative application display
US7640300B2 (en) 2002-06-10 2009-12-29 Microsoft Corporation Presence and notification system for maintaining and communicating information
KR20040011825A (en) 2002-07-30 2004-02-11 김동현 Cyber department store managing system based on the internet and its managing method
US20060041684A1 (en) 2002-09-17 2006-02-23 Bellsouth Intellectual Property Corporation Server-based message protocol translation
CN1678994A (en) 2002-09-30 2005-10-05 微软公司 System and method for providing access to user interface information
US20050154574A1 (en) 2002-10-10 2005-07-14 Kenichi Takemura Information processing system, service providing apparatus and method, information processing apparatus and method, recording medium, and program
US20040078444A1 (en) 2002-10-17 2004-04-22 Malik Dale W. Merging instant messaging (IM) chat sessions
US20110029898A1 (en) 2002-10-17 2011-02-03 At&T Intellectual Property I, L.P. Merging Instant Messaging (IM) Chat Sessions
CN1701568A (en) 2002-11-13 2005-11-23 英特尔公司 Multi-modal web interaction over wireless network
US20100169801A1 (en) 2002-11-21 2010-07-01 Aol Llc Multiple avatar personalities
US7474741B2 (en) 2003-01-20 2009-01-06 Avaya Inc. Messaging advise in presence-aware networks
US20040179038A1 (en) 2003-03-03 2004-09-16 Blattner Patrick D. Reactive avatars
JP2004272579A (en) 2003-03-07 2004-09-30 Toshiba Corp Online service provision system, communication management device and program therefor, and communication management method
US20040183827A1 (en) * 2003-03-17 2004-09-23 Daniel Putterman Methods and apparatus for implementing a remote application over a network
US20040210847A1 (en) * 2003-04-17 2004-10-21 Supersonic Aerospace International, Llc System and method for customizing multiple windows of information on a display
US20040268451A1 (en) 2003-04-25 2004-12-30 Apple Computer, Inc. Graphical user interface for browsing, searching and presenting media items
US7260784B2 (en) * 2003-05-07 2007-08-21 International Business Machines Corporation Display data mapping method, system, and program product
US20050021624A1 (en) 2003-05-16 2005-01-27 Michael Herf Networked chat and media sharing systems and methods
US7467356B2 (en) 2003-07-25 2008-12-16 Three-B International Limited Graphical user interface for 3d virtual display browser using virtual display windows
US20050086612A1 (en) 2003-07-25 2005-04-21 David Gettman Graphical user interface for an information display system
US20050058094A1 (en) 2003-09-16 2005-03-17 Mihal Lazaridis Method for creating a peer-to-peer immediate messaging solution without using an instant messaging server
JP2007506309A (en) 2003-09-16 2007-03-15 リサーチ イン モーション リミテッド A method for creating a peer-to-peer rapid messaging solution without using an instant messaging server
US20140213309A1 (en) 2003-09-16 2014-07-31 Blackberry Limited Method for Creating a Peer-to-Peer Immediate Messaging Solution Without Using an Instant Messaging Server
US20130109418A1 (en) 2003-09-16 2013-05-02 Research In Motion Limited Method for creating a peer-to-peer immediate messaging solution without using an instant messaging server
US20050075885A1 (en) 2003-09-25 2005-04-07 Danieli Damon V. Visual indication of current voice speaker
US7503006B2 (en) 2003-09-25 2009-03-10 Microsoft Corporation Visual indication of current voice speaker
US7813488B2 (en) 2003-09-29 2010-10-12 Siemens Enterprise Communications, Inc. System and method for providing information regarding an identity's media availability
US20050080866A1 (en) 2003-10-14 2005-04-14 Kent Larry G. Selectively displaying time indications for instant messaging (IM) messages
US20050088452A1 (en) * 2003-10-23 2005-04-28 Scott Hanggie Dynamic window anatomy
US20050144247A1 (en) 2003-12-09 2005-06-30 Christensen James E. Method and system for voice on demand private message chat
US7305438B2 (en) 2003-12-09 2007-12-04 International Business Machines Corporation Method and system for voice on demand private message chat
US20050132299A1 (en) * 2003-12-15 2005-06-16 Dan Jones Systems and methods for improved application sharing in a multimedia collaboration session
JP2005182331A (en) 2003-12-18 2005-07-07 Sony Corp Information processing system, service providing device and method, information processor and information processing method, program and recording medium
US20050138570A1 (en) 2003-12-22 2005-06-23 Palo Alto Research Center, Incorporated Methods and systems for supporting presentation tools using zoomable user interface
US20050215252A1 (en) 2004-03-26 2005-09-29 Samsung Electronics Co., Ltd. System and method for providing a messenger service capable of changing messenger status information based on a schedule
JP2005286749A (en) 2004-03-30 2005-10-13 Matsushita Electric Ind Co Ltd Video image decoding device and video image transmission system using it
US7680885B2 (en) 2004-04-15 2010-03-16 Citrix Systems, Inc. Methods and apparatus for synchronization of data set representations in a bandwidth-adaptive manner
US20050232168A1 (en) 2004-04-15 2005-10-20 Citrix Systems, Inc. Methods and apparatus for synchronization of data set representations in a bandwidth-adaptive manner
US20060031779A1 (en) 2004-04-15 2006-02-09 Citrix Systems, Inc. Selectively sharing screen data
US7908663B2 (en) 2004-04-20 2011-03-15 Microsoft Corporation Abstractions and automation for enhanced sharing and collaboration
US20050261980A1 (en) 2004-05-22 2005-11-24 Altaf Hadi System and method for delivering real time remote buying, selling, meeting, and interacting in a virtual reality environment
US20060005187A1 (en) * 2004-06-30 2006-01-05 Microsoft Corporation Systems and methods for integrating application windows in a virtual machine environment
US20110196930A1 (en) 2004-09-20 2011-08-11 Jitendra Chawla Methods and apparatuses for reporting based on attention of a user during a collaboration session
KR20060060788A (en) 2004-11-30 2006-06-05 전경열 Virtual reality system using character and realizing method of the same
US20060167996A1 (en) * 2005-01-13 2006-07-27 Orsolini Garry S System and method for enabling electronic presentations
US20060230156A1 (en) * 2005-04-06 2006-10-12 Ericom Software Ltd. Seamless windows functionality to remote desktop sessions regarding z-order
US20060248159A1 (en) 2005-04-28 2006-11-02 International Business Machines Corporation Method and apparatus for presenting navigable data center information in virtual reality using leading edge rendering engines
US20060271877A1 (en) * 2005-05-25 2006-11-30 Citrix Systems, Inc. A system and methods for selective sharing of an application window
WO2006127429A3 (en) 2005-05-26 2008-01-10 Citrix Systems Inc Selectively sharing screen data
US20080221998A1 (en) 2005-06-24 2008-09-11 Disney Enterprises, Inc. Participant interaction with entertainment in real and virtual environments
US20060293103A1 (en) 2005-06-24 2006-12-28 Seth Mendelsohn Participant interaction with entertainment in real and virtual environments
US20070011232A1 (en) * 2005-07-06 2007-01-11 Microsoft Corporation User interface for starting presentations in a meeting
US20080101561A1 (en) 2005-07-08 2008-05-01 Nhn Corporation Messenger notification system and method using synchronization server
US7734692B1 (en) 2005-07-22 2010-06-08 Oracle America, Inc. Network collaboration system with private voice chat
US20070047700A1 (en) 2005-08-29 2007-03-01 Avaya Technology Corp. Managing held telephone calls from a remote telecommunications terminal
US20070061399A1 (en) * 2005-09-12 2007-03-15 Microsoft Corporation Filtering obscured data from a remote client display
US20070070066A1 (en) 2005-09-13 2007-03-29 Bakhash E E System and method for providing three-dimensional graphical user interface
US20110029907A1 (en) 2005-09-13 2011-02-03 Bakhash E Eddie System and method for providing three-dimensional graphical user interface
US20070136686A1 (en) 2005-12-08 2007-06-14 International Business Machines Corporation Pop-up repelling frame for use in screen sharing
US20070156908A1 (en) 2005-12-30 2007-07-05 Nokia Corporation Network entity, method and computer program product for effectuating a conference session
US20070160129A1 (en) 2006-01-10 2007-07-12 Tatsuro Fujisawa Video decoding apparatus and video decoding method
US20070162432A1 (en) 2006-01-10 2007-07-12 Aol Llc Searching Recent Content Publication Activity
JP2007184871A (en) 2006-01-10 2007-07-19 Toshiba Corp Moving picture decoding apparatus and moving picture decoding method
US20070184855A1 (en) 2006-02-03 2007-08-09 Research In Motion Limited Visual representation of contact location
US20070192427A1 (en) 2006-02-16 2007-08-16 Viktors Berstis Ease of use feature for audio communications within chat conferences
US20070204047A1 (en) * 2006-02-27 2007-08-30 Microsoft Corporation Shared telepointer
US20080086696A1 (en) 2006-03-03 2008-04-10 Cadcorporation.Com Inc. System and Method for Using Virtual Environments
JP2007251380A (en) 2006-03-14 2007-09-27 Canon Inc Storage device, processing method of storage device, and processing method
US20070220568A1 (en) 2006-03-14 2007-09-20 Canon Kabushiki Kaisha Storage apparatus and method for processing the same
US20070226357A1 (en) 2006-03-22 2007-09-27 Mcmurry Kathleen A Providing an Aggregate Reachability Status
US20070233785A1 (en) 2006-03-30 2007-10-04 International Business Machines Corporation Communicating using collaboration spaces
US20100221693A1 (en) 2006-03-31 2010-09-02 Rakesh Kumar Gupta Instant Messaging For A Virtual Learning Community
US20070234212A1 (en) 2006-03-31 2007-10-04 Microsoft Corporation Selective window exclusion for captured content
KR20070105088A (en) 2006-04-25 2007-10-30 함현주 Shopping mall system based character and management method thereof
US20080052373A1 (en) 2006-05-01 2008-02-28 Sms.Ac Systems and methods for a community-based user interface
US20070279484A1 (en) 2006-05-31 2007-12-06 Mike Derocher User interface for a video teleconference
US20080091692A1 (en) 2006-06-09 2008-04-17 Christopher Keith Information collection in multi-participant online communities
KR20090016692A (en) 2006-06-16 2009-02-17 Lucent Technologies Inc Methods, devices and architectures for establishing peer-to-peer sessions
US20070291706A1 (en) 2006-06-16 2007-12-20 Miller Scott C Methods, devices and architectures for establishing peer-to-peer sessions
US20070291034A1 (en) 2006-06-20 2007-12-20 Dones Nelson C System for presenting a navigable virtual subway system, and method for operating and using the same
US20070299778A1 (en) 2006-06-22 2007-12-27 Microsoft Corporation Local peer-to-peer digital content distribution
WO2008008806A2 (en) 2006-07-11 2008-01-17 Igor Khalatian One-click universal screen sharing
US7616624B2 (en) 2006-07-20 2009-11-10 Avaya Inc. Determining user availability based on the expected duration of a new event
US7680098B2 (en) 2006-07-20 2010-03-16 Avaya Inc. Determining group availability on different communication media
US20080021949A1 (en) 2006-07-20 2008-01-24 Avaya Technology Llc Determining User Availability Based on a Past Event
US20080019285A1 (en) 2006-07-20 2008-01-24 Avaya Technology Llc Rule-based System for Determining User Availability
US7680480B2 (en) 2006-07-20 2010-03-16 Avaya Inc. Determining user availability based on a past event
US20080033941A1 (en) 2006-08-07 2008-02-07 Dale Parrish Verified network identity with authenticated biographical information
US7958453B1 (en) 2006-09-29 2011-06-07 Len Bou Taing System and method for real-time, multi-user, interactive and collaborative environments on the web
US20080098121A1 (en) 2006-10-23 2008-04-24 Nec (China) Co., Ltd. P2p sip enabled multimedia network communication system
JP2008182670A (en) 2006-10-23 2008-08-07 Nec (China) Co Ltd P2p sip enabled network communication system, access device, user terminal registration method and session establishment method in the network communication system
US20080133580A1 (en) 2006-11-30 2008-06-05 James Andrew Wanless Method and system for providing automated real-time contact information
US7765259B2 (en) 2006-12-05 2010-07-27 Avaya Inc. System and method for aggregation of user conversations and visualizing personal communications map
US20080163090A1 (en) 2006-12-28 2008-07-03 Yahoo! Inc. Interface overlay
US20080168154A1 (en) 2007-01-05 2008-07-10 Yahoo! Inc. Simultaneous sharing communication interface
US20090132943A1 (en) 2007-02-13 2009-05-21 Claudia Juliana Minsky Method and System for Creating a Multifunctional Collage Useable for Client/Server Communication
US20080209075A1 (en) 2007-02-22 2008-08-28 Yahoo! Inc. Synchronous delivery of media content and real-time communication for online dating
EP1964597A1 (en) 2007-03-01 2008-09-03 Sony Computer Entertainment Europe Ltd. Apparatus and method of modifying an online environment
EP1964597B1 (en) 2007-03-01 2013-02-20 Sony Computer Entertainment Europe Ltd. Apparatus and method of modifying an online environment
WO2008106196A1 (en) 2007-03-01 2008-09-04 Sony Computer Entertainment America Inc. Virtual world avatar control, interactivity and communication interactive messaging
US7979574B2 (en) 2007-03-01 2011-07-12 Sony Computer Entertainment America Llc System and method for routing communications among real and virtual communication devices
JP2010535363A (en) 2007-03-01 2010-11-18 Sony Computer Entertainment America LLC Virtual world avatar control, interactivity and communication interactive messaging
US20080244458A1 (en) 2007-03-30 2008-10-02 Microsoft Corporation Remoting of Windows Presentation Framework based Applications in a Non-Composed Desktop
US20080250115A1 (en) 2007-04-04 2008-10-09 Vaidy Iyer Enterprise Notification System
US20080252637A1 (en) 2007-04-14 2008-10-16 Philipp Christian Berndt Virtual reality-based teleconferencing
US20080263460A1 (en) 2007-04-20 2008-10-23 Utbk, Inc. Methods and Systems to Connect People for Virtual Meeting in Virtual Reality
US20080262910A1 (en) 2007-04-20 2008-10-23 Utbk, Inc. Methods and Systems to Connect People via Virtual Reality for Real Time Communications
US20080262911A1 (en) 2007-04-20 2008-10-23 Utbk, Inc. Methods and Systems to Search in Virtual Reality for Real Time Communications
US7840668B1 (en) 2007-05-24 2010-11-23 Avaya Inc. Method and apparatus for managing communication between participants in a virtual environment
US20080301557A1 (en) 2007-06-04 2008-12-04 Igor Kotlyar Systems, methods and software products for online dating
WO2009000028A1 (en) 2007-06-22 2008-12-31 Global Coordinate Software Limited Virtual 3d environments
US20090018912A1 (en) 2007-07-12 2009-01-15 Utbk, Inc. Systems and Methods to Provide Communication Connections via Partners
US20090079816A1 (en) 2007-09-24 2009-03-26 Fuji Xerox Co., Ltd. Method and system for modifying non-verbal behavior for social appropriateness in video conferencing and other computer mediated communications
US20090089685A1 (en) 2007-09-28 2009-04-02 Mordecai Nicole Y System and Method of Communicating Between A Virtual World and Real World
US20090089364A1 (en) 2007-10-02 2009-04-02 Hamilton Ii Rick A Arrangements for interactivity between a virtual universe and the world wide web
US20090096810A1 (en) * 2007-10-11 2009-04-16 Green Brian D Method for selectively remoting windows
US8621079B2 (en) 2007-10-24 2013-12-31 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US7844724B2 (en) 2007-10-24 2010-11-30 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US20110274104A1 (en) 2007-10-24 2011-11-10 Social Communications Company Virtual area based telephony communications
US20110302509A1 (en) 2007-10-24 2011-12-08 Social Communications Company Promoting communicant interactions in a network communications environment
US20090113066A1 (en) 2007-10-24 2009-04-30 David Van Wie Automated real-time data stream switching in a shared virtual area communication environment
US7769806B2 (en) 2007-10-24 2010-08-03 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US8930472B2 (en) 2007-10-24 2015-01-06 Social Communications Company Promoting communicant interactions in a network communications environment
US8578044B2 (en) 2007-10-24 2013-11-05 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US20090112997A1 (en) 2007-10-25 2009-04-30 Cisco Technology, Inc. Utilizing Presence Data Associated with Web Item
US20090113314A1 (en) 2007-10-30 2009-04-30 Dawson Christopher J Location and placement of avatars in virtual worlds
US20090128567A1 (en) 2007-11-15 2009-05-21 Brian Mark Shuster Multi-instance, multi-user animation with coordinated chat
US20090193077A1 (en) 2008-01-25 2009-07-30 Hiroshi Horii Communication control
CN101499080A (en) 2008-02-01 2009-08-05 网秦无限(北京)科技有限公司 Method and system for fast acquiring information service on mobile terminal
US20090199095A1 (en) 2008-02-01 2009-08-06 International Business Machines Corporation Avatar cloning in a virtual world
US20090222742A1 (en) 2008-03-03 2009-09-03 Cisco Technology, Inc. Context sensitive collaboration environment
US20090235180A1 (en) * 2008-03-17 2009-09-17 Jun Feng Liu Method and Apparatus for Restoring an Occluded Window in Application Sharing Software
US20090241037A1 (en) 2008-03-18 2009-09-24 Nortel Networks Limited Inclusion of Web Content in a Virtual Environment
US20090247196A1 (en) 2008-03-31 2009-10-01 LG Electronics Inc. Mobile terminal for performing instant messaging service
US20090251457A1 (en) 2008-04-03 2009-10-08 Cisco Technology, Inc. Reactive virtual environment
US20090254840A1 (en) 2008-04-04 2009-10-08 Yahoo! Inc. Local map chat
US20090288007A1 (en) 2008-04-05 2009-11-19 Social Communications Company Spatial interfaces for realtime networked communications
US8191001B2 (en) 2008-04-05 2012-05-29 Social Communications Company Shared virtual area communication environment based apparatus and methods
US20120179672A1 (en) 2008-04-05 2012-07-12 Social Communications Company Shared virtual area communication environment based apparatus and methods
US20120246582A1 (en) 2008-04-05 2012-09-27 Social Communications Company Interfacing with a spatial virtual communications environment
US20090254843A1 (en) 2008-04-05 2009-10-08 Social Communications Company Shared virtual area communication environment based apparatus and methods
US20090259948A1 (en) 2008-04-15 2009-10-15 Hamilton Ii Rick A Surrogate avatar control in a virtual universe
US20090286605A1 (en) 2008-05-19 2009-11-19 Hamilton Ii Rick A Event determination in a virtual universe
US20090300521A1 (en) 2008-05-30 2009-12-03 International Business Machines Corporation Apparatus for navigation and interaction in a virtual meeting place
US20090307620A1 (en) 2008-06-10 2009-12-10 Hamilton Ii Rick A System for concurrently managing multiple avatars
US20100058202A1 (en) 2008-08-28 2010-03-04 Mohamed Rostom Method system and program product for providing enabling an interactive and social search engine
US20100058229A1 (en) * 2008-09-02 2010-03-04 Palm, Inc. Compositing Windowing System
US20100077034A1 (en) 2008-09-22 2010-03-25 International Business Machines Corporation Modifying environmental chat distance based on avatar population density in an area of a virtual world
US20100131868A1 (en) * 2008-11-26 2010-05-27 Cisco Technology, Inc. Limitedly sharing application windows in application sharing sessions
US20100146085A1 (en) 2008-12-05 2010-06-10 Social Communications Company Realtime kernel
US20100146118A1 (en) 2008-12-05 2010-06-10 Social Communications Company Managing interactions in a network communications environment
US20100274848A1 (en) 2008-12-05 2010-10-28 Social Communications Company Managing network communications between network nodes and stream transport protocol
US20100162121A1 (en) 2008-12-22 2010-06-24 Nortel Networks Limited Dynamic customization of a virtual world
US20100164956A1 (en) 2008-12-28 2010-07-01 Nortel Networks Limited Method and Apparatus for Monitoring User Attention with a Computer-Generated Virtual Environment
US20100169796A1 (en) 2008-12-28 2010-07-01 Nortel Networks Limited Visual Indication of Audio Context in a Computer-Generated Virtual Environment
US20100169837A1 (en) 2008-12-29 2010-07-01 Nortel Networks Limited Providing Web Content in the Context of a Virtual Environment
US20100169799A1 (en) 2008-12-30 2010-07-01 Nortel Networks Limited Method and Apparatus for Enabling Presentations to Large Numbers of Users in a Virtual Environment
US20120254858A1 (en) 2009-01-15 2012-10-04 Social Communications Company Creating virtual areas for realtime communications
US20120216131A1 (en) 2009-01-15 2012-08-23 Social Communications Company Persistent network resource and virtual area associations for realtime collaboration
US20100211880A1 (en) 2009-02-13 2010-08-19 International Business Machines Corporation Virtual world viewer
US20100228560A1 (en) 2009-03-04 2010-09-09 Avaya Inc. Predictive buddy list-reorganization based on call history information
US20100228547A1 (en) 2009-03-06 2010-09-09 At&T Intellectual Property I, Lp. Method and apparatus for analyzing discussion regarding media programs
US20100235501A1 (en) 2009-03-16 2010-09-16 Avaya Inc. Advanced Availability Detection
US20100241432A1 (en) 2009-03-17 2010-09-23 Avaya Inc. Providing descriptions of visually presented information to video teleconference participants who are not video-enabled
US20100246570A1 (en) 2009-03-24 2010-09-30 Avaya Inc. Communications session preparation method and apparatus
US20100251142A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for persistent multimedia conferencing services
US20100251177A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for graphically managing a communication session with a context based contact set
US20100251127A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for managing trusted relationships in communication sessions using a graphical metaphor
US20100246800A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for managing a contact center with a graphical call connection metaphor
US20100251119A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for managing incoming requests for a communication session using a graphical connection metaphor
US20100246571A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for managing multiple concurrent communication sessions using a graphical call connection metaphor
US20100251158A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for graphically managing communication sessions
US20100251124A1 (en) 2009-03-30 2010-09-30 Avaya Inc. System and method for mode-neutral communications with a widget-based communications metaphor
EP2239930A1 (en) 2009-03-30 2010-10-13 Avaya Inc. System and method for managing a contact center with a graphical call connection metaphor
EP2237537A1 (en) 2009-03-30 2010-10-06 Avaya Inc. System and method for graphically managing a communication session with a context based contact set
US20100257450A1 (en) 2009-04-03 2010-10-07 Social Communications Company Application sharing
US20100262550A1 (en) 2009-04-08 2010-10-14 Avaya Inc. Inter-corporate collaboration overlay solution for professional social networks
US20100287274A1 (en) 2009-05-08 2010-11-11 Canon Kabushiki Kaisha Efficient network utilization using multiple physical interfaces
US20100322395A1 (en) 2009-06-22 2010-12-23 Avaya Inc. Unified communications appliance
US20110106662A1 (en) 2009-10-30 2011-05-05 Matthew Stinchcomb System and method for performing interactive online shopping
US20110169927A1 (en) 2010-01-13 2011-07-14 Coco Studios Content Presentation in a Three Dimensional Environment
US20110231781A1 (en) 2010-03-19 2011-09-22 International Business Machines Corporation System and method for virtual object sharing and management in virtual worlds
US20120066306A1 (en) 2010-09-11 2012-03-15 Social Communications Company Relationship based presence indicating in virtual area contexts
US20130073978A1 (en) 2011-09-16 2013-03-21 Social Communications Company Capabilities based management of virtual areas

Non-Patent Citations (23)

* Cited by examiner, † Cited by third party
Title
"Microsoft Computer Dictionary", Third Edition, 1997. Definitions for"audit," "auditing," and "audit trail" (p. 36), "log" (p. 288, "security log" (p. 426), "user account," and "user profile" (p. 488). *
01-Allowance Notice dated Oct. 9, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
02-IDS dated Sep. 20, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
03-RCE dated Sep. 20, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
04-Allowance Notice dated Aug. 28, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
05-Amendment dated Mar. 18, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
06-Non-final Office Action dated Sep. 7, 2011 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
07-Non-final Office Action dated Dec. 27, 2011 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.
11-Final Office Action dated Sep. 10, 2012 in related U.S. Appl. No. 12/509,658, filed Jul. 27, 2009.
12-Amendment dated May 30, 2012 in related U.S. Appl. No. 12/509,658, filed Jul. 27, 2009.
13-Non-Final Office Action dated Mar. 1, 2012 in related U.S. Appl. No. 12/509,658, filed Jul. 27, 2009.
14-Pre-Appeal Brief Request for Review dated Nov. 10, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
15-Advisory Action dated Oct. 16, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
16-Response After Final dated Sep. 14, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
17-Final Rejection dated Sep. 10, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
18-Amendment dated Jul. 20, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
19-Non-Final Office Action dated Apr. 22, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
20-Notice of Allowance dated Jun. 20, 2016 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
21-Appeal Brief dated Dec. 8, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
22-Pre-Brief Appeal Conference Decision dated Nov. 27, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.
International Search Report and Written Opinion issued in counterpart International Application No. PCT/US2012/030766 (mailed Oct. 19, 2012).
International Search Report and Written Opinion received in counterpart International Application No. PCT/US2010/028088 (mailed Oct. 27, 2010).
International Search Report and Written Opinion issued on Aug. 13, 2010, in counterpart PCT Application No. PCT/US2010/020596.

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11126390B2 (en) * 2016-01-04 2021-09-21 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. System and method for controlling coordination between medical devices, medical workstation and communication device
US20180088788A1 (en) * 2016-09-26 2018-03-29 Microsoft Technology Licensing, Llc Intelligent navigation via a transient user interface control
US10496734B2 (en) * 2016-09-26 2019-12-03 Microsoft Technology Licensing, Llc Intelligent navigation via a transient user interface control
US20200065360A1 (en) * 2016-09-26 2020-02-27 Microsoft Technology Licensing, Llc Intelligent navigation via a transient user interface control
US11163935B2 (en) * 2016-09-26 2021-11-02 Microsoft Technology Licensing, Llc Intelligent navigation via a transient user interface control
US10866929B2 (en) * 2018-07-20 2020-12-15 Slack Technologies, Inc. Group-based communication interface with subsidiary file communications
US11537564B2 (en) 2018-07-20 2022-12-27 Salesforce, Inc. Group-based communication interface with subsidiary file communications

Also Published As

Publication number Publication date
WO2010114724A2 (en) 2010-10-07
EP2414948A2 (en) 2012-02-08
US8407605B2 (en) 2013-03-26
WO2010114724A3 (en) 2011-01-13
US20100257450A1 (en) 2010-10-07
KR20110134940A (en) 2011-12-15
EP2645267A1 (en) 2013-10-02
CN102804161A (en) 2012-11-28
IL215387A0 (en) 2011-12-29
EP2414948A4 (en) 2012-12-05

Similar Documents

Publication Publication Date Title
USRE46309E1 (en) Application sharing
US9411489B2 (en) Interfacing with a spatial virtual communication environment
US11785056B2 (en) Web browser interface for spatial communication environments
US8930472B2 (en) Promoting communicant interactions in a network communications environment
US9813463B2 (en) Phoning into virtual communication environments
US11023092B2 (en) Shared virtual area communication environment based apparatus and methods
US20090288007A1 (en) Spatial interfaces for realtime networked communications
US20140282074A1 (en) System and method for cooperative sharing of resources of an environment
WO2012135231A2 (en) Creating virtual areas for realtime communications
US20230339816A1 (en) Visual Communications

Legal Events

Date Code Title Description
AS Assignment

Owner name: SOCIAL COMMUNICATIONS COMPANY, OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GO, ALEXANDER SAY;PETTER, VLADIMIR;SIGNING DATES FROM 20090331 TO 20090413;REEL/FRAME:036063/0414

AS Assignment

Owner name: SOCOCO, INC., MASSACHUSETTS

Free format text: CHANGE OF NAME;ASSIGNOR:SOCIAL COMMUNICATIONS COMPANY;REEL/FRAME:049216/0667

Effective date: 20151019

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8