US20060109274A1 - Client/server-based animation software, systems and methods - Google Patents

Info

Publication number
US20060109274A1
US20060109274A1 (application number US11/262,492)
Authority
US
United States
Prior art keywords
animation
processor
instructions executable
client computer
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/262,492
Inventor
Donald Alvarez
Mark Parry
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Accelerated Pictures Inc
Original Assignee
Accelerated Pictures LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Accelerated Pictures LLC filed Critical Accelerated Pictures LLC
Priority to US11/262,492
Assigned to ACCELERATED PICTURES, LLC. Assignment of assignors' interest (see document for details). Assignors: PARRY, MARK; ALVAREZ, DONALD
Publication of US20060109274A1
Assigned to ACCELERATED PICTURES, INC. Assignment of assignors' interest (see document for details). Assignor: ACCELERATED PICTURES, LLC
Current legal status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 - Numerical control [NC] characterised by using manual input [MDI] or by using a control panel, e.g. controlling functions with the panel; characterised by control panel details, by setting parameters
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35438 - Joystick
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35448 - Datasuit, arm sleeve, actor, operator wears datasuit and generates motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/16 - Indexing scheme for image data processing or generation, in general involving adaptation to the client's capabilities
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2213/00 - Indexing scheme for animation
    • G06T2213/08 - Animation software package

Definitions

  • the present disclosure may be related to the following commonly assigned applications/patents:
  • the present invention relates to the field of animation and filmmaking in general and, in particular, to software, systems and methods for creating and/or editing animations and/or films, including any type of film-based and/or digital still and/or video image production.
  • a typical network might comprise a central server system S with version tracking software 100 , which stores the animation data files in bulk storage 101 .
  • a user accesses the server, checks out the relevant data files, and alters the animation data files with animation software 111 and actuators 115 (here represented as a keyboard and mouse) resident in the PC.
  • the artist replays the altered animation data locally through rendering software 112 resident on the PC, viewing the animation data movements at the PC's local display 114 .
  • the user often will check in the altered data files as a new version added to bulk storage 101 and tracked by version tracking software 100 .
  • client-based animation systems make the management of data (including version control, security of intellectual property, etc.) quite cumbersome.
  • incompatible variations can be introduced. These incompatible variations are a direct result of the local temporary storage of the modified data.
  • the presence of incompatible variations can present severe complications.
  • modern animation includes the use of expensive tools and processes to generate models of the three-dimensional shapes describing objects or characters used in the animation. Local, unsupervised, and inconsistent modification of the models in checked-out animation data can occur. Further, if a model is modified during the animation process, previously recorded animation data must, in the usual case, be completely reworked.
  • both the animation software and the work product of the animators are subject to a high risk of piracy.
  • by providing the suite of animation software at the local PC 110 and/or allowing a user to obtain all relevant files related to an animation, the producer of a movie exposes these assets to unauthorized copying.
  • Model: A three-dimensional shape, usually described in terms of coordinates and mathematical data, describing the shape of any character or object. Examples of characters include actors, animals, or other beings whose animation can tell or portray the story.
  • the model is typically provided in a neutral pose (known in the art as a "da Vinci pose"), in which the model is shown standing with limbs spread apart and head looking forward. It is understood in the art that, in many situations, the generation of the model can be extraordinarily expensive.
  • the model is generated, scanned, or otherwise digitized, with the spatial coordinates of numerous points on its surface recorded; a virtual representation of the model can then be reconstructed from the data.
  • the model may include connectivity data, such that the collection of points defining the model can be treated as the vertices of polygonal approximations of the surface shape of the model.
  • the model may include various mathematical smoothing and/or interpolation algorithms. Such models can include collections of spatial points ranging from hundreds of points to hundreds of thousands or more points.
  • Render: To make a model viewable as an image, such as by applying textures to a model and/or imaging the model using a real or virtual camera or by photographing a real object.
  • Rig: A deformation engine that specifies how movement of the model should translate into animation of a character based on the model. This is the software and data used to deform or transform the "neutral pose" of the model into a specific "active pose" variation of the model. Taking the example of the human figure, the rig would impart to the model the skeletal joint movement, including shoulder, elbow, hand, finger, neck, head, hip, knee, and foot movement and the like. By having animation software manipulate a rig incorporated into a model, animated movement of the model is achieved.
  • Texture: In the usual modern case, one or more textures are mapped onto the surface of a model to provide a digital image portrayed by the model as manipulated by the rig.
  • Virtual Character: The model as deformed by the rig and presented by the texture in animation.
  • Virtual Set: The vicinity or fiducial reference point and coordinate system with respect to which the location of any element may be specified.
  • Prop: An object on the virtual set, usually comprising a model without a rig.
  • Scene: A virtual set, one or more props, and one or more virtual characters.
  • Action: Animation associated with a scene. It should be noted that, upon editing of the final animation story, portions of an action may be distributed without regard to time, for example at the beginning, middle, and end of the animation story.
  • Editing: The process by which portions of actions are assembled to construct a story, narrative, or other product.
  • Actuator: A device, such as a mouse or keyboard on a personal computer, enabling input to the animation software.
  • This term includes our novel adaptation of a "game controller" for imparting animation to characters.
  • a client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects.
  • This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).
  • An exemplary system includes an animation client computer, which may comprise a first processor, a display device, at least one input device, and/or animation client software.
  • the system may further include an animation server computer comprising a second processor and animation server software.
  • the animation client software may comprise instructions executable by the first processor to accept a set of input data from the at least one input device.
  • the set of input data may indicate a desired position for an animated object, which might comprise a set of one or more polygons and/or a set of one or more textures to be applied to the set of one or more polygons.
  • the animation client software might further comprise instructions executable by the first processor to transmit the set of input data for reception by the animation server computer.
  • the animation server software can comprise instructions executable by the second processor to receive the set of input data from the animation client computer and/or to process the input data to determine the desired position of the animated object.
  • the animation server software may also comprise additional instructions executable by the second processor to calculate a set of joint rotations defining the desired position of the animated object and/or to transmit the set of joint rotations for reception by the animation client computer.
  • the animation client software may comprise further instructions executable by the first processor to receive the set of joint rotations defining the position of the animated object and/or to calculate (perhaps based on the set of joint rotations) a set of positions for the set of one or more polygons. There may also be additional instructions executable by the first processor to apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position.
  • the rendered animated object then may be displayed by the animation client, and/or the set of joint rotations may be stored at a data store associated with the animation server computer.
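  • By way of illustration only, the following minimal Python sketch models the exchange described above: the client accepts input indicating a desired position, transmits it to the server, the server computes and stores a set of joint rotations, and the client (which holds the polygons and textures locally) renders the result. All names here are hypothetical, and the one-line solve merely stands in for a real inverse-kinematics computation.

        from dataclasses import dataclass
        from typing import Dict, List, Tuple

        @dataclass
        class InputData:
            """Input from an input device, indicating a desired position for an animated object."""
            object_id: str
            desired_position: Tuple[float, float, float]

        @dataclass
        class JointRotations:
            """Position data computed by the server: one rotation (in degrees) per named joint."""
            object_id: str
            rotations: Dict[str, float]

        class AnimationServer:
            def __init__(self) -> None:
                self.stored: List[JointRotations] = []          # data store associated with the server

            def handle_input(self, data: InputData) -> JointRotations:
                # Stand-in for the rig solve that maps a desired position to joint rotations.
                result = JointRotations(data.object_id, {"elbow": data.desired_position[0] % 180.0})
                self.stored.append(result)                      # joint rotations are kept server-side
                return result

        class AnimationClient:
            def request_pose(self, server: AnimationServer, data: InputData) -> None:
                rotations = server.handle_input(data)           # transmit input, receive joint rotations
                # A real client would now deform its local polygons, apply textures, and display.
                print(f"render {rotations.object_id} with joints {rotations.rotations}")

        AnimationClient().request_pose(AnimationServer(), InputData("walker", (30.0, 0.0, 0.0)))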
  • the animation client computer may be a plurality of animation client computers including a first animation client computer and a second animation client computer.
  • the first animation client computer might comprise the input device(s), while the second animation client computer might comprise the display device(s).
  • the animation server computer then, might receive the set of input data from the first animation client computer and/or transmit the set of joint rotations for reception by the second animation client computer, which might be configured to receive the set of joint rotations, calculate a set of positions for the set of one or more polygons based on the set of joint rotations, apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position, and/or display on the display device the rendered animated object.
  • the animation client software may comprise instructions executable by the first processor to accept a set of input data (which might indicate a desired position for an object) from the at least one input device and/or instructions executable by the first processor to transmit the set of input data for reception by the animation server computer.
  • the animation server software comprises instructions executable by the second processor to receive the set of input data from the animation client computer and/or to transmit for reception by the animation client computer a set of position data, perhaps based on the set of input data received from the animation client computer.
  • the animation client software might further comprise instructions executable by the first processor to receive the set of position data from the animation server computer and/or to place the object in the desired position, based at least in part on the set of position data.
  • the object can be a virtual object (including without limitation a virtual camera, a virtual light source, etc.) and/or a physical object (including without limitation a device, such as a camera, a light source, etc., in communication with the animation client computer, and/or any other appropriate object).
  • the object may be an animated character, which might comprise a set of polygons and at least one texture, such that placing the object in the desired position comprises rendering the animated character in the desired position.
  • the set of position data might comprise data (such as joint rotations, joint angles, etc.) defining a position of the object and/or defining a deformation of a rig describing the object.
  • the set of position data might comprise a position and/or orientation of a real or virtual camera; the position of the object in the scene may be affected by the position and/or orientation of the real or virtual camera, such that the placement of the object depends on the position and/or orientation of the real or virtual camera.
  • the animation server has an associated data store configured to hold a set of one or more object definition files for the animated object, the set of one or more object definition files collectively specifying a set of polygons and textures that define the object (e.g., the object definition files may comprise one or more textures associated with the object).
  • the animation client software may comprise instructions executable by the first processor to download from the animation server computer at least a portion of the set of one or more object definition files necessary to render the object. In some cases, however, the downloaded portion of the set of one or more object definition files may be insufficient to independently recreate the animated object without additional data, which might be resident on the animation server computer. Similarly, in some configurations, the animation client computer might be unable to upload to the animation server computer any modifications of the at least a portion of the set of one or more object definition files.
  • the animation client software comprises further instructions executable by the first processor to modify the object definition files to produce a set of modified object definition files.
  • the animation server software comprises instructions executable by the second processor to receive the set of modified object definition files and/or to track changes to the set of object definition files.
  • the animation server computer may be configured to identify a user of the animation client computer and/or to determine whether to accept the set of modified object definition files, perhaps based on an identity of the user of the animation client computer.
  • the animation server computer may be configured to distribute the set of modified object definition files to a set of animation client computers comprising at least a second animation client computer.
  • the data store is configured to hold a plurality of sets of one or more object definition files for a plurality of animated objects.
  • the animation server software might comprise further instructions executable by the second processor to determine whether to provide to the animation client computer one or more of the sets of the object definition files, based on, for example, a set of payment or billing information and/or an identity of a user of the animation client computer.
  • the animation server software further comprises instructions executable by the second processor to identify a user of the animation client computer and/or to determine (e.g., based on an identification of the user and/or a set of payment or billing information) whether to allow the animation client computer to interact with the animation server software.
  • the animation server software comprises instructions executable by the second processor to store the set of position data at a data store (which might be associated with the animation server computer).
  • the animation server software comprises instructions to store a plurality of sets of position data (each of which may be, but need not be, based on a separate set of input data) and/or to track a series of changes to a position of the object, based on the plurality of sets of position data.
  • the animation client computer may be a first animation client computer, and the system may comprise a second animation client computer in communication with the animation server computer.
  • the second animation client computer may comprise a third processor, a second display device, a second input device, and/or second animation client software.
  • the second animation client software may comprise instructions executable by the third processor to accept a second set of input data (which may indicate a desired position for a second object) from the second input device and/or to transmit the second set of input data for reception by the animation server computer.
  • the animation server software may comprise instructions executable by the second processor to receive the second set of input data from the second animation client computer and/or to transmit (e.g., for reception by the second animation client computer) a second set of position data, which may be based on the second set of input data received from the second animation client computer.
  • the second animation client software may further comprise instructions executable by the third processor to receive the second set of position data from the animation server computer and/or to place the second object in the desired position, perhaps based on the second set of position data.
  • the first object and the second object may be the same object.
  • the animation server software might comprise instructions to transmit the second set of position data for reception by the first animation client computer, and the animation client software on the first animation client computer might further comprise instructions to place the object in a position defined by the second set of position data, such that the first display displays the object in a position desired by a user of the second animation client computer.
  • the second set of position data might have no impact on a rendering of the first object on the first client computer, and/or the first set of position data might have no impact on a rendering of the second object on the second client computer.
  • Exemplary devices include a joystick, a game controller, a mouse, a keyboard, a steering wheel, an inertial control system, an optical control system, a full or partial body motion capture unit, an optical, mechanical or electromagnetic system configured to capture the position or motion of an actor, puppet or prop, and/or the like.
  • a system for producing animated works comprises a first animation client computer comprising a first processor, a first display device, at least one first input device, and first animation client software.
  • the system further comprises an animation server computer in communication with the animation client computer and comprising a second processor and animation server software.
  • the first animation client software comprises instructions executable by the first processor to accept a first set of input data from the at least one input device; the first set of input data indicates a desired position for a first object.
  • the first animation client software also comprises instructions to transmit the first set of input data for reception by the animation server computer.
  • the animation server software comprises instructions executable by the second processor to receive the first set of input data from the first animation client computer, to calculate a first set of position data (perhaps based on the first set of input data received from the first animation client computer) and to render the first object, based at least in part on the first set of position data.
  • the first animation client software further comprises instructions to display the first object in the desired position.
  • the system may further comprise a second animation client computer comprising a third processor, a second display device, at least one second input device, and second animation client software.
  • the second animation client software can comprise instructions executable by the third processor to accept a second set of input data from the second input device, the second set of input data indicating a desired position for a second object, and/or to transmit the second set of input data for reception by the animation server computer.
  • the animation server software may further comprise instructions to receive the second set of input data from the second animation client computer and/or instructions to transmit for reception by the second animation client computer a second set of position data, based on the second set of input data received from the second animation client computer.
  • the second animation client software comprises instructions to receive the second set of position data from the animation server computer.
  • the second animation client software may also comprise instructions to place the second object in the desired position for that object, based at least in part on the second set of position data.
  • Another set of embodiments provides animation client computers and/or animation server computers, which may be similar to those described above.
  • a further set of embodiments provides animation software, including software that can be used to operate the systems described above.
  • An exemplary animation software package may be embodied on at least one computer readable medium and may comprise an animation client component and an animation server component.
  • the animation client component might comprise instructions executable by a first computer to accept a set of input data from at least one input device at the first computer and/or to transmit the set of input data for reception by a second computer.
  • the input data may indicate a desired position for an object.
  • the animation server component may comprise instructions executable by a second computer to receive the set of input data from the first computer and/or to transmit for reception by the first computer a set of position data, based on the set of input data received from the first computer.
  • the animation client component may comprise further instructions executable by the first computer to receive the set of position data from the second computer and/or to place the animated object in the desired position, based at least in part on the set of position data.
  • An exemplary method of creating an animated work comprises accepting at an animation client computer a set of input data (which might indicate a desired position for an object) from at least one input device, and/or transmitting the set of input data for reception by an animation server computer.
  • the method further comprises receiving at the animation server computer the set of input data from the animation client computer and/or transmitting for reception by the animation client computer a set of position data, based on the set of input data received from the animation client computer.
  • the set of position data from the animation server computer may be received at the client computer.
  • the method can further include placing the object in the desired position, based at least in part on the set of position data.
  • FIG. 1 is a block diagram of the prior art animation software design illustrating artist and/or programmer PCs connected to a server system for checking out animation data files, processing the animation data files, and returning the animation data files to bulk storage of the animation data at the server, the exemplary server here being shown with version tracking software;
  • FIG. 2 is a block diagram of an animation system in accordance with one set of embodiments;
  • FIG. 3 is a block diagram of an animation system in accordance with another set of embodiments;
  • FIG. 4A is a representation of a model that can be animated by various embodiments of the invention.
  • FIG. 4B is a schematic representation of a rig suitable for deforming the model of FIG. 4A , the rig here having manipulation at the neck, shoulders, elbow, hand, hips, knees, and ankles;
  • FIG. 4C is a schematic representation of texture for placement over the model of FIG. 4A to impart a texture to a portion of the exterior of the model in the form of a man's suit;
  • FIG. 5 is a representation of a scene;
  • FIG. 6 is a generalized schematic drawing illustrating various components of a client/server animation system, in accordance with embodiments of the invention.
  • FIG. 7 is a flow diagram illustrating a method of creating an animated work, in accordance with various embodiments of the invention.
  • FIG. 8 is a generalized schematic drawing of a computer architecture that can be used in various embodiments of the invention.
  • a client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects. This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).
  • a client animation computer accepts input (e.g., via one or more input devices) and provides that input to an animation server computer.
  • the animation client may provide raw input from the input device.
  • the input indicates a desired movement and/or position of an animated character, relative to other objects in a virtual scene.
  • the animation server computer, after receiving the input, calculates a set of data (including, merely by way of example, data describing a deformation of a model, such as joint rotations and/or joint angles) that describes the desired movement and/or position of the character.
  • after calculating the set of joint angles, the animation server computer transmits the set of joint angles to the animation client computer.
  • the animation client computer then renders the animated character in the desired position, based on the set of joint angles, as well as a set of polygons and one or more textures defining the animated character.
  • the term "polygons" broadly refers not only to the traditional polygons used to form a model of an object, but also to any other structures that commonly are used to form a model of an object, including, merely by way of example, NURBS surfaces, subdivision surfaces, level sets, volumetric representations, and point sets, among others.
  • the animation client computer can store some of the files necessary to render the character, and can in fact render the character if provided the proper joint angles. This is beneficial, in many situations, because it relieves the animation server of the relatively processor-intensive task of rendering the animation. This arrangement, however, also allows the server to perform the joint calculations, which, while generally not as processor-intensive as the rendering process, often impose relatively high file input/output (ā€œI/Oā€) requirements, due to the extensive size of the databases used to hold data for performing the calculation of joint angles.
  • This exemplary system provides a distribution of work that takes advantage of the strength of the animation client (that is, the ability to provide a plurality of animation client computers for performing the processor-intensive rendering tasks for various animation projects), while also taking advantage of the strength of typical server computers (that is, the ability to accommodate relatively high file I/O requirements).
  • this contrasts with systems in which a central server provides rendering services; such systems require extremely powerful (and therefore expensive) servers and, in many cases, server farms. Such systems often also feature relatively powerful workstations as animation clients, but the processing power of the workstations is not harnessed for the rendering.
  • This exemplary system provides additional advantages, especially when compared with systems on which the animation (i.e., joint rotation calculation) and rendering processes occur on the animation client.
  • the exemplary system described above facilitates the maintenance of data. For instance, since the joint rotations for a particular animation are calculated at the animation server, they can easily be stored there as well, and a variety of version-tracking and change-management protocols may be employed.
  • the system can be configured to prevent the animation client from accessing sufficient data to independently perform the animation process, preventing unauthorized copying of animations and thereby providing greater security for that intellectual property.
  • model 10 in the form of a human figure is disclosed.
  • model 10 includes face 11, neck 12, and arms 14, with elbows 15 and wrists 16 leading to hands 17.
  • the model further includes hips 18, knees 19, and ankles 20.
  • the "model" (or "virtual model") is a geometric description of the shape of the character in one specific pose (commonly called the "model pose," "neutral pose," or "reference pose").
  • the neutral pose used in the model is commonly a variation on the so-called "da Vinci pose," in which the model is shown standing with eyes and head looking forward, arms outstretched, and legs straight with feet approximately shoulder width apart.
  • the model can be duplicated in any number of ways.
  • a clay model or human model is scanned or digitized, recording the spatial coordinates of numerous points on the surface of the physical model so that a virtual representation of the model may be reconstructed from the data. It is to be understood that such models can be the product of great effort, taking man-years to construct.
  • the model also includes connectivity data (also called an "edge list"). This data is recorded at the time of scanning or inferred from the locations of the points, so that the collection of points can be treated as the vertices of a polygonal approximation of the surface shape of the original physical model. It is common, but not required, in the prior art for various mathematical smoothing and interpolation algorithms to be performed on the virtual model, so as to provide for a smoother surface representation than is achieved with a pure polygonal representation.
  • virtual models commonly include collections of spatial coordinates ranging from hundreds of points to hundreds of thousands or more points.
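  • As a purely illustrative example of the foregoing, a toy virtual model can be written down in a few lines of Python as vertex coordinates plus connectivity data; a production model would carry hundreds of thousands of points plus smoothing and interpolation data.

        # A minimal "model": spatial coordinates for each vertex, plus an edge list so the
        # points can be treated as the vertices of a polygonal approximation of the surface.
        vertices = [
            (0.0, 0.0, 0.0),   # vertex 0
            (1.0, 0.0, 0.0),   # vertex 1
            (0.0, 1.0, 0.0),   # vertex 2
        ]
        edges = [(0, 1), (1, 2), (2, 0)]   # connectivity recorded at scan time or inferred
        triangles = [(0, 1, 2)]            # one polygon of the surface approximation

        # Reconstructing a virtual representation means walking the connectivity data
        # and drawing each polygon from its vertex coordinates.
        for tri in triangles:
            print([vertices[i] for i in tri])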
  • rig 30, which is compatible with model 10 shown in FIG. 4A, is illustrated.
  • rig 30 includes head 31, neck 32, eyes 33, shoulders 34, elbows 35, and wrists 36. Further, hips 38, knees 39, and ankles 40 are also disclosed.
  • rig 30 is mathematically disposed on model 10 so that animation can move the rig 30 at neck 32, shoulders 34, elbows 35, and wrists 36. Further, movement of hips 38, knees 39, and ankles 40 can also occur through manipulation of the rig 30.
  • the rig 30 enables the model 10 to move with realistic changes of shape.
  • the rig 30 thus turns the model 10 into a virtual character commonly required to move and bend, such as at the knees or elbows, in order to convey a virtual performance.
  • the software and data used to deform (or transform) the "neutral pose" model data into a specific "active pose" variation of the model is commonly called a "rig" or "IK rig" (where "IK" is a shortened form of "Inverse Kinematics").
  • Forward Kinematics is the term of art for computing joint locations based on a collection of joint angles and skeletal relationships.
  • a rig is a piece of software which has as its inputs a collection of joint rotations, joint angles, and/or joint locations ("the right elbow is bent 30 degrees" or "the tip of the left index finger is positioned 2 cm above the center of the light switch"), the skeletal relationships between the joints ("the head bone is connected to the neck bone"), and a neutral pose representation of the virtual model, and has as its output a collection of spatial coordinates and connectivity data describing the shape that the virtual actor's body takes when posed as described by the input data.
  • a rig is a visual representation of the skeleton of the virtual actor, with graphical or other controls which allow the artist to manipulate the virtual actor.
  • with an "Inverse Kinematics Rig," the artist might place the mouse on the left index finger of the virtual actor and drag the left index finger across the screen so as to cause the virtual actor's arm to extend in a pointing motion.
  • with a "Forward Kinematics Rig," the artist might click on the elbow of the virtual character and bend or straighten the rotation of the elbow joint by dragging the mouse across the screen or by typing a numeric angle on the keyboard.
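  • The forward-kinematics direction is simple enough to sketch. The following illustrative Python function assumes a two-bone planar arm and computes joint locations from joint angles and bone lengths; an inverse-kinematics rig solves the opposite problem, recovering joint angles from a desired end-effector location.

        import math

        def forward_kinematics(shoulder_deg: float, elbow_deg: float,
                               upper_len: float = 1.0, fore_len: float = 1.0):
            """Compute elbow and wrist locations from joint angles and skeletal relationships."""
            a1 = math.radians(shoulder_deg)
            a2 = a1 + math.radians(elbow_deg)   # the elbow angle is relative to the upper arm
            elbow = (upper_len * math.cos(a1), upper_len * math.sin(a1))
            wrist = (elbow[0] + fore_len * math.cos(a2),
                     elbow[1] + fore_len * math.sin(a2))
            return elbow, wrist

        # "The right elbow is bent 30 degrees": pose the arm and read off the joint locations.
        print(forward_kinematics(shoulder_deg=0.0, elbow_deg=30.0))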
  • texture 50 is illustrated here in the form of a man's suit only, having a coat 51 and pants 52.
  • in practice, texture 50 would include many other surfaces; a human face, socks, shoes, and hands would all be part of the illustrated texture 50.
  • textures allow a virtual actor to be drawn (or "rendered") to the screen with, for example, blue eyes and a red jacket.
  • the virtual model described earlier contains purely spatial data. Additional data and/or software, commonly called "Textures," "Maps," "Shaders," or "Shading," is employed to control the colors used to render the various parts of the model.
  • Vertex Color: In the earliest forms of the prior art, the color (or "texture") information was encoded directly into the model, in the form of a "Vertex Color."
  • a Vertex Color is commonly an RGB triplet (Red 0-255, Green 0-255, Blue 0-255) assigned to a specific vertex in the virtual model.
  • the model may be colored in such a way as to convey blue eyes and a red dress.
  • Texture coordinates are additional data that is recorded with the vertex location data and connectivity data in order to allow an image (or "texture") to be "mapped" onto the surface.
  • a digital image of an eye will be acquired (possibly via a digital camera or possibly by an artist who paints such a picture using computer software).
  • the digital image of the eye is a "texture map" (or "texture").
  • the vertices in the model that comprise the surface of the eye will be tagged with additional data ("texture coordinates") that is analogous to latitude and longitude coordinates on a globe.
  • the texture is "mapped" onto the surface by use of the texture coordinates.
  • when the virtual character is instructed to look to the left (for example, by rotating the neck controls 32 or eye controls 33 in the rig), the virtual model is deformed in a manner which rotates all of the vertices making up the head 11 to the left.
  • the head texture is then rendered in the desired location on the screen based upon the vertex locations and the texture coordinates of those vertices.
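  • The two coloring schemes just described can be illustrated with a few lines of Python; the 2x2 stand-in image and all values below are invented for the example (a real texture map is a full digital image).

        # Vertex color: an RGB triplet (0-255 per channel) stored directly on a vertex.
        vertex_color = (0, 0, 255)                       # a blue eye, encoded per-vertex

        # Texture mapping: vertices carry (u, v) texture coordinates, analogous to
        # latitude and longitude on a globe, which index into a texture image.
        texture = [[(255, 0, 0), (0, 0, 255)],
                   [(0, 255, 0), (255, 255, 255)]]       # a 2x2 stand-in "texture map"

        def sample(u: float, v: float):
            """Map (u, v) in [0, 1] onto the texture image (nearest-neighbor lookup)."""
            h, w = len(texture), len(texture[0])
            return texture[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

        eye_vertex = {"position": (0.1, 1.7, 0.4), "uv": (0.9, 0.2)}
        print(sample(*eye_vertex["uv"]))                 # color used when rendering this vertex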
  • an animated scene comprises some or all of the following: a virtual set (set 60 being shown), one or more props (floor 69, wall 70, chair 61, and table 62 being shown), and one or more virtual characters (model 10 manipulated by rig 30 having texture 50 being shown, here designated only by the numeral 10).
  • Each of the virtual sets, props, and characters has a fiducial reference point with respect to which the location of the element may be specified.
  • the fiducial reference point is shown at 65 .
  • the virtual set 60 , props (see chair 61 and table 62 ), and character 10 are assembled together by specifying their spatial locations using a shared coordinate system from fiducial 65 .
  • the choice of coordinate system is arbitrary, but a common practice is to locate the virtual set at the origin of the coordinate system.
  • the background (or ā€œvirtual setā€) is essentially a virtual character with either no rig (in the case of a purely static virtual set) or what is commonly a very simple rig (where the joint angles might control the opening angles of a door 67 or the joint locations might control the opening height of a window 68 ). It is common to embellish the scene used in an action with a variety of props. As with the background, props are again essentially virtual characters which are used to represent inanimate objects.
  • animation data is associated with the elements of a scene to create an action.
  • a sequence of images may be constructed by providing the rig of character 10 with a sequence of input data (ā€œanimation dataā€) such as 24 sets of joint angles per second, so as to produce a 24 frame per second movie.
  • the animation data provided to the rigs is commonly compressed through the use of various interpolation techniques.
  • the various families of mathematical formulas used to interpolate between key frame values are well known to practitioners of the art.
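  • For instance, linear interpolation, the simplest member of those families, can be sketched as follows; the key frame times and angles are invented for illustration.

        # Key frame animation data: joint angles stored only at key frames, with
        # intermediate values interpolated to yield 24 poses per second.
        keyframes = {0.0: 0.0, 1.0: 90.0}                # time (s) -> elbow angle (degrees)

        def interpolate(t: float) -> float:
            times = sorted(keyframes)
            t0 = max(k for k in times if k <= t)
            t1 = min(k for k in times if k >= t)
            if t0 == t1:
                return keyframes[t0]
            frac = (t - t0) / (t1 - t0)
            return keyframes[t0] + frac * (keyframes[t1] - keyframes[t0])

        animation_data = [interpolate(frame / 24.0) for frame in range(25)]  # one second at 24 fps
        print(animation_data[12])                        # halfway through the second: 45.0 degrees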
  • the artist can move an indicator on the timeline or press a "play" button to watch the animation that she has created.
  • by iteratively adjusting the animation data and replaying the result, the artist can create the desired performance.
  • Motion capture: In the "motion capture" method, the artist performs the motion in some manner while the computer records the motion.
  • Common input devices for use with motion capture include a full-body suit equipped with sensing devices to record the physical joint angles of a human actor and so-called "Waldo" devices which allow a skilled puppeteer to control a large number of switches and knobs with their hands (Waldo devices are most commonly used for recording facial animations). It is common to perform multiple captures of the same motion, during which sequence of captures the actor repeatedly reenacts the same motions until data is collected which is satisfactory both artistically and technically.
  • procedural animation is commonly used when animating non-human actors such as flocks of birds or falling rocks.
  • bird 63 illustrates this technique.
  • in the hybrid method, the motion capture and/or procedural method is first used to specify the initial data.
  • for example, the initial movement of bird 63 would be specified using the procedural method.
  • the data obtained via the motion capture or procedural method is then compressed in a manner that makes it technically similar to (or compatible with) data obtained via the animation method. For example, presuming that bird 63 was going to interact with character 10 in scene 60, modification of the procedural image of bird 63 would occur.
  • the initial data is then manipulated, re-timed, and/or extended through the use of the animation method.
  • the animation software often plays back previously specified animations by interpolating animation data at a specific point in time, providing the interpolated animation data to the rigs, making use of the rigs to deform the models, applying textures to the models, and presenting a rendered image on the display.
  • the animation software then advances to a different point in the timeline and repeats the process.
  • FIG. 2 illustrates a client/server animation system in accordance with a set of embodiments.
  • the system comprises a plurality of animation client computers 200 in communication (e.g., via a network) with server computer 210 as shown in FIG. 2 .
  • the network can be any suitable network, including without limitation a local area network, wide area network, wired and/or wireless network, the Internet, an intranet or extranet, etc.
  • models 10, rigs 30, and/or textures 50 may be stored at the client 200, e.g., at model, rig, and texture storage 201. In this particular embodiment, such local storage has advantages, including minimizing the data that must be transmitted over the network (as described below).
  • the animation client computer may, in some embodiments, include rendering software 203 operatively connected to model, rig, and texture storage 201 .
  • the rendering software may be part of an animation client application.
  • a controller 202 (here shown as a keyboard and mouse) operates through network connection 205 . It should be noted that any suitable controller, including those described in U.S. patent application Ser. No. ______, (attorney docket number 020071-000210), already incorporated by reference, can be used in accordance with various embodiments.
  • the animation server computer 210 includes animation data storage 211, animation software 212, and/or version tracking software 214. Presuming that the artist or programmer has created the models 10, rigs 30, and textures 50, manipulation of an Action on a scene 60 can either occur from the beginning (de novo) or, alternately, the artist and/or programmer may check out a previous version of the Action through the network connection 205 by accessing animation server 210 and retrieving the desired data from animation data storage 211 into animation software 212.
  • input data (e.g., from the controller 202 and/or an actuator thereof) is received by the client 200.
  • the input data may be described by an auxiliary coordinate system, in which case the input data may be processed as described in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference. Other processing may be provided as well, as necessary to format the raw input data received from the controller 202 .
  • the animation client 200 transmits the input (either as raw input data and/or after processing by the animation client computer 200), e.g., via network connection 205, to the animation server computer 210 and, more particularly, to animation software 212 (which might be incorporated in animation server software). Processing of the selected Action will occur at animation software 212 within animation server 210. Such processing will utilize the techniques described above. In particular, a set of joint rotations may be calculated, based on the input data. The joint rotations will describe the position and/or motion desired in the Action.
  • Playback will occur by having animation software 212 emit return animation information through network connection 205 and then to rendering software 203 .
  • rendering software 203 will access model, rig, and texture storage 201 to display at display 204 the end result of modifications introduced by the artist and/or programmer at the client 200.
  • animation server 210 (and/or another server in communication therewith) can provide a number of services.
  • the server 210 can provide access control; for instance, client 200 is required to log in to server 210 .
  • subscription may be required as a prerequisite for access to server 210 .
  • server 210 can deliver different capabilities for different users.
  • PC 200A can be restricted to modification of character motion while PC 200 modifies animation of bird 63.
  • the client 200 controls the time when playback starts and stops for any individual Action. Moreover, the client 200 may arbitrarily change the portion of the Action being worked on by simply referring to that Action at a specified time period.
  • because the rendering software 203 and the model, rig, and/or texture data in storage 201 are resident at the client, the data transmitted over the network through network connection 205 is maintained at a minimum. Specifically, only a small section of data need be transmitted: that which is needed to play the animation (e.g., a set of joint rotations, etc.). As the rendering software 203 and some or all of the model, rig, and/or texture storage 201 may be resident at PC 200, only small batches of data need be transmitted over the Internet, as the rough comparison below illustrates.
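  • A back-of-the-envelope comparison, with all figures assumed purely for illustration, shows why streaming joint rotations is so much cheaper than streaming deformed geometry each frame.

        # Assumed figures: 50 joints, 3 floats per joint, 4 bytes per float, 24 fps,
        # versus a modest film-quality model of 100,000 vertices at 3 floats each.
        joints, floats_per_joint, bytes_per_float, fps = 50, 3, 4, 24
        vertices = 100_000

        rotation_stream = joints * floats_per_joint * bytes_per_float * fps   # 14,400 B/s (~14 KB/s)
        geometry_stream = vertices * 3 * bytes_per_float * fps                # 28,800,000 B/s (~29 MB/s)
        print(rotation_stream, geometry_stream)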
  • the server 210 is useful in serving multiple clients. Further, the server 210 can act as a studio, providing the artist and/or programmer at client 200 with a full range of services, including storage and delivery of updated model, texture, and rig data to client 200 and client 200A. In a set of embodiments, the server 210 will store all animation data. Furthermore, through version tracking software 214, animation data storage 211 will provide animation data (such as joint rotations, etc.) to the respective client 200A on an as-needed basis.
  • client 300 includes a network connection 305 , a controller 302 , and a display 304 .
  • Server 310 includes animation data storage 211 , version tracking software 214 , and animation software 212 .
  • server 310 includes rendering software 303 .
  • the manipulation of the animation software from controller 302 through network connection 305 of the client 300 is identical to that shown in FIG. 2 .
  • the animation software for calculating joint rotations, etc. is resident on the server 310 .
  • the rendering component also resides on the server 310 .
  • rendering software 303 will generate actual images (e.g., bitmaps, etc.), which images will be sent through the network to network connection 305 and may be displayed thereafter at display 304 .
  • FIG. 6 provides a generalized schematic diagram of a client/server system in accordance with some embodiments of the invention.
  • the system 600 includes an animation server computer 605, which may be a PC server, minicomputer, mainframe, etc. running any of a variety of available operating systems, including UNIX™ (and/or any of its derivatives, such as Linux, BSD, etc.), various varieties of Microsoft Windows™ (e.g., NT™, XP™, 2003, Vista™, Mobile™, CE™, etc.), Apple's Macintosh OS™, and/or any other appropriate server operating system.
  • the animation server computer also includes animation server software 610 , which provides animation services in accordance with embodiments of the invention.
  • the animation server computer 605 may also comprise (and/or have associated therewith) one or more storage media 615 , which can include storage for the animation server software 610 , as well as a variety of associated databases (such as a database of animation data 615 a , a data store 615 b for model data, such as the polygons and textures that describe an animated character, a data store 615 c for scene data, and any other appropriate data stores).
  • the system 600 further comprises one or more animation client computers 620 , one or more of which may include local storage (not shown), as well as animation client software 625 .
  • the rendering subsystem may reside on the animation server 605, as described with respect to FIG. 3, for example. In this way, thin clients, such as wireless phones, PDAs, etc., may be used to provide input even if they have insufficient processing power to render the objects.
  • the animation client computer 620 thus may be, inter alia, a PC, workstation, laptop, tablet computer, PDA, wireless phone, etc. running any appropriate operating system (such as Apple's Macintosh OS™, UNIX™ and/or its derivatives, Microsoft Windows™, etc.).
  • Each animation client 620 may also include one or more display devices 630 (such as monitors, LCD panels, projectors, etc.) and/or one or more input devices 635 (such as the controllers described above and in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference, as well as, to name but a few examples, a telephone keypad, a stylus, etc.).
  • the system 600 may operate in the following exemplary manner, which is described by additional reference to FIG. 7 , which illustrates a method 700 of creating an animated work in accordance with some embodiments of the invention.
  • the animation client software 625 comprises instructions executable by the animation client computer 620 to accept a set of input data from one or more input devices (block 705 ).
  • the input data may, for example, indicate a desired position of an object in a scene (which may be a virtual scene, a physical set, etc.)
  • the object may be an animated object, which may comprise a plurality of polygons and/or textures, as described above.
  • the animation client software optionally may process the input data, for example as described above.
  • the animation client software then transmits the set of input data for reception by the animation server computer (block 710 ).
  • the animation server computer 605 receives the input data (block 715 ).
  • the animation server software 610 calculates a set of position data (block 720 ), based on the received input data.
  • calculating the set of position data can include processing the input data to determine a desired position of an animated object and/or calculating a set of joint rotations defining that desired position (and/or defining the deformation of a rig defining the character, in order to place the character in the desired position).
  • the position can be determined based solely on the input data, perhaps in conjunction with a current position of the object.
  • the object may be an animated character (or other object in a virtual scene), and the position of the object in the scene may be affected by the position of a virtual camera and/or light source.
  • the position data might comprise data about the position and/or orientation of the virtual camera/light.
  • the animation server computer 605 (perhaps based on instructions from the server software 610 ) then transmits the set of position data (e.g., joint rotations, etc.) for reception by the animation client 620 (block 725 ).
  • the animation client computer receives the set of position data (block 730 )
  • the animation client software 625 is responsible for placing the object in the desired position (block 735 ).
  • This procedure necessarily will vary according to the nature of the object.
  • placing the object in the desired position generally will comprise rendering the animated character in the desired position, for example by calculating a set of positions for the polygons that describe the character and/or by applying any necessary textures to the model.
  • placing the object in the desired position may require interfacing with a movement system, which is not illustrated on FIG. 6 but examples of which are described in detail in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference.
  • the object (for instance, if the object is a virtual object) may be displayed on a display device 630 (block 740 ).
  • the object may be displayed in the desired position.
  • the client 620 may be configured to upload the rendered object to the animation server 605 for storage and/or distribution to other computers (which might be, inter alia, other animation servers and/or clients).
  • the system 600 may provide a number of other features, some of which are described above.
  • the animation server 605 can provide animation services to a plurality of animation client computers (e.g., 620 a , 620 b ).
  • input may be received at a first client 620 a
  • the position data may be transmitted to a second client 620 b for rendering and/or display.
  • the plurality of client computers 620 may perform rendering tasks in parallel for a given scene.
  • each client 620 a , 620 b accepts input and receives position data, such that two artists may collaborate on a given character and/or scene, each being able to view changes made by the other.
  • each client 620 a , 620 b may interact individually with the server 605 , with each client 620 providing its own input and receiving position data based on that input. (That is, the position data received by one client has no impact on the rendering of an object on another client.)
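  • Illustratively, the two modes of operation just described differ only in whether the server fans position data out to every registered client or returns it solely to the submitting client, as in this hypothetical Python sketch.

        class Client:
            def __init__(self, name: str) -> None:
                self.name = name

            def receive(self, position_data: dict) -> None:
                print(self.name, "renders", position_data)   # each recipient renders/displays the update

        class Server:
            def __init__(self, shared_scene: bool) -> None:
                self.shared_scene, self.clients = shared_scene, []

            def register(self, client: Client) -> None:
                self.clients.append(client)

            def submit(self, sender: Client, position_data: dict) -> None:
                # Shared-scene mode broadcasts to all clients; independent mode replies to the sender only.
                for client in (self.clients if self.shared_scene else [sender]):
                    client.receive(position_data)

        server = Server(shared_scene=True)
        a, b = Client("client-620a"), Client("client-620b")
        server.register(a)
        server.register(b)
        server.submit(a, {"elbow": 30.0})                    # both artists see the change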
  • the animation server software 610 may be configured not only to calculate the position data, but also to render the object (which can include, merely by way of example, not only applying one or more textures to a model of the object, but also calculating the positions of the polygons that make up the model, based on the position data).
  • the rendered object may be provided to an animation client computer 620 (which may or may not be the same client computer that provided the input on which the position data is based), which then can display the object in the desired position.
  • the animation server 605 might render a first object for a first client 620 and might merely provide to a second client a set of position data describing a desired position of a second object.
  • one or more of the data stores may be used to store object definition files, which can include some or all of the information necessary for rendering a given object, such as the model, rig, polygons, textures, etc. describing that object.
  • An animation client 620 then can be configured to download from the server 605 the object definition files (and/or a subset thereof) to perform the rendering of the object in accordance with embodiments of the invention. It should be noted, however, that for security, the downloaded object definition files (and/or portions thereof) may be insufficient to allow a user of the client 620 to independently recreate the object without additional data resident on the server.
  • the system 600 may be configured such that a user of the client 620 is not allowed to modify these object definition files locally at the client 620 and/or, if local modification is allowed, the client 620 may not be allowed to upload modified object definition files. In this way, the system 600 can prevent the unauthorized modification of a ā€œmaster copyā€ of the object definition files.
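  • Merely by way of illustration, the download and upload restrictions described above might be sketched as follows; the ObjectStore name, the file layout, and the rule of withholding the rig are assumptions made purely for illustration:

    class ObjectStore:
        """Holds master copies of object definition files server-side."""
        def __init__(self):
            self.files = {
                "hero": {"model": "hero.model", "rig": "hero.rig",
                         "textures": ["suit.png"]},
            }

        def download(self, name):
            # Provide only what the client needs to render a posed
            # model; withholding the rig leaves the client unable to
            # independently recreate the object.
            full = self.files[name]
            return {k: v for k, v in full.items() if k != "rig"}

        def upload(self, name, new_files, user_may_modify=False):
            # Reject modified files unless the user has privileges,
            # protecting the "master copy" of the definitions.
            if not user_may_modify:
                raise PermissionError("client may not modify master copy")
            self.files[name].update(new_files)

    store = ObjectStore()
    print(store.download("hero"))  # contains model and textures, no rig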
  • the server software 610 may be configured to allow modified object definition files to be uploaded (and thus to receive such files), perhaps based on an identification of the user of the animation client computer; that is, the server 605 may be configured to identify the user and determine whether the user has sufficient privileges to upload modified files. (It should be noted that the identification, authentication and/or authorization of users may be performed either by the animation server 605 and/or by another server, which might communicate such identification, authorization and/or authentication data to the animation server 605.)
  • the animation server software 610 may be configured to determine whether to allow an animation client 620 to interact with the server software 610 .
  • the animation server software 610 may control access to rendered objects, object definition files, the position data, and/or the software components used to create any of these, based on any number of factors.
  • the server software 610 (and/or another component) may be configured to identify, authenticate and/or authorize a user of the animation client 620 .
  • the animation server software 610 may determine whether it will receive input from the client computer 620 , whether it will provide position data to the animation client computer 620 and/or whether it will allow the animation client computer 620 to access files and/or animation services on the animation server 605 .
  • the animation server 605 may be configured to provide for-fee services.
  • the animation server software (and/or another component) may be configured to evaluate a set of payment and/or billing information (which may be, but is not necessarily, associated with an identity of a user of the animation client computer 620 ), and based on the set of payment and/or billing information, determine whether to allow the client 620 to interact with the server software 610 (including, as mentioned above, whether it will accept/provide data and/or allow access to files and/or services).
  • the set of billing and/or payment data can include, without limitation, information about whether a user has a subscription for animation services and/or files, whether the user has paid a per-use fee, whether the user's account is current, and/or any other relevant information.
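  • For illustration only, such a payment/billing gate might look like the following sketch; the Account fields and the service name are hypothetical, and a deployed system could weigh any of the factors listed above:

    from dataclasses import dataclass

    @dataclass
    class Account:
        has_subscription: bool = False
        paid_per_use: bool = False
        is_current: bool = True

    def may_interact(account, service):
        """Decide whether a client may use a given animation service."""
        if not account.is_current:
            return False           # delinquent accounts are refused
        if account.has_subscription:
            return True            # subscribers get full access
        # A per-use fee unlocks only the single paid-for service.
        return account.paid_per_use and service == "position_calculation"

    print(may_interact(Account(paid_per_use=True), "position_calculation"))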
  • various levels of interaction with the server software 610 may be allowed.
  • the animation server computer 605 stores a plurality of sets of rendered objects and/or object definition files (wherein, for example, each set of files comprises information describing a different animated character)
  • the animation server 605 may allow an unregistered user to download files for a few "free" characters, while paid subscribers have access to files for an entire library of characters (it should be appreciated that there may be various levels of subscription, with access to files for correspondingly various numbers of characters).
  • a user may be allowed to pay a per-character fee for a particular character, upon which the user is allowed to download the set of files for that character.
  • a user may be given access to services and/or data on the animation server.
  • the user may use animation services (including without limitation those described above) for that animated character.
  • a user may have a monthly subscription to use files for a set of animated characters, and the user may use the animation server as part of the monthly subscription. Other for-fee uses are possible as well.
  • a user may pay, for example, a per-use and/or subscription fee for access to the services of an animation server, apart from any fees that might be paid for the use of object definition files.
  • the animation server software 610 may also be configured to perform change tracking and/or version management of object definition files (and/or rendered objects, position data, etc.).
  • any of several known methods of change tracking and/or version management may be used for this purpose.
  • the change tracking/version management functions may be configured to allow various levels of access to files based on an identity of a user and/or a project that the identified user is working on.
  • an artist in a group working on a particular character, scene, film, etc. may be authorized to access (as well, perhaps, as download) files related to that character, scene, film, while a manager or senior artist might be authorized to modify such files.
  • An artist working on another project might not have access to any such files.
  • the animation server software 610 may also be configured to distribute (e.g., to other clients and/or servers) a set of modified object definition files, such that each user has access to the most recent version of these files. As described above, access to a distribution of these modified files may be controlled based on an identity of the user, various payment or billing information, etc.
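  • A minimal sketch of such change tracking might resemble the following; the VersionStore name and the linear per-file history are assumptions, and any of the known version-management methods mentioned above could be substituted:

    class VersionStore:
        """Keeps a simple linear version history per file."""
        def __init__(self):
            self.history = {}  # path -> list of (user, content) versions

        def check_in(self, path, user, content, authorized_users):
            # Only users authorized for this file may add a version.
            if user not in authorized_users:
                raise PermissionError(f"{user} may not modify {path}")
            self.history.setdefault(path, []).append((user, content))

        def latest(self, path):
            # Distribution step: every client fetches the newest version.
            return self.history[path][-1][1]

    store = VersionStore()
    store.check_in("hero.rig", "senior_artist", b"v2", {"senior_artist"})
    print(store.latest("hero.rig"))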
  • Embodiments of the invention can be configured to protect stored and/or transmitted data, including without limitation object definition files, rendered objects, input data, position data, and the like.
  • data can be protected in a variety of ways.
  • data may be protected with access control mechanisms, such as those described above.
  • other protection measures may be implemented as well.
  • data may be encrypted prior to being stored at an animation server and/or prior to being transmitted between an animation server and an animation client, to prevent unauthorized access to such data.
  • data may be digitally signed and/or certified before storage and/or before transmission between computers. Such signatures and/or certifications can be used, inter alia, to verify the identification of an entity that created and/or modified such data, which can also facilitate change tracking and/or version management of various data used by embodiments of the invention.
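  • As one hedged illustration of signing transmitted data, an HMAC over a serialized set of position data might be used; HMAC-SHA256 and the shared key below merely stand in for whatever signature or certificate scheme a given deployment actually employs:

    import hashlib
    import hmac
    import json

    SHARED_KEY = b"per-user secret provisioned out of band"  # assumption

    def sign(position_data):
        # Serialize deterministically, then compute an HMAC tag that
        # identifies the entity that produced this data.
        payload = json.dumps(position_data, sort_keys=True).encode()
        tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
        return payload, tag

    def verify(payload, tag):
        expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    payload, tag = sign({"shoulder": 0.52, "elbow": 0.26})
    assert verify(payload, tag)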
  • FIG. 8 provides a generalized schematic illustration of one embodiment of a computer system 800 that can perform the methods of the invention and/or the functions of a computer, such as the animation server and client computers described above.
  • FIG. 8 is meant only to provide a generalized illustration of various components, any of which may be utilized as appropriate.
  • the computer system 800 can include hardware components that can be coupled electrically via a bus 805, including one or more processors 810; one or more storage devices 815, which can include without limitation a disk drive, an optical storage device, and/or a solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable and/or the like (and which can function as a data store, as described above).
  • Also in communication with the bus 805 can be one or more input devices 820, which can include without limitation a mouse, a keyboard and/or the like; one or more output devices 825, which can include without limitation a display device, a printer and/or the like; and a communications subsystem 830, which can include without limitation a modem, a network card (wireless or wired), an infra-red communication device, and/or the like.
  • the computer system 800 also can comprise software elements, shown as being currently located within a working memory 835 , including an operating system 840 and/or other code 845 , such as the application programs (including without limitation the animation server and client software) described above and/or designed to implement methods of the invention.

Abstract

Various embodiments of the invention provide novel software, systems and methods for animation and/or filmmaking. In a set of embodiments, for example, a client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects. This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present disclosure may be related to the following commonly assigned applications/patents:
  • This application claims priority from co-pending U.S. Provisional Patent Application No. 60/623,414 filed Oct. 28, 2004 by Alvarez et al. and entitled "Client/Server-Based Animation Software."
  • This application also claims priority from co-pending U.S. Provisional Patent Application No. 60/623,415 filed Oct. 28, 2004 by Alvarez et al. and entitled "Control Having Interchangeable Coordinate Control Systems."
  • This application is also related to co-pending U.S. patent application Ser. No. ______, filed on even date herewith by Alvarez et al. and entitled "Camera and Animation Controller, Systems and Methods."
  • The respective disclosures of these applications/patents are incorporated herein by reference in their entirety for all purposes.
  • COPYRIGHT STATEMENT
  • A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
  • FIELD OF THE INVENTION
  • The present invention relates to the field of animation and filmmaking in general and, in particular, to software, systems and methods for creating and/or editing animations and/or films, including any type of film-based and/or digital still and/or video image production.
  • BACKGROUND OF THE INVENTION
  • Animated films have long been favorites of both children and adults. More recently, advances in computer animation have facilitated the process of making animated films (and in storyboarding and/or adding computer effects to live-action films). Generally, animation has been performed on a PC equipped with a display, a keyboard, a mouse, animation software and rendering software. Each PC is a standalone unit that contains the animation data to be worked on and runs the animation software that applies to that data the contributions and movements imparted by the programmer or artist at the PC.
  • In some cases, especially in large organizations, networked computers have been used in the animation process. Referring to FIG. 1, a typical network might comprise a central server system S with version tracking software 100, which stores the animation data files in bulk storage 101. When a user wishes to perform an animation task, she accesses the server, checks out the relevant data files, and alters the animation data files with animation software 111 and actuators 115 (here represented as a keyboard and mouse) resident in the PC. In order to check her work, the artist replays the altered animation data locally through rendering software 112 resident on the PC, viewing the animation data movements at the PC's local display 114. Finally, and at the end of a working session, the user often will check in the altered data files as a new version added to bulk storage 101 and tracked by version tracking software 100.
  • Other systems perform animation and rendering at a server (and/or a server farm). These systems generally require very powerful servers, because the server(s) have to render (i.e., generate an image or sequence of images for) animations produced by many client workstations. Moreover, users may suffer long delays while waiting for a scene to be rendered by the server(s).
  • Such systems present significant limitations. For instance, the traditional server-based animation systems fail to take maximum advantage of available computing resources. While most artists have relatively high-performance workstations, systems that render animations at a server often fail to take full advantage of the processing available on the workstation. Conversely, systems that rely on the workstation to perform the animation and rendering fail to take advantage of a principal strength of server-class computers: high-performance file input/output. While rendering generally is a very processor-intensive task, animating generally is less processor-intensive but involves accessing a large amount of data, a task at which servers generally excel.
  • Moreover, client-based animation systems make the management of data (including version control, security of intellectual property, etc.) quite cumbersome. Merely by way of example, if two or more programmers or artists are working upon otherwise substantially identical portions of an animation, incompatible variations can be introduced. These incompatible variations are a direct result of the local temporary storage of the modified data. Because the final animation project typically represents many man-years of effort, the presence of incompatible variations can present severe complications.
  • Additionally, modern animation includes the use of expensive tools and processes to generate models of the three-dimensional shapes describing objects or characters used in the animation. Local, unsupervised and inconsistent modification of the models in checked-out animation data can occur. Further, if a model is modified during the animation process, animation data previously recorded must, in the usual case, be completely reworked.
  • Furthermore, both the animation software and the work product of the animators are subject to a high risk of piracy. By storing the suite of animation software at the local PC 110 and/or allowing a user to obtain all relevant files related to an animation, the producer of a movie exposes these assets to unauthorized copying.
  • Hence, existing systems, which generally concentrate the animation and rendering tasks together on either a server or a client, suffer significant drawbacks.
  • Definition of Terms
  • Certain terms, as used in this disclosure, have the following defined meanings:
  • Model. A three-dimensional shape, usually described in terms of coordinates and mathematical data, describing the shape of any character or object. Examples of characters include actors, animals, or other beings whose animation can tell or portray the story. In the usual case, the model is typically provided in a neutral pose (known in the art as a "da Vinci pose"), in which the model is shown standing with limbs spread apart and head looking forward. It is understood in the art that, in many situations, the generation of the model can be extraordinarily expensive. In some cases, the model is generated, scanned or otherwise digitized with recorded spatial coordinates of numerous points on its surface. A virtual representation of the model can occur when the data is reconstructed. Furthermore, the model may include connectivity data, such that the collection of points defining the model can be treated as the vertices of polygonal approximations of the surface shape of the model. The model may include various mathematical smoothing and/or interpolation algorithms. Such models can include collections of spatial points ranging from hundreds of points to hundreds of thousands or more points.
  • Render. To make a model viewable as an image, such as by applying textures to a model and/or imaging the model using a real or virtual camera or by photographing a real object.
  • Rig. In general, the term "rig" is used to refer to a deformation engine that specifies how movement of the model should translate into animation of a character based on the model. This is the software and data used to deform or transform the "neutral pose" of the model into a specific "active pose" variation of the model. Taking the example of the human figure, the rig would impart to the model the skeletal joint movement including shoulder, elbow, hand, finger, neck, head, hip, knee, and foot movement and the like. By having animation software manipulate a rig incorporated into a model, animated movement of the model is achieved.
  • Texture. In the usual modern case, one or more digital images are mapped onto the surface of a model to provide the imagery portrayed by the model as manipulated by the rig.
  • Virtual Character. The model as deformed by the rig and presented by the texture in animation.
  • Virtual Set. The vicinity or fiducial reference point and coordinate system with respect to which the location of any element may be specified.
  • Prop. An object on the virtual set usually comprising a model without a rig.
  • Scene. A virtual set, one or more props, and one or more virtual characters.
  • Action. Animation associated with a scene. It should be noted that upon editing of the final animation story, portions of an action may be distributed without regard to time, for example at the beginning, middle and end of the animation story.
  • Editing. The process by which portions of actions are assembled to construct a story, narrative, or other product.
  • Actuator. A device such as a mouse or keyboard on a personal computer enabling input to the animation software. This term includes our novel adaptation of a "game controller" for imparting animation to characters.
  • BRIEF SUMMARY OF THE INVENTION
  • Various embodiments of the invention provide novel software, systems and methods for animation and/or filmmaking. In a set of embodiments, for example, a client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects. This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).
  • One set of embodiments, for example, provides systems that can be used in the filmmaking process and/or systems for producing animated works. An exemplary system, in accordance with some embodiments, includes an animation client computer, which may comprise a first processor, a display device, at least one input device, and/or animation client software. The system may further include an animation server computer comprising a second processor and animation server software.
  • In certain embodiments, the animation client software may comprise instructions executable by the first processor to accept a set of input data from the at least one input device. The set of input data may indicate a desired position for an animated object, which might comprise a set of one or more polygons and/or a set of one or more textures to be applied to the set of one or more polygons. The animation client software might further comprise instructions executable by the first processor to transmit the set of input data for reception by the animation server computer.
  • The animation server software can comprise instructions executable by the second processor to receive the set of input data from the animation client computer and/or to process the input data to determine the desired position of the animated object. The animation server software may also comprise additional instructions executable by the second processor to calculate a set of joint rotations defining the desired position of the animated object and/or to transmit the set of joint rotations for reception by the animation client computer.
  • The animation client software, then, may comprise further instructions executable by the first processor to receive the set of joint rotations defining the position of the animated object and/or to calculate (perhaps based on the set of joint rotations) a set of positions for the set of one or more polygons. There may also be additional instructions executable by the first processor to apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position. The rendered animated object then may be displayed by the animation client, and/or the set of joint rotations may be stored at a data store associated with the animation server computer.
  • In a particular embodiment, the animation client computer may be a plurality of animation client computers including a first animation client computer and a second animation client computer. The first animation client computer might comprise the input device(s), while the second animation client computer might comprise the display device(s). The animation server computer then, might receive the set of input data from the first animation client computer and/or transmit the set of joint rotations for reception by the second animation client computer, which might be configured to receive the set of joint rotations, calculate a set of positions for the set of one or more polygons based on the set of joint rotations, apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position, and/or display on the display device the rendered animated object.
  • In another set of embodiments, the animation client software may comprise instructions executable by the first processor to accept a set of input data (which might indicate a desired position for an object) from the at least one input device and/or instructions executable by the first processor to transmit the set of input data for reception by the animation server computer. In some embodiments, the animation server software comprises instructions executable by the second processor to receive the set of input data from the animation client computer and/or to transmit for reception by the animation client computer a set of position data, perhaps based on the set of input data received from the animation client computer. The animation client software might further comprise instructions executable by the first processor to receive the set of position data from the animation server computer and/or to place the object in the desired position, based at least in part on the set of position data.
  • In various embodiments, the object can be a virtual object (including without limitation a virtual camera, a virtual light source, etc.) and/or a physical object (including without limitation a device, such as a camera, a light source, etc., in communication with the animation client computer, and/or any other appropriate object). Merely by way of example, the object may be an animated character, which might comprise a set of polygons and at least one texture, such that placing the object in the desired position comprises rendering the animated character in the desired position.
  • In one aspect, the set of position data might comprise data (such as joint rotations, joint angles, etc.) defining a position of the object and/or defining a deformation of a rig describing the object. In another aspect, the set of position data might comprise a position and/or orientation of a real or virtual camera; the position of the object in the scene may be affected by the position and/or orientation of the real or virtual camera, such that the placement of the object depends on the position and/or orientation of the real or virtual camera.
  • In certain embodiments, the animation server has an associated data store configured to hold a set of one or more object definition files for the animated object, the set of one or more object definition files collectively specifying a set of polygons and textures that define the object (e.g., the object definition files may comprise one or more textures associated with the object). Hence, the animation client software may comprise instructions executable by the first processor to download from the animation server computer at least a portion of the set of one or more object definition files necessary to render the object. In some cases, however, the downloaded portion of the set of one or more object definition files may be insufficient to independently recreate the animated object without additional data, which might be resident on the animation server computer. Similarly, in some configurations, the animation client computer might be unable to upload to the animation server computer any modifications of the at least a portion of the set of one or more object definition files.
  • In other configurations, the animation client software comprises further instructions executable by the first processor to modify the object definition files to produce a set of modified object definition files. Optionally, the animation server software comprises instructions executable by the second processor to receive the set of modified object definition files and/or to track changes to the set of object definition files. In some cases, the animation server computer may be configured to identify a user of the animation client computer and/or to determine whether to accept the set of modified object definition files, perhaps based on an identity of the user of the animation client computer. In other cases, the animation server computer may be configured to distribute the set of modified object definition files to a set of animation client computers comprising at least a second animation client computer.
  • In some embodiments, the data store is configured to hold a plurality of sets of one or more object definition files for a plurality of animated objects. Optionally, the animation server software might comprise further instructions executable by the second processor to determine whether to provide to the animation client computer one or more of the sets of the object definition files, based on, for example, a set of payment or billing information and/or an identity of a user of the animation client computer.
  • In other embodiments, the animation server software further comprises instructions executable by the second processor to identify a user of the animation client computer and/or to determine (e.g., based on an identification of the user and/or a set of payment or billing information) whether to allow the animation client computer to interact with the animation server software.
  • In further embodiments, the animation server software comprises instructions executable by the second processor to store the set of position data at a data store (which might be associated with the animation server computer). In an aspect, the animation server software comprises instructions to store a plurality of sets of position data (each of which may be, but need not be, based on a separate set of input data) and/or to track a series of changes to a position of the object, based on the plurality of sets of position data.
  • In a particular set of embodiments, the animation client computer is a first animation client computer, and the system comprises a second animation client computer in communication with the animation server computer. The second animation client computer may comprise a third processor, a second display device, a second input device, and/or second animation client software.
  • The second animation client software may comprise instructions executable by the third processor to accept a second set of input data (which may indicate a desired position for a second object) from the second input device and/or to transmit the second set of input data for reception by the animation server computer. The animation server software may comprise instructions executable by the second processor to receive the second set of input data from the second animation client computer and/or to transmit (e.g., for reception by the second animation client computer) a second set of position data, which may be based on the second set of input data received from the second animation client computer. The second animation client software may further comprise instructions executable by the third processor to receive the second set of position data from the animation server computer and/or to place the second object in the desired position, perhaps based on the second set of position data.
  • The first object and the second object may be the same object. Accordingly, in some cases, the animation server software might comprise instructions to transmit the second set of position data for reception by the first animation client computer, and the animation client software on the first animation client computer might further comprise instructions to place the object in a position defined by the second set of position data, such that the first display displays the object in a position desired by a user of the second animation client computer. In other cases (e.g., if the first object and the second object are not the same object), the second set of position data might have no impact on a rendering of the first object on the first client computer, and/or the first set of position data might have no impact on a rendering of the second object on the second client computer.
  • A variety of input devices may be used. Exemplary devices include a joystick, a game controller, a mouse, a keyboard, a steering wheel, an inertial control system, an optical control system, a full or partial body motion capture unit, an optical, mechanical or electromagnetic system configured to capture the position or motion of an actor, puppet or prop, and/or the like.
  • In another set of embodiments, a system for producing animated works comprises a first animation client computer comprising a first processor, a first display device, at least one first input device, and first animation client software. The system further comprises an animation server computer in communication with the animation client computer and comprising a second processor and animation server software.
  • The first animation client software comprises instructions executable by the first processor to accept a first set of input data from the at least one input device; the first set of input data indicates a desired position for a first object. The first animation client software also comprises instructions to transmit the first set of input data for reception by the animation server computer. The animation server software comprises instructions executable by the second processor to receive the first set of input data from the first animation client computer, to calculate a first set of position data (perhaps based on the first set of input data received from the first animation client computer) and to render the first object, based at least in part on the first set of position data. The first animation client software further comprises instructions to display the first object in the desired position.
  • The system may further comprise a second animation client computer comprising a third processor, a second display device, at least one second input device, and second animation client software. The second animation client software can comprise instructions executable by the third processor to accept a second set of input data from the input device, the set of input data indicating a desired position for a second object and/or to transmit the second set of input data for reception by the animation server computer. The animation server software may further comprise instructions to receive the second set of input data from the second animation client computer and/or instructions to transmit for reception by the second animation client computer a second set of position data, based on the second set of input data received from the second animation client computer.
  • In some cases, the second animation client software comprises instructions to receive the second set of position data from the animation server computer. The second animation client software may also comprise instructions to place the second object in the desired position for that object, based at least in part on the second set of position data.
  • Another set of embodiments provides animation client computers and/or animation server computers, which may be similar to those described above.
  • A further set of embodiments provides animation software, including software that can be used to operate the systems described above. An exemplary animation software package may be embodied on at least one computer readable medium and may comprise an animation client component and an animation server component. The animation client component might comprise instructions executable by a first computer to accept a set of input data from at least one input device at the first computer and/or to transmit the set of input data for reception by a second computer. The input data may indicate a desired position for an object.
  • The animation server component may comprise instructions executable by a second computer to receive the set of input data from the first computer and/or to transmit for reception by the first computer a set of position data, based on the set of input data received from the first computer. The animation client component, then, may comprise further instructions executable by the first computer to receive the set of position data from the second computer and/or to place the animated object in the desired position, based at least in part on the set of position data.
  • Still another set of embodiments provides methods, including without limitation methods that can be implemented by the systems and/or software described above. An exemplary method of creating an animated work comprises accepting at an animation client computer a set of input data (which might indicate a desired position for an object) from at least one input device, and/or transmitting the set of input data for reception by an animation server computer. In some cases, the method further comprises receiving at the animation server computer the set of input data from the animation client computer and/or transmitting for reception by the animation client computer a set of position data, based on the set of input data received from the animation client computer. The set of position data from the animation server computer may be received at the client computer. The method can further include placing the object in the desired position, based at least in part on the set of position data.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A further understanding of the nature and advantages of the present invention may be realized by reference to the remaining portions of the specification and the drawings wherein like reference numerals are used throughout the several drawings to refer to similar components. In some instances, a sublabel is associated with a reference numeral to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sublabel, it is intended to refer to all such multiple similar components.
  • FIG. 1 is a block diagram of the prior art animation software design illustrating artist and/or programmer PCs connected to a server system for checking out animation data files, processing the animation data files, and returning the animation data files to bulk storage of the animation data at the server, the exemplary server here being shown with version tracking software;
  • FIG. 2 is a block diagram of an animation system in accordance with one set of embodiments;
  • FIG. 3 is a block diagram of an animation system in accordance with another set of embodiments;
  • FIG. 4A is a representation of a model that can be animated by various embodiments of the invention;
  • FIG. 4B is a schematic representation of a rig suitable for deforming the model of FIG. 4A, the rig here having manipulation at the neck, shoulders, elbow, hand, hips, knees, and ankles;
  • FIG. 4C is a schematic representation of texture for placement over the model of FIG. 4A to impart a texture to a portion of the exterior of the model in the form of a man's suit;
  • FIG. 5 is a representation of a scene;
  • FIG. 6 is a generalized schematic drawing illustrating various components of a client/server animation system, in accordance with embodiments of the invention;
  • FIG. 7 is a flow diagram illustrating a method of creating an animated work, in accordance with various embodiments of the invention; and
  • FIG. 8 is a generalized schematic drawing of a computer architecture that can be used in various embodiments of the invention.
  • DETAILED DESCRIPTION
  • Various embodiments of the invention provide novel software, systems and methods for animation and/or filmmaking (the term "filmmaking" is used broadly herein to connote creating and/or producing any type of film-based and/or digital still and/or video image production, including without limitation feature-length films, short films, television programs, etc.). In a set of embodiments, for example, a client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects. This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).
  • Merely by way of example, in a set of embodiments, an animation client computer accepts input (e.g., via one or more input devices) and provides that input to an animation server computer. (In some cases, the animation client may provide raw input from the input device.) The input indicates a desired movement and/or position of an animated character, relative to other objects in a virtual scene. The animation server computer, after receiving the input, calculates a set of data (including, merely by way of example, data describing a deformation of a model, such as joint rotations and/or joint angles) that describes the desired movement and/or position of the character. (The use of joint rotations in animation is described below.) After calculating the set of joint angles, the animation server computer transmits the set of joint angles to the animation client computer. The animation client computer then renders the animated character in the desired position, based on the set of joint angles, as well as a set of polygons and one or more textures defining the animated character. (As used herein, the term "polygons" broadly refers not only to the traditional polygons used to form a model of an object, but also to any other structures that commonly are used to form a model of an object, including, merely by way of example, NURBS surfaces, subdivision surfaces, level sets, volumetric representations, and point sets, among others.)
  • In this way, the animation client computer can store some of the files necessary to render the character, and can in fact render the character if provided the proper joint angles. This is beneficial, in many situations, because it relieves the animation server of the relatively processor-intensive task of rendering the animation. This arrangement, however, also allows the server to perform the joint calculations, which, while generally not as processor-intensive as the rendering process, often impose relatively high file input/output ("I/O") requirements, due to the extensive size of the databases used to hold data for performing the calculation of joint angles.
  • This exemplary system, then, provides a distribution of work that takes advantage of the strength of the animation client (that is, the ability to provide a plurality of animation client computers for performing the processor-intensive rendering tasks for various animation projects), while also taking advantage of the strength of typical server computers (that is, the ability to accommodate relatively high file I/O requirements). By contrast, in systems where a central server provides rendering services, extremely powerful (and therefore expensive) servers (and in many cases, server farms) are required to provide the rendering services. Ironically, such systems often also feature relatively powerful workstations as animation clients, but the processing power of the workstations is not harnessed for the rendering.
  • This exemplary system provides additional advantages, especially when compared with systems on which the animation (i.e., joint rotation calculation) and rendering processes occur on the animation client. Merely by way of example, the exemplary system described above facilitates the maintenance of data. For instance, since the joint rotations for a particular animation are calculated at the animation server, they can easily be stored there as well, and a variety of version-tracking and change-management protocols may be employed. By contrast, when individual clients (as opposed to an animation server) calculate joint rotations (and/or other position data), such data must either be stored at the several client machines or uploaded to the server after calculation, and management of that data therefore becomes much more burdensome.
  • Moreover, because, in the exemplary system described above, the physics engine that calculates the joint rotations remains on the server, the system can be configured to prevent the animation client from accessing sufficient data to independently perform the animation process, preventing unauthorized copying of animations and thereby providing greater security for that intellectual property.
  • Because various embodiments of the invention can be used to create animated works, it is helpful to provide a brief overview of the animation process. Referring first to FIG. 4A, a model 10 in the form of a human figure is disclosed. Model 10 includes face 11, neck 12, and arms 14 with elbow 15 and wrist 16 leading to hand 17. The model further includes hip 18, knees 19 and ankles 20. In a virtual character, the "model" (or "virtual model") is a geometric description of the shape of the character in one specific pose (commonly called the "model pose," "neutral pose," or "reference pose"). The neutral pose used in the model is commonly a variation on the so-called "da Vinci pose," in which the model is shown standing with eyes and head looking forward, arms outstretched, and legs straight with feet approximately shoulder width apart.
  • The model can be duplicated in any number of ways. In one common prior art process, a clay model or human model is scanned or digitized, recording the spatial coordinates of numerous points on the surface of the physical model so that a virtual representation of the model may be reconstructed from the data. It is to be understood that such models can be the product of great effort, taking man-years to construct.
  • The model also includes connectivity data (also called an "edge list"). This data is recorded at the time of scanning or inferred from the locations of the points, so that the collection of points can be treated as the vertices of a polygonal approximation of the surface shape of the original physical model. It is common, but not required, in the prior art for various mathematical smoothing and interpolation algorithms to be performed on the virtual model, so as to provide for a smoother surface representation than is achieved with a pure polygonal representation. One skilled in the art will appreciate that virtual models commonly include collections of spatial coordinates ranging from hundreds of points to hundreds of thousands or more points.
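  • By way of a toy illustration only, a virtual model's data might be held as vertex coordinates plus connectivity; the tetrahedron below is an assumption standing in for models that, as noted, can run to hundreds of thousands of points:

    # Four spatial points (vertices) and the triangles connecting them
    # (the connectivity data, here expressed as triangle index lists).
    vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
                (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
    triangles = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
    print(len(vertices), "vertices,", len(triangles), "polygons")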
  • Referring to FIG. 4B, a rig 30 is illustrated which is compatible with model 10 shown in FIG. 4A. Rig 30 includes head 31, neck 32, eyes 33, shoulders 34, elbows 35 and wrist 36. Further, hips 38, knees 39 and ankles 40 are also disclosed. Simply stated, rig 30 is mathematically disposed on model 10 so that animation can move the rig 30 at neck 32, shoulders 34, elbows 35 and wrist 36. Further, movement of hips 38, knees 39, and ankles 40 can also occur through manipulation of the rig 30.
  • The rig 30 enables the model 10 to move with realistic changes of shape. The rig 30 thus turns the model 10 into a virtual character commonly required to move and bend, such as at the knees or elbows, in order to convey a virtual performance. The software and data used to deform (or transform) the "neutral pose" model data into a specific "active pose" variation of the model is commonly called a "rig" or "IK rig" (where "IK" is a shortened form of "Inverse Kinematics").
  • "Inverse Kinematics" (as in "IK Rig") is a body of mathematics that enables the computation of joint angles (or joint rotations) from joint locations and skeletal relationships. "Forward Kinematics" is the term of art for computing joint locations based on a collection of joint angles and skeletal relationships.
  • To a programmer skilled in the art, a rig is a piece of software which has as its inputs a collection of joint rotations, joint angles and/or joint locations ("the right elbow is bent 30 degrees" or "the tip of the left index finger is positioned 2 cm above the center of the light switch"), the skeletal relationships between the joints ("the head bone is connected to the neck bone") and a neutral pose representation of the virtual model, and has as its output a collection of spatial coordinates and connectivity data describing the shape that the virtual actor's body takes when posed as described by the input data.
  • To an artist skilled in the prior art, a rig is a visual representation of the skeleton of the virtual actor, with graphical or other controls which allow the artist to manipulate the virtual actor. In the case of an "Inverse Kinematics Rig," the artist might place a mouse on the left index finger of the virtual actor and drag the left index finger across the screen so as to cause the virtual actor's arm to extend in a pointing motion. In the case of a "Forward Kinematics Rig," the artist might click on the elbow of the virtual character and bend or straighten the rotation of the elbow joint by dragging the mouse across the screen or by typing a numeric angle on the keyboard.
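  • A worked example may help: the following sketch computes joint locations from joint angles for a two-joint planar arm, i.e., Forward Kinematics as defined above; the link lengths are arbitrary illustrative values:

    import math

    def forward_kinematics(shoulder_angle, elbow_angle,
                           upper_len=0.30, fore_len=0.25):
        # The elbow location follows from the shoulder rotation alone.
        ex = upper_len * math.cos(shoulder_angle)
        ey = upper_len * math.sin(shoulder_angle)
        # The forearm's world angle is the sum of both joint rotations
        # (the skeletal relationship: the forearm attaches at the elbow).
        wx = ex + fore_len * math.cos(shoulder_angle + elbow_angle)
        wy = ey + fore_len * math.sin(shoulder_angle + elbow_angle)
        return (ex, ey), (wx, wy)

    elbow, wrist = forward_kinematics(math.radians(30), math.radians(45))
    print("elbow at", elbow, "wrist at", wrist)

An Inverse Kinematics solver would run in the opposite direction, recovering the shoulder and elbow angles from a desired wrist location.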
  • Referring to FIG. 4C, texture 50 is illustrated, here in the form of only a man's suit having a coat 51 and pants 52. In actual fact, texture 50 would include many other surfaces. For example, a human face, socks, shoes, and hands (possibly with gloves) would all be part of the illustrated texture 50. It is common for a virtual actor to be drawn (or "rendered") to the screen with, for example, blue eyes and a red jacket. The virtual model described earlier contains purely spatial data. Additional data and/or software, commonly called "Textures," "Maps," "Shaders," or "Shading," is employed to control the colors used to render the various parts of the model.
  • In the earliest forms of the prior art, the color (or "texture") information was encoded directly into the model, in the form of a "Vertex Color." A Vertex Color is commonly an RGB triplet (Red 0-255, Green 0-255, Blue 0-255) assigned to a specific vertex in the virtual model. By assigning different RGB triplets to different vertices, the model may be colored in such a way as to convey blue eyes and a red dress.
  • While vertex colors are still used in the creation of virtual characters, in the current state of the art it is more common to make use of "texture coordinates" and "texture maps." "Texture coordinates" are additional data that is recorded with the vertex location data and connectivity data in order to allow an image (or "texture") to be "mapped" onto the surface.
  • In order to provide realistic coloring to the eyes of the virtual character (perhaps including veins in the whites of the eyes and/or color striations in the iris), a digital image of an eye will be acquired (possibly via a digital camera or possibly by an artist who paints such a picture using computer software). The digital image of the eye is a "texture map" (or "texture"). The vertices in the model that comprise the surface of the eye will be tagged with additional data ("texture coordinates") that is analogous to latitude and longitude coordinates on a globe. The texture is "mapped" onto the surface by use of the texture coordinates.
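  • As a toy illustration of this mapping (not the disclosed implementation), a (u, v) texture coordinate in [0, 1] can be converted to a pixel lookup in a texture image; the 4x4 "image" below is an assumed stand-in for a real texture map:

    def sample_texture(texture, u, v):
        # Convert normalized texture coordinates to pixel indices,
        # clamping at the edges of the image.
        h, w = len(texture), len(texture[0])
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        return texture[y][x]

    eye_texture = [[(255, 255, 255)] * 4 for _ in range(4)]  # white eye
    eye_texture[2][1] = (70, 110, 180)  # a blue patch for the iris
    print(sample_texture(eye_texture, u=0.3, v=0.6))  # -> (70, 110, 180)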
  • Referring back to FIG. 4B, when the virtual character is instructed to look to the left (for example, by rotating the neck controls 32 or eye controls 33 in the rig), the virtual model is deformed in a manner which rotates all of the vertices making up the head 11 to the left. The head texture is then rendered in the desired location on the screen based upon the vertex locations and the texture coordinates of those vertices.
  • Referring to FIG. 5, an animated scene comprises some or all of the following: a virtual set (set 60 being shown), one or more props (floor 69, wall 70, chair 61 and table 62 being shown), and one or more virtual characters (model 10 manipulated by rig 30 having texture 50 being shown [here designated only by the numeral 10]).
  • Each of the virtual sets, props, and characters has a fiducial reference point with respect to which the location of the element may be specified. Here the fiducial reference point is shown at 65. The virtual set 60, props (see chair 61 and table 62), and character 10 are assembled together by specifying their spatial locations using a shared coordinate system from fiducial 65. The choice of coordinate system is arbitrary, but a common practice is to locate the virtual set at the origin of the coordinate system.
  • Often, the background (or ā€œvirtual setā€) is essentially a virtual character with either no rig (in the case of a purely static virtual set) or what is commonly a very simple rig (where the joint angles might control the opening angles of a door 67 or the joint locations might control the opening height of a window 68). It is common to embellish the scene used in an action with a variety of props. As with the background, props are again essentially virtual characters which are used to represent inanimate objects.
  • For the purposes of creating a motion picture sequence, animation data is associated with the elements of a scene to create an action. A sequence of images may be constructed by providing the rig of character 10 with a sequence of input data (ā€œanimation dataā€) such as 24 sets of joint angles per second, so as to produce a 24 frame per second movie. In the prior art, the animation data provided to the rigs are commonly compressed through the use of various interpolation techniques.
  • For example, it is common in the prior art to compress the animation data into "key frames." A key frame is typically associated with a specific point in the timeline of the sequence of images ("t=2.4 seconds") and specifies joint angles or joint locations for some or all of the joints in the rig. Any joints (or more generally input parameters) whose values are not specified in this key frame interpolate their values at t=2.4 seconds from other preceding and following key frames that do specify values for those joints or input parameters. The various families of mathematical formulas used to interpolate between key frame values (such as "Bezier curves" and "b-Splines") are well known to practitioners of the art.
  • Several methods are commonly used by artists to specify the input data provided to the rig. Merely by way of example, in the "animation" method, the artist indicates a specific point in the timeline ("t=2.4 seconds"), adjusts one or more joint angles or locations (for example using the keyboard or by manipulating on-screen controls using a mouse), and "sets a key frame" on those joint angles or locations. The artist then moves to a different point on the timeline ("t=3.0 seconds") and again adjusts joint angles or locations before "setting a key frame."
  • Once the artist has set two or more key frames, the artist can move an indicator on the timeline or press a "play" button to watch the animation that she has created. By repeatedly adding, modifying, or moving key frames while repeatedly watching the playback of the animation, the artist can create the desired performance.
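  • A minimal sketch of key-frame interpolation for a single joint follows; linear interpolation is used here purely for simplicity, whereas, as noted above, production systems commonly use Bezier curves or b-Splines:

    def interpolate(key_frames, t):
        """key_frames: time-sorted (time, angle) pairs; t is assumed to
        lie at or after the first key."""
        t0, a0 = key_frames[0]
        for t1, a1 in key_frames[1:]:
            if t <= t1:
                # Blend linearly between the surrounding key frames.
                f = (t - t0) / (t1 - t0)
                return a0 + f * (a1 - a0)
            t0, a0 = t1, a1
        return a0  # past the last key frame: hold its value

    keys = [(2.4, 30.0), (3.0, 75.0)]
    print(interpolate(keys, 2.7))  # midway between the keys -> 52.5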
  • In the "motion capture" method, the artist performs the motion in some manner while the computer records the motion. Common input devices for use with motion capture include a full-body suit equipped with sensing devices to record the physical joint angles of a human actor and so-called "Waldo" devices, which allow a skilled puppeteer to control a large number of switches and knobs with their hands (Waldo devices are most commonly used for recording facial animations). It is common to perform multiple captures of the same motion, during which sequence of captures the actor repeatedly reenacts the same motions until data is collected which is satisfactory both artistically and technically.
  • In the "procedural" method, custom software is developed which generates animation data from high-level input data. Procedural animation is commonly used when animating non-human actors such as flocks of birds or falling rocks. In FIG. 5, bird 63 illustrates this technique.
  • In the "hybrid" method, the motion capture and/or procedural method is used to specify the initial data. For example, the initial movement of bird 63 would be generated using the procedural method. The data obtained via the motion capture or procedural method is then compressed in a manner that makes it technically similar to (or compatible with) data obtained via the animation method. For example, presuming that bird 63 was going to interact with character 10 in scene 60, modification of the procedural image of bird 63 would occur. Once the initial data has been compressed, it is then manipulated, re-timed, and/or extended through the use of the animation method.
  • The animation software often plays back previously specified animations by interpolating animation data at a specific point in time, providing the interpolated animation data to the rigs, making use of the rigs to deform the models, applying textures to the models, and presenting a rendered image on the display. The animation software then advances to a different point in the timeline and repeats the process.
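  • The playback loop described above can be summarized as follows; every object and method name here is a placeholder for the corresponding stage in the text, not an actual API:

    # Illustrative playback loop: interpolate, deform via the rigs,
    # texture and render, display, then advance along the timeline.
    def play(animation, rigs, models, renderer, display, start, stop, fps=24):
        t = start
        while t <= stop:
            data = animation.interpolate(t)       # animation data at time t
            posed = [rig.deform(model, data)      # rigs deform the models
                     for rig, model in zip(rigs, models)]
            frame = renderer.render(posed)        # apply textures and render
            display.show(frame)
            t += 1.0 / fps                        # next point in the timeline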
  • Based on this general description of the animation process, we turn now to FIG. 2, which illustrates a client/server animation system in accordance with a set of embodiments. The system comprises a plurality of animation client computers 200 in communication (e.g., via a network) with server computer 210 as shown in FIG. 2. (The network can be any suitable network, including without limitation a local area network, wide area network, wired and/or wireless network, the Internet, an intranet or extranet, etc. Those skilled in the art will appreciate that any of a variety of connection facilities, including without limitation such networks, can provide communication between the server 210 and the clients 200.) In accordance with some embodiments, models 10, rigs 30 and/or textures 50 (and/or portions thereof) may be stored at the client 200, e.g., at model, rig, and texture storage 201. In this particular embodiment, such local storage offers several advantages, described below.
  • Presuming that a major studio either subscribes to, or alternately maintains, models 10, rigs 30, and textures 50, great expense and effort can go into developing these discrete components of a character. Further, because of the effort and expense required, the owner or operator of the client 200 may choose not to share the contents of model, rig, and texture storage 201 with anyone, including the provider of animation server 210.
  • The animation client computer may, in some embodiments, include rendering software 203 operatively connected to model, rig, and texture storage 201. The rendering software may be part of an animation client application. Furthermore, a controller 202 (here shown as a keyboard and mouse) provides input that is communicated through network connection 205. It should be noted that any suitable controller, including those described in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference, can be used in accordance with various embodiments.
  • The animation server computer 210 includes animation data storage 211, animation software 212, and/or version tracking software 214. Presuming that the artist or programmer has created the models 10, rigs 30, and textures 50, manipulation of an Action on a scene 60 can either occur from the beginning (de novo) or, alternately, the artist and/or programmer may check out a previous version of the Action through the network connection 205 by accessing animation server 210 and retrieving the desired data from animation data storage 211 into animation software 212.
  • In either event, utilizing the animation techniques described above, input data (e.g., from the controller 202 and/or an actuator thereof) is received by the client 200. In some cases, the input data may be described by an auxiliary coordinate system, in which case the input data may be processed as described in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference. Other processing may be provided as well, as necessary to format the raw input data received from the controller 202.
  • The animation client 200 in turn transmits the input data (either as raw input data and/or after processing by the animation client computer 200), e.g., via network connection 205, to the animation server computer 210 and, more particularly, to animation software 212 (which might be incorporated in an animation server application). Processing of the selected Action will occur at animation software 212 within animation server 210. Such processing will utilize the techniques described above. In particular, a set of joint rotations may be calculated, based on the input data. The joint rotations will describe the position and/or motion desired in the Action.
  • Playback will occur by having animation software 212 emit return animation information through network connection 205 and then to rendering software 203. Rendering software 203 will access model, rig, and texture storage 201 to display at display 204 the end result of modifications introduced by the artist and/or programmer at the client 200.
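  • The round trip of FIG. 2 can be sketched as a pair of routines. This is a hedged outline only; the connection, controller, and storage objects are hypothetical stand-ins for elements 205, 202, and 201:

    # Client side: send input, receive joint rotations, render locally
    # from the client-resident model/rig/texture storage 201.
    def client_edit_cycle(connection, controller, local_storage, display):
        input_data = controller.read()                 # raw or pre-processed input
        connection.send({"action": "edit", "input": input_data})
        rotations = connection.receive()["joint_rotations"]
        frame = local_storage.render(rotations)        # models/rigs/textures stay local
        display.show(frame)

    # Server side: compute and store joint rotations; only animation
    # data (not models, rigs, or textures) ever resides here.
    def server_edit_cycle(connection, animation_software, animation_storage):
        message = connection.receive()
        rotations = animation_software.calculate_rotations(message["input"])
        animation_storage.save(rotations)
        connection.send({"joint_rotations": rotations})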
  • Thus, it will be seen that when an Action is modified at a client 200, if the models, rigs, and/or textures are only available at client 200, then replay can only occur at client 200. This replay is not possible with the information possessed by server 210, because the model, rig, and/or texture storage 201 is not resident in or available to animation server 210. (It should be noted, however, that in various embodiments, some or all portions of the models, rigs, and/or textures may be stored at the animation server in addition to, or instead of, at the animation client.) Alternatively, if the models, rigs, and textures are available at both client 200 and client 200A, then replay can occur at both client 200 and client 200A, allowing for cooperative work activities between two users.
  • It should be understood that animation server 210 (and/or another server in communication therewith) can provide a number of services. For example, the server 210 can provide access control; for instance, client 200 may be required to log in to server 210. Furthermore, a subscription may be required as a prerequisite for access to server 210. In some cases, server 210 can deliver different capabilities to different users. By way of example, client 200A can be restricted to modification of character motion while client 200 modifies the animation of bird 63.
  • Generally, the client 200 controls the time when playback starts and stops for any individual Action. Moreover, the client 200 may arbitrarily change the portion of the Action being worked on simply by referring to that Action at a specified point in time.
  • It will further be realized that, with the rendering software 203 and the model, rig, and/or texture data in storage 201 resident at the client, the data transmitted over the network through network connection 205 is kept to a minimum. Specifically, only a small amount of data need be transmitted: that which is needed to play the animation (e.g., a set of joint rotations, etc.). As the rendering software 203 and some or all of the model, rig, and/or texture storage 201 may be resident at client 200, only small batches of data need be transmitted over the Internet.
  • It will be understood that the server 210 is useful in serving multiple clients. Further, the server 210 can act as a studio, providing the artist and/or programmer at client 200 with a full range of services, including storage and delivery of updated model, texture, and rig data to client 200 and client 200A. In a set of embodiments, the server 210 will store all animation data. Furthermore, through version tracking software 214, animation data storage 211 will provide animation data (such as joint rotations, etc.) to the respective client 200A on an as-needed basis.
  • Referring now to FIG. 3, a system in accordance with another set of embodiments is illustrated. Specifically, client 300 includes a network connection 305, a controller 302, and a display 304. Server 310 includes animation data storage 211, version tracking software 214, and animation software 212. Additionally, server 310 includes rendering software 303.
  • In this case, the manipulation of the animation software from controller 302 through network connection 305 of the client 300 is identical to that shown in FIG. 2. As well, the animation software for calculating joint rotations, etc. is resident on the server 310. In the embodiments illustrated by FIG. 3, however, the rendering component also resides on the server 310. Specifically, rendering software 303 will generate actual images (e.g., bitmaps, etc.), which images will be sent through the network to network connection 305 and may be displayed thereafter at display 304.
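  • Under the FIG. 3 arrangement, the exchange differs only in what the server returns; a brief hypothetical sketch (all names illustrative):

    # FIG. 3 variant: the server renders, so a thin client merely
    # forwards input and displays the returned bitmap.
    def thin_client_cycle(connection, controller, display):
        connection.send({"input": controller.read()})
        bitmap = connection.receive()["frame"]   # image rendered server-side
        display.show(bitmap)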
  • FIG. 6 provides a generalized schematic diagram of a client/server system in accordance with some embodiments of the invention. The system 600 includes an animation server computer 605, which may be a PC server, minicomputer, mainframe, etc. running any of a variety of available operating systems, including UNIX™ (and/or any of its derivatives, such as Linux, BSD, etc.), various varieties of Microsoft Windows™ (e.g., NT™, XP™, 2003, Vista™, Mobile™, CE™, etc.), Apple's Macintosh OS™ and/or any other appropriate server operating system. The animation server computer also includes animation server software 610, which provides animation services in accordance with embodiments of the invention. The animation server computer 605 may also comprise (and/or have associated therewith) one or more storage media 615, which can include storage for the animation server software 610, as well as a variety of associated databases (such as a database of animation data 615a, a data store 615b for model data, such as the polygons and textures that describe an animated character, a data store 615c for scene data, and any other appropriate data stores).
  • The system 600 further comprises one or more animation client computers 620, one or more of which may include local storage (not shown), as well as animation client software 625. (In some cases, such as a case in which the animation client computer 620 is designed only to provide input and display a rendered image, the rendering subsystem may reside on the animation server 605, as described with respect to FIG. 3, for example. In this way, thin clients, such as wireless phones, PDAs, etc. may be used to provide input even if they have insufficient processing power to render the objects.)
  • The animation client computer 620 thus may be, inter alia, a PC, workstation, laptop, tablet computer, PDA, wireless phone, etc. running any appropriate operating system (such as Apple's Macintosh OS™, UNIX and/or its derivatives, Microsoft Windows™, etc.). Each animation client 620 may also include one or more display devices 630 (such as monitors, LCD panels, projectors, etc.) and/or one or more input devices 635 (such as the controllers described above and in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference, as well as, to name but a few examples, a telephone keypad, a stylus, etc.).
  • In accordance with a set of embodiments, the system 600 may operate in the following exemplary manner, which is described by additional reference to FIG. 7, which illustrates a method 700 of creating an animated work in accordance with some embodiments of the invention. (It should be noted that, while the method 700 of FIG. 7 is described in conjunction with the system 600 of FIG. 6, that description is provided for exemplary purposes only, and the methods of the invention are not limited to any particular hardware or software implementation. Likewise, the operation of the system 600 of FIG. 6 is not limited to the described methods.)
  • The animation client software 625 comprises instructions executable by the animation client computer 620 to accept a set of input data from one or more input devices (block 705). The input data may, for example, indicate a desired position of an object in a scene (which may be a virtual scene, a physical set, etc.). In particular embodiments, the object may be an animated object, which may comprise a plurality of polygons and/or textures, as described above. The animation client software optionally may process the input data, for example as described above. The animation client software then transmits the set of input data for reception by the animation server computer (block 710).
  • The animation server computer 605 (and, more particularly in some cases, the animation server software 610) receives the input data (block 715). The animation server software 610 calculates a set of position data (block 720), based on the received input data. In some cases, calculating the set of position data can include processing the input data to determine a desired position of an animated object and/or calculating a set of joint rotations defining that desired position (and/or defining the deformation of a rig defining the character, in order to place the character in the desired position). In other cases, merely by way of example, if the object is a light or a camera on a physical set, there may be no need to calculate any animation data; the position can be determined based solely on the input data, perhaps in conjunction with a current position of the object.
  • In yet other cases, the object may be an animated character (or other object in a virtual scene), and the position of the object in the scene may be affected by the position of a virtual camera and/or light source. In these cases, the position data might comprise data about the position and/or orientation of the virtual camera/light.
  • The animation server computer 605 (perhaps based on instructions from the server software 610) then transmits the set of position data (e.g., joint rotations, etc.) for reception by the animation client 620 (block 725). When the animation client computer receives the set of position data (block 730), the animation client software 625 is responsible for placing the object in the desired position (block 735). This procedure necessarily will vary according to the nature of the object. Merely by way of example, if the object is an animated character, placing the object in the desired position generally will comprise rendering the animated character in the desired position, for example by calculating a set of positions for the polygons that describe the character and/or by applying any necessary textures to the model. If the object is a physical object, such as a light, placing the object in the desired position may require interfacing with a movement system (which is not illustrated in FIG. 6, but examples of which are described in detail in U.S. patent application Ser. No. ______ (attorney docket number 020071-000210), already incorporated by reference).
  • In some cases, the object (for instance, if the object is a virtual object) may be displayed on a display device 630 (block 740). In particular, the object may be displayed in the desired position. In other cases, the client 620 may be configured to upload the rendered object to the animation server 605 for storage and/or distribution to other computers (which might be, inter alia, other animation servers and/or clients).
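  • Blocks 705 through 740 can be strung together in a single end-to-end outline. As before, the helper names below are invented for illustration and are not part of this disclosure:

    # Illustrative walk-through of method 700 (blocks 705-740).
    def method_700(client, server):
        input_data = client.accept_input()              # block 705
        packet = client.transmit(input_data)            # block 710
        received = server.receive(packet)               # block 715
        position = server.calculate_position(received)  # block 720 (e.g., joint rotations)
        reply = server.transmit(position)               # block 725
        position = client.receive(reply)                # block 730
        rendered = client.place_object(position)        # block 735 (render, or move a device)
        client.display(rendered)                        # block 740 (optional)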
  • The system 600 may provide a number of other features, some of which are described above. In some cases, the animation server 605 can provide animation services to a plurality of animation client computers (e.g., 620a, 620b). In an exemplary embodiment, input may be received at a first client 620a, and the position data may be transmitted to a second client 620b for rendering and/or display. (Optionally, the plurality of client computers 620 may perform rendering tasks in parallel for a given scene.) In another embodiment, each client 620a, 620b accepts input and receives position data, such that two artists may collaborate on a given character and/or scene, each being able to view changes made by the other. In yet another embodiment, each client 620a, 620b may interact individually with the server 605, with each client 620 providing its own input and receiving position data based on that input. (That is, the position data received by one client has no impact on the rendering of an object on another client.)
  • As noted above, in some cases, the animation server software 610 may be configured not only to calculate the position data, but also to render the object (which can include, merely by way of example, not only applying one or more textures to a model of the object, but also calculating the positions of the polygons that make up the model, based on the position data). Hence, in such cases, the rendered object may be provided to an animation client computer 620 (which may or may not be the same client computer that provided the input on which the position data is based), which then can display the object in the desired position. In some cases, the animation server 605 might render a first object for a first client 620 and might merely provide to a second client a set of position data describing a desired position of a second object.
  • In some embodiments, one or more of the data stores (e.g., data store 615c) may be used to store object definition files, which can include some or all of the information necessary for rendering a given object, such as the model, rig, polygons, textures, etc. describing that object. An animation client 620 then can be configured to download from the server 605 the object definition files (and/or a subset thereof) to perform the rendering of the object in accordance with embodiments of the invention. It should be noted, however, that for security, the downloaded object definition files (and/or portions thereof) may be insufficient to allow a user of the client 620 to independently recreate the object without additional data resident on the server.
  • The system 600 may be configured such that a user of the client 620 is not allowed to modify these object definition files locally at the client 620 and/or, if local modification is allowed, the client 620 may not be allowed to upload modified object definition files. In this way, the system 600 can prevent the unauthorized modification of a "master copy" of the object definition files. Alternatively, the server software 610 may be configured to allow modified object definition files to be uploaded (and thus to receive such files), perhaps based on an identification of the user of the animation client computer; that is, the server software 610 may be configured to identify the user and determine whether the user has sufficient privileges to upload modified files. (It should be noted that the identification, authentication and/or authorization of users may be performed either by the animation server 605 and/or by another server, which might communicate such identification, authorization and/or authentication data to the animation server 605.)
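  • One plausible shape for the upload gate just described (the privilege model and helper names are invented for the example):

    # Hypothetical upload gate: accept modified object definition files
    # only from users with sufficient privileges; identification may be
    # delegated to another server, as noted above.
    def handle_upload(user, files, auth_service, data_store):
        identity = auth_service.identify(user)
        if not auth_service.may_upload(identity):
            raise PermissionError("user lacks upload privileges")
        data_store.save_new_version(identity, files)   # master copy stays controlled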
  • Similarly, in other embodiments, the animation server software 610 may be configured to determine whether to allow an animation client 620 to interact with the server software 610. Merely by way of example, the animation server software 610 may control access to rendered objects, object definition files, the position data, and/or the software components used to create any of these, based on any number of factors. For instance, the server software 610 (and/or another component) may be configured to identify, authenticate and/or authorize a user of the animation client 620. Based on an identity of the user of a client computer 620 (as well, in some cases, as the authentication status of the user and/or the user's authorization), the animation server software 610 may determine whether it will receive input from the client computer 620, whether it will provide position data to the animation client computer 620 and/or whether it will allow the animation client computer 620 to access files and/or animation services on the animation server 605.
  • Alternatively and/or in addition, the animation server 605 may be configured to provide for-fee services. Hence, the animation server software (and/or another component) may be configured to evaluate a set of payment and/or billing information (which may be, but is not necessarily, associated with an identity of a user of the animation client computer 620), and based on the set of payment and/or billing information, determine whether to allow the client 620 to interact with the server software 610 (including, as mentioned above, whether it will accept/provide data and/or allow access to files and/or services). The set of billing and/or payment data can include, without limitation, information about whether a user has a subscription for animation services and/or files, whether the user has paid a per-use fee, whether the user's account is current, and/or any other relevant information.
  • In some cases, various levels of interaction with the server software 610 may be allowed. Merely by way of example, if the animation server computer 605 stores a plurality of sets of rendered objects and/or object definition files (wherein, for example, each set of files comprises information describing a different animated character), the animation server 605 may allow an unregistered user to download files for a few "free" characters, while paid subscribers have access to files for an entire library of characters (it should be appreciated that there may be various levels of subscription, with access to files for a correspondingly varying number of characters). Similarly, a user may be allowed to pay a per-character fee for a particular character, upon which the user is allowed to download the set of files for that character. (Such commerce functionality may be provided by a separate server, third-party service, etc.) In some cases, if a user has a subscription (and/or pays a per-use fee), the user (and/or an animation client computer operated by the user) may be given access to services and/or data on the animation server. Merely by way of example, if the user pays a per-use fee to obtain object definition files for a given animated character, that user may use animation services (including without limitation those described above) for that animated character. As another example, a user may have a monthly subscription to use files for a set of animated characters, and the user may use the animation server as part of the monthly subscription. Other for-fee uses are possible as well. A user may pay, for example, a per-use and/or subscription fee for access to the services of an animation server, apart from any fees that might be paid for the use of object definition files.
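  • Purely for illustration, the tiered access described above might reduce to an entitlement check like the following (the tiers and catalog are invented):

    # Illustrative entitlement check for character downloads.
    FREE_CHARACTERS = {"demo_robot", "demo_bird"}

    def may_download(character, account):
        if character in FREE_CHARACTERS:
            return True                            # "free" characters for anyone
        if account.get("subscription") == "full":
            return True                            # subscribers get the whole library
        return character in account.get("purchased", set())  # per-character fee

    print(may_download("demo_bird", {}))                  # True
    print(may_download("hero", {"purchased": {"hero"}}))  # True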
  • The animation server software 610 (and/or another software component) may also be configured to perform change tracking and/or version management of object definition files (and/or rendered objects, position data, etc.). In some embodiments, any of several known methods of change tracking and/or version management may be used for this purpose. In particular, the change tracking/version management functions may be configured to allow various levels of access to files based on an identity of a user and/or a project that the identified user is working on. Merely by way of example, an artist in a group working on a particular character, scene, film, etc. may be authorized to access (as well, perhaps, as download) files related to that character, scene, or film, while a manager or senior artist might be authorized to modify such files. An artist working on another project might not have access to any such files.
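  • A toy sketch of version management with per-user access checks follows; the policy callbacks are placeholders, and any of the known change-tracking schemes mentioned above could stand behind them:

    # Toy version store: each commit appends an immutable revision;
    # read and write access are controlled by caller-supplied policies.
    class VersionStore:
        def __init__(self):
            self.revisions = []                     # (revision number, user, files)

        def commit(self, user, files, can_modify):
            if not can_modify(user):                # e.g., manager or senior artist
                raise PermissionError("read-only access")
            self.revisions.append((len(self.revisions), user, files))

        def checkout(self, user, revision, can_read):
            if not can_read(user):                  # e.g., member of the project
                raise PermissionError("no access to this project's files")
            return self.revisions[revision]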
  • The animation server software 610 may also be configured to distribute (e.g., to other clients and/or servers) a set of modified object definition files, such that each user has access to the most recent version of these files. As described above, access to a distribution of these modified files may be controlled based on an identity of the user, various payment or billing information, etc.
  • Embodiments of the invention can be configured to protect stored and/or transmitted data, including without limitation object definition files, rendered objects, input data, position data, and the like. Such data can be protected in a variety of ways. As but one example, data may be protected with access control mechanisms, such as those described above. In addition, other protection measures may be implemented as well. Merely by way of example, such data may be encrypted prior to being stored at an animation server and/or prior to being transmitted between an animation server and an animation client, to prevent unauthorized access to such data. As another example, data may be digitally signed and/or certified before storage and/or before transmission between computers. Such signatures and/or certifications can be used, inter alia, to verify the identity of an entity that created and/or modified such data, which can also facilitate change tracking and/or version management of various data used by embodiments of the invention.
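  • As one concrete possibility (illustrative only; production systems would typically rely on TLS and a real public-key infrastructure), transmitted animation data could be signed so the receiver can verify that it was not altered in transit:

    # Illustrative integrity protection using Python's standard library;
    # the shared key here is a deliberate simplification.
    import hashlib, hmac, json

    SECRET = b"shared-secret-for-illustration-only"

    def sign(payload):
        body = json.dumps(payload, sort_keys=True).encode()
        return {"body": body.decode(),
                "mac": hmac.new(SECRET, body, hashlib.sha256).hexdigest()}

    def verify(message):
        body = message["body"].encode()
        expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, message["mac"]):
            raise ValueError("signature check failed")
        return json.loads(body)

    msg = sign({"joint_rotations": [0.1, 0.25, -0.4]})
    print(verify(msg))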
  • While a few examples of the data management services that can be provided by various embodiments of the invention are described above, one skilled in the art should appreciate, based on the disclosure herein, that a variety of additional services may be enabled by certain features of the disclosed embodiments.
  • FIG. 8 provides a generalized schematic illustration of one embodiment of a computer system 800 that can perform the methods of the invention and/or the functions of a computer, such as the animation server and client computers described above. FIG. 8 is meant only to provide a generalized illustration of various components, any of which may be utilized as appropriate. The computer system 800 can include hardware components that can be coupled electrically via a bus 805, including one or more processors 810; one or more storage devices 815, which can include without limitation a disk drive, an optical storage device, and/or a solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable and/or the like (and which can function as a data store, as described above). Also in communication with the bus 805 can be one or more input devices 820, which can include without limitation a mouse, a keyboard and/or the like; one or more output devices 825, which can include without limitation a display device, a printer and/or the like; and a communications subsystem 830, which can include without limitation a modem, a network card (wireless or wired), an infra-red communication device, and/or the like.
  • The computer system 800 also can comprise software elements, shown as being currently located within a working memory 835, including an operating system 840 and/or other code 845, such as the application programs (including without limitation the animation server and client software) described above and/or designed to implement methods of the invention. Those skilled in the art will appreciate that substantial variations may be made in accordance with specific embodiments and/or requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both.
  • While the invention has been described with respect to exemplary embodiments, one skilled in the art will recognize that numerous modifications are possible. For example, the processes described herein may be implemented using hardware components, software components, and/or any combination thereof. Thus, although the invention has been described with respect to exemplary embodiments, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.

Claims (46)

1. A system for producing animated works, the system comprising:
an animation client computer in communication with an animation server computer, the animation client computer comprising a first processor, a display device, at least one input device, and animation client software comprising:
a) instructions executable by the first processor to accept a set of input data from the at least one input device, the set of input data indicating a desired position for an animated object, the animated object comprising a set of one or more polygons and a set of one or more textures to be applied to the set of one or more polygons; and
b) instructions executable by the first processor to transmit the set of input data for reception by an animation server computer; and
an animation server computer comprising a second processor and animation server software comprising:
a) instructions executable by the second processor to receive the set of input data from the animation client computer;
b) instructions executable by the second processor to process the set of input data to determine the desired position of the animated object;
c) instructions executable by the second processor to calculate a set of joint rotations defining the desired position of the animated object; and
d) instructions executable by the second processor to transmit the set of joint rotations for reception by the animation client computer;
wherein the animation client software further comprises:
a) instructions executable by the first processor to receive the set of joint rotations defining the position of the animated object;
b) instructions executable by the first processor to calculate, based on the set of joint rotations, a set of positions for the set of one or more polygons; and
c) instructions executable by the first processor to apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position.
2. A system as recited by claim 1, wherein the animation client software comprises further instructions executable by the first processor to:
display on the display device the rendered animated object.
3. A system as recited by claim 1, wherein the animation server software comprises further instructions executable by the second processor to:
store at a data store associated with the animation server computer the set of joint rotations.
4. A system as recited by claim 1, wherein:
the animation client computer is a plurality of animation client computers comprising a first animation client computer and a second animation client computer;
the first animation client computer comprises the at least one input device;
the second animation client computer comprises the display device;
the animation server computer receives the set of input data from the first animation client computer;
the animation server computer transmits the set of joint rotations for reception by the second animation client computer, and
the second animation client computer is configured to:
a) receive the set of joint rotations;
b) based on the set of joint rotations, calculate a set of positions for the set of one or more polygons;
c) apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position; and
d) display on the display device the rendered animated object.
5. A system for producing animated works, the system comprising:
an animation client computer, the animation client computer comprising a first processor, a display device, at least one input device, and animation client software comprising:
a) instructions executable by the first processor to accept a set of input data from the at least one input device, the set of input data indicating a desired position for an object; and
b) instructions executable by the first processor to transmit the set of input data for reception by an animation server computer; and
an animation server computer in communication with the animation client computer, the animation server computer comprising a second processor and animation server software comprising:
a) instructions executable by the second processor to receive the set of input data from the animation client computer;
b) instructions executable by the second processor to transmit for reception by the animation client computer a set of position data, based on the set of input data received from the animation client computer;
wherein the animation client software further comprises:
a) instructions executable by the first processor to receive the set of position data from the animation server computer; and
b) instructions executable by the first processor to place the object in the desired position, based at least in part on the set of position data.
6. A system as recited by claim 5, wherein the object is a virtual object.
7. A system as recited by claim 5, wherein the object is a physical object.
8. A system as recited by claim 5, wherein the object is an animated character.
9. A system as recited by claim 8, wherein the animated character comprises a set of polygons and at least one texture, and wherein placing the object in the desired position comprises rendering the animated character in the desired position.
10. A system as recited by claim 5, wherein the object is a device in communication with the client computer and is selected from the group consisting of a camera and a light source.
11. A system as recited by claim 5, wherein the object is selected from a group consisting of a virtual camera and a virtual light source.
12. A system as recited by claim 5, wherein the set of position data comprises a set of joint rotations defining a position of the object.
13. A system as recited by claim 5, wherein the set of position data comprises a set of joint rotations defining a deformation of a rig describing the object.
14. A system as recited by claim 5, wherein the set of position data comprises a position and/or orientation of a real or virtual camera, and wherein the position of the object in the scene is affected by the position and/or orientation of the real or virtual camera.
15. A system as recited by claim 5, wherein the animation client software further comprises instructions executable by the first processor to process a set of raw input data received from the at least one input device to produce a set of processed input data, such that the set of processed input data is transmitted for reception by the animation server computer.
16. A system as recited by claim 5, wherein the set of input data and/or the set of position data is protected prior to transmission.
17. A system as recited by claim 5, wherein the animation server has an associated data store configured to hold a set of one or more object definition files for the animated object, the set of one or more object definition files collectively specifying a set of polygons and textures that define the object.
18. A system as recited by claim 17, wherein the animation client software further comprises instructions executable by the first processor to download from the animation server computer at least a portion of the set of one or more object definition files necessary to render the object.
19. A system as recited by claim 18, wherein the object definition files are protected prior to transmission.
20. A system as recited by claim 18, wherein the object definition files comprise one or more textures associated with the object.
21. A system as recited by claim 18, wherein the at least a portion of the set of one or more object definition files is insufficient to independently recreate the animated object without additional data resident on the animation server computer.
22. A system as recited by claim 18, wherein the animation client computer is unable to upload to the animation server computer any modifications of the at least a portion of the set of one or more object definition files.
23. A system as recited by claim 18, wherein:
the animation client software comprises further instructions executable by the first processor to modify the at least a portion of the set of one or more object definition files to produce a set of modified object definition files; and
the animation server software comprises further instructions executable by the second processor to receive the set of modified object definition files.
24. A system as recited by claim 23, wherein the animation server software comprises further instructions executable by the second processor to track changes to the set of object definition files.
25. A system as recited by claim 23, wherein the animation server software comprises further instructions executable by the second processor to:
identify a user of the animation client computer; and
based on an identity of the user of the animation client computer, determine whether to accept the set of modified object definition files.
26. A system as recited by claim 23, wherein:
the animation client computer is a first animation client computer; and
the animation server software comprises further instructions executable by the second processor to distribute the set of modified object definition files to a set of animation client computers comprising at least a second animation client computer.
27. A system as recited by claim 17, wherein the data store is configured to hold a plurality of sets of one or more object definition files for a plurality of animated objects.
28. A system as recited by claim 27, wherein the animation server software comprises further instructions executable by the second processor to determine whether to provide to the animation client computer one or more of the sets of object definition files based on (a) a set of payment or billing information, or (b) an identity of a user of the animation client computer.
29. A system as recited by claim 5, wherein the animation server software further comprises:
instructions executable by the second processor to identify a user of the animation client computer; and
instructions executable by the second processor to determine, based at least on an identification of the user, whether to allow the animation client computer to interact with the animation server software.
30. A system as recited by claim 5, wherein the animation server software comprises further instructions executable by the second processor to determine whether to allow the animation client computer to interact with the animation server software based on a set of payment or billing information.
31. A system as recited by claim 5, wherein the animation server software comprises further instructions executable by the second processor to store at a data store associated with the animation server computer the set of position data.
32. A system as recited by claim 31, wherein the animation server software further comprises:
instructions executable by the second processor to store a plurality of sets of position data, each of the sets of position data based on a separate set of input data; and
instructions executable by the second processor to track a series of changes to a position of the object, based on the plurality of sets of position data.
33. A system as recited by claim 5, wherein:
the animation client computer is a first animation client computer; and
the system comprises a second animation client computer in communication with the animation server computer, the second animation client computer comprising a third processor, a second display device, a second input device, and a second animation client software comprising:
a) instructions executable by the third processor to accept a second set of input data from the second input device, the second set of input data indicating a desired position for a second object; and
b) instructions executable by the third processor to transmit the second set of input data for reception by the animation server computer;
wherein:
the animation server software further comprises:
a) instructions executable by the second processor to receive the second set of input data from the second animation client computer; and
b) instructions executable by the second processor to transmit for reception by the second animation client computer a second set of position data, based on the second set of input data received from the second animation client computer; and
the second animation client software further comprises:
a) instructions executable by the third processor to receive the second set of position data from the animation server computer; and
b) instructions executable by the third processor to place the second object in the desired position, based at least in part on the second set of position data.
34. A system as recited by claim 33, wherein the second object and the first object are the same object.
35. A system as recited by claim 34, wherein:
the animation server software further comprises:
a) instructions executable by the second processor to transmit the second set of position data for reception by the first animation client computer; and
the animation client software on the first animation client computer further comprises:
a) instructions executable by the first processor to place the object in a position defined by the second set of position data, such that the first display device displays the object in a position desired by a user of the second animation client computer.
36. A system as recited by claim 33, wherein the second set of position data has no impact on a rendering of the first object on the first client computer, and wherein the first set of position data has no impact on a rendering of the second object on the second client computer.
37. A system as recited by claim 5, wherein the at least one input device comprises a device selected from the group consisting of a joystick, a game controller, a mouse, a keyboard, a steering wheel, an inertial control system, and an optical control system.
38. A system as recited by claim 5, wherein the at least one input device comprises a full or partial body motion capture unit.
39. A system as recited by claim 5, wherein the at least one input device comprises an optical, mechanical or electromagnetic system configured to capture the position or motion of an actor, puppet or prop.
40. An animation client computer for producing animated works, the animation client computer comprising at least one input device, a processor and animation client software comprising:
instructions executable by the processor to accept a set of input data from the at least one input device, the set of input data indicating a desired position for an animated object comprising a set of one or more polygons and a set of one or more textures to be applied to the set of one or more polygons;
instructions executable by the processor to transmit the set of input data for reception by an animation server computer; and
instructions executable by the processor to receive from the animation server computer a set of joint rotations defining the desired position of the animated object, the set of joint rotations being calculated based on the set of input data;
instructions executable by the processor to calculate, based on the set of joint rotations, a set of positions for the set of one or more polygons; and
instructions executable by the processor to apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position.
41. An animation client computer as recited in claim 40, further comprising a display device, wherein the animation client software further comprises:
instructions executable by the processor to display on the display device the rendered animated object.
42. An animation server computer for producing animated works, the animation server computer comprising a processor and animation server software comprising:
instructions executable by the processor to receive from an animation client computer a set of input data obtained from at least one input device, the set of input data indicating a desired position for an animated object comprising a set of one or more polygons and a set of one or more textures to be applied to the set of one or more polygons;
instructions executable by the processor to process the set of input data to determine a desired position of the animated object;
instructions executable by the processor to calculate a set of joint rotations defining the desired position of the animated object; and
instructions executable by the processor to transmit for reception by the animation client computer the set of joint rotations, such that the animation client computer can use the set of joint rotations to render the animated object in the desired position.
43. A system for producing animated works, the system comprising:
a first animation client computer, the first animation client computer comprising a first processor, a first display device, at least one first input device, and first animation client software comprising:
a) instructions executable by the first processor to accept a first set of input data from the at least one first input device, the first set of input data indicating a desired position for a first object; and
b) instructions executable by the first processor to transmit the first set of input data for reception by an animation server computer; and
an animation server computer in communication with the first animation client computer, the animation server computer comprising a second processor and animation server software comprising:
a) instructions executable by the second processor to receive the first set of input data from the first animation client computer;
b) instructions executable by the second processor to calculate a first set of position data, based on the first set of input data received from the first animation client computer;
c) instructions executable by the second processor to render the first object, based at least in part on the first set of position data;
wherein the first animation client software further comprises:
a) instructions executable by the first processor to display the first object in the desired position.
44. A system as recited by claim 43, wherein:
the system comprises a second animation client computer comprising a third processor, a second display device, at least one second input device, and second animation client software comprising:
a) instructions executable by the third processor to accept a second set of input data from the at least one second input device, the second set of input data indicating a desired position for a second object; and
b) instructions executable by the third processor to transmit the second set of input data for reception by the animation server computer;
the animation server software further comprises:
a) instructions executable by the second processor to receive the second set of input data from the second animation client computer; and
b) instructions executable by the second processor to transmit for reception by the second animation client computer a second set of position data, based on the second set of input data received from the second animation client computer; and
the second animation client software further comprises:
a) instructions executable by the third processor to receive the second set of position data from the animation server computer; and
b) instructions executable by the third processor to place the second object in the desired position for the second object, based at least in part on the second set of position data.
45. An animation software package embodied on at least one computer readable medium, the animation software package comprising:
an animation client component comprising:
a) instructions executable by a first computer to accept a set of input data from at least one input device at the first computer, the input data indicating a desired position for an object; and
b) instructions executable by the first computer to transmit the set of input data for reception by a second computer; and
an animation server component comprising:
a) instructions executable by the second computer to receive the set of input data from the first computer;
b) instructions executable by the second computer to transmit for reception by the first computer a set of position data, based on the set of input data received from the first computer;
wherein the animation client component further comprises:
a) instructions executable by the first computer to receive the set of position data from the second computer; and
b) instructions executable by the first computer to place the object in the desired position, based at least in part on the set of position data.
46. A method of creating an animated work, the method comprising:
accepting at an animation client computer a set of input data from at least one input device, the input data indicating a desired position for an object;
transmitting the set of input data for reception by an animation server computer;
receiving at the animation server computer the set of input data from the animation client computer;
transmitting for reception by the animation client computer a set of position data, based on the set of input data received from the animation client computer;
receiving at the client computer the set of position data from the animation server computer; and
based at least in part on the set of position data, placing the object in the desired position.
US11/262,492 2004-10-28 2005-10-28 Client/server-based animation software, systems and methods Abandoned US20060109274A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/262,492 US20060109274A1 (en) 2004-10-28 2005-10-28 Client/server-based animation software, systems and methods

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US62341504P 2004-10-28 2004-10-28
US62341404P 2004-10-28 2004-10-28
US11/262,492 US20060109274A1 (en) 2004-10-28 2005-10-28 Client/server-based animation software, systems and methods

Publications (1)

Publication Number Publication Date
US20060109274A1 true US20060109274A1 (en) 2006-05-25

Family

ID=36319712

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/262,492 Abandoned US20060109274A1 (en) 2004-10-28 2005-10-28 Client/server-based animation software, systems and methods
US11/261,441 Expired - Fee Related US7433760B2 (en) 2004-10-28 2005-10-28 Camera and animation controller, systems and methods
US12/197,397 Abandoned US20080312770A1 (en) 2004-10-28 2008-08-25 Camera and animation controller, systems and methods

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/261,441 Expired - Fee Related US7433760B2 (en) 2004-10-28 2005-10-28 Camera and animation controller, systems and methods
US12/197,397 Abandoned US20080312770A1 (en) 2004-10-28 2008-08-25 Camera and animation controller, systems and methods

Country Status (2)

Country Link
US (3) US20060109274A1 (en)
WO (2) WO2006050198A2 (en)

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070109304A1 (en) * 2005-11-17 2007-05-17 Royi Akavia System and method for producing animations based on drawings
US20080024615A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Camera control
US20080028312A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Scene organization in computer-assisted filmmaking
US7433760B2 (en) 2004-10-28 2008-10-07 Accelerated Pictures, Inc. Camera and animation controller, systems and methods
US20090002377A1 (en) * 2007-06-26 2009-01-01 Samsung Electronics Co., Ltd. Apparatus and method for synchronizing and sharing virtual character
US20090027302A1 (en) * 2007-07-25 2009-01-29 Lenovo (Beijing) Limited Method for operating object between terminals and terminal using the method
US20090046097A1 (en) * 2007-08-09 2009-02-19 Scott Barrett Franklin Method of making animated video
US20100029377A1 (en) * 2006-10-03 2010-02-04 Canterbury Stephen A Shared physics engine in a wagering game system
US20100073379A1 (en) * 2008-09-24 2010-03-25 Sadan Eray Berger Method and system for rendering real-time sprites
US20100073361A1 (en) * 2008-09-20 2010-03-25 Graham Taylor Interactive design, synthesis and delivery of 3d character motion data through the web
US20100110081A1 (en) * 2008-10-30 2010-05-06 Microsoft Corporation Software-aided creation of animated stories
US20100134490A1 (en) * 2008-11-24 2010-06-03 Mixamo, Inc. Real time generation of animation-ready 3d character models
US20100146422A1 (en) * 2008-12-08 2010-06-10 Samsung Electronics Co., Ltd. Display apparatus and displaying method thereof
US20100149179A1 (en) * 2008-10-14 2010-06-17 Edilson De Aguiar Data compression for real-time streaming of deformable 3d models for 3d animation
US20100231582A1 (en) * 2009-03-10 2010-09-16 Yogurt Bilgi Teknolojileri A.S. Method and system for distributing animation sequences of 3d objects
US20100259547A1 (en) * 2009-02-12 2010-10-14 Mixamo, Inc. Web platform for interactive design, synthesis and delivery of 3d character motion data
US20100285877A1 (en) * 2009-05-05 2010-11-11 Mixamo, Inc. Distributed markerless motion capture
US20110081959A1 (en) * 2009-10-01 2011-04-07 Wms Gaming, Inc. Representing physical state in gaming systems
US20120019517A1 (en) * 2010-07-23 2012-01-26 Mixamo, Inc. Automatic generation of 3d character animation from 3d meshes
US8169438B1 (en) * 2008-03-31 2012-05-01 Pixar Temporally coherent hair deformation
US20120162217A1 (en) * 2010-12-22 2012-06-28 Electronics And Telecommunications Research Institute 3d model shape transformation method and apparatus
US20120188256A1 (en) * 2009-06-25 2012-07-26 Samsung Electronics Co., Ltd. Virtual world processing device and method
WO2012119215A1 (en) * 2011-03-04 2012-09-13 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
US8281281B1 (en) * 2006-06-07 2012-10-02 Pixar Setting level of detail transition points
EP2538330A1 (en) * 2011-06-21 2012-12-26 Unified Computing Limited A method of rendering a scene file in a cloud-based render farm
US20130314749A1 (en) * 2012-05-28 2013-11-28 Ian A. R. Boyd System and method for the creation of an e-enhanced multi-dimensional pictokids presentation using pictooverlay technology
US20140289703A1 (en) * 2010-10-01 2014-09-25 Adobe Systems Incorporated Methods and Systems for Physically-Based Runtime Effects
US8928672B2 (en) 2010-04-28 2015-01-06 Mixamo, Inc. Real-time automatic concatenation of 3D animation sequences
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
WO2015143303A1 (en) 2014-03-20 2015-09-24 Digizyme, Inc. Systems and methods for providing a visualization product
US20150269870A1 (en) * 2014-03-20 2015-09-24 Digizyme, Inc. Visual cell
US20150269855A1 (en) * 2014-03-20 2015-09-24 Digizyme, Inc. Systems and methods for interacting with a visual cell
US9626788B2 (en) 2012-03-06 2017-04-18 Adobe Systems Incorporated Systems and methods for creating animations using human faces
US9786084B1 (en) 2016-06-23 2017-10-10 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10049482B2 (en) 2011-07-22 2018-08-14 Adobe Systems Incorporated Systems and methods for animation recommendations
US10198845B1 (en) 2018-05-29 2019-02-05 LoomAi, Inc. Methods and systems for animating facial expressions
US20190068791A1 (en) * 2016-02-25 2019-02-28 Kddi Corporation Device controller, communication terminal, device control method, compensation calculation method, and device control system
US10559111B2 (en) 2016-06-23 2020-02-11 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10748325B2 (en) 2011-11-17 2020-08-18 Adobe Inc. System and method for automatic rigging of three dimensional characters for facial animation
US10863607B2 (en) 2016-09-07 2020-12-08 Eski Inc. Projection systems for distributed manifestation and related methods
US11551393B2 (en) 2019-07-23 2023-01-10 LoomAi, Inc. Systems and methods for animation generation

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7266425B2 (en) * 2004-09-30 2007-09-04 Rockwell Automation Technologies, Inc. Systems and methods that facilitate motion control through coordinate system transformations
DE102005058867B4 (en) * 2005-12-09 2018-09-27 Cine-Tv Broadcast Systems Gmbh Method and device for moving a camera arranged on a pan and tilt head along a predetermined path of movement
JP5204381B2 2006-05-01 2013-06-05 Nintendo Co., Ltd. Game program, game device, game system, and game processing method
WO2008011353A2 (en) * 2006-07-16 2008-01-24 The Jim Henson Company System and method of producing an animated performance utilizing multiple cameras
US7791608B2 (en) * 2006-07-16 2010-09-07 The Jim Henson Company, Inc. System and method of animating a character through a single person performance
US8443284B2 (en) 2007-07-19 2013-05-14 Apple Inc. Script-integrated storyboards
JP4760896B2 (en) * 2008-11-04 2011-08-31 Sony Corporation Camera control apparatus and camera control method
US8698898B2 (en) * 2008-12-11 2014-04-15 Lucasfilm Entertainment Company Ltd. Controlling robotic motion of camera
US20120188333A1 (en) * 2009-05-27 2012-07-26 The Ohio State University Spherical view point controller and method for navigating a network of sensors
US8527217B2 (en) * 2009-09-08 2013-09-03 Dynamic Athletic Research Institute, Llc Apparatus and method for physical evaluation
EP2593197A2 (en) * 2010-07-14 2013-05-22 University Court Of The University Of Abertay Dundee Improvements relating to viewing of real-time, computer-generated environments
TWI448999B (en) * 2010-10-13 2014-08-11 Univ Nat Cheng Kung Sketch alive method and system using the method
CA2818000A1 (en) * 2010-11-24 2012-05-31 Aquadownunder Pty Ltd Apparatus and method for environmental monitoring
US8333520B1 (en) 2011-03-24 2012-12-18 CamMate Systems, Inc. Systems and methods for detecting an imbalance of a camera crane
US8540438B1 (en) 2011-03-24 2013-09-24 CamMate Systems, Inc. Systems and methods for positioning a camera crane
US9724600B2 (en) * 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
US9044857B2 (en) * 2012-02-14 2015-06-02 Jerry Neal Sommerville Control system that guides a robot or articulated device with a laser distance meter for 3D motion, or guides a robot or articulated device with a computer pointing device (such as a mouse) for 2D motion
US9892539B2 (en) 2013-01-11 2018-02-13 Disney Enterprises, Inc. Fast rig-based physics simulation
US9659397B2 (en) * 2013-01-11 2017-05-23 Disney Enterprises, Inc. Rig-based physics simulation
US9130492B2 (en) * 2013-04-22 2015-09-08 Thermadyne, Inc. Animatronic system with unlimited axes
JP6385725B2 (en) * 2014-06-06 2018-09-05 Nintendo Co., Ltd. Information processing system and information processing program
US10073488B2 (en) 2014-09-11 2018-09-11 Grayhill, Inc. Multifunction joystick apparatus and a method for using same
US10025550B2 (en) * 2016-03-15 2018-07-17 Intel Corporation Fast keyboard for screen mirroring
US10479288B2 (en) 2016-08-05 2019-11-19 MotoCrane, LLC Releasable vehicular camera mount
US10847330B2 (en) 2017-10-06 2020-11-24 Grayhill, Inc. No/low-wear bearing arrangement for a knob system
WO2019136075A1 (en) 2018-01-03 2019-07-11 Grayhill, Inc. Touch encoder, touch panel, and input method editor with integrated development environment and methods thereof
JP7341685B2 (en) * 2019-03-19 2023-09-11 Canon Kabushiki Kaisha Electronic equipment, electronic equipment control method, program, and storage medium
CN110941216B (en) * 2019-11-25 2021-03-12 Jiangsu XCMG Construction Machinery Research Institute Co., Ltd. Wireless emergency stop system and method
US20220002128A1 (en) * 2020-04-09 2022-01-06 Chapman/Leonard Studio Equipment, Inc. Telescoping electric camera crane

Citations (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3574954A (en) * 1968-02-09 1971-04-13 Franckh Sche Verlagshandlung W Optical educational toy
US5032842A (en) * 1989-02-07 1991-07-16 Furuno Electric Co., Ltd. Detection system with adjustable target area
US5129044A (en) * 1988-03-01 1992-07-07 Hitachi Construction Machinery Co., Ltd. Position/force controlling apparatus for working machine with multiple degrees of freedom
US5268996A (en) * 1990-12-20 1993-12-07 General Electric Company Computer image generation method for determination of total pixel illumination due to plural light sources
US5617515A (en) * 1994-07-11 1997-04-01 Dynetics, Inc. Method and apparatus for controlling and programming a robot or other moveable object
US5644722A (en) * 1994-11-17 1997-07-01 Sharp Kabushiki Kaisha Schedule-managing apparatus being capable of moving or copying a schedule of a date to another date
US5658238A (en) * 1992-02-25 1997-08-19 Olympus Optical Co., Ltd. Endoscope apparatus capable of being switched to a mode in which a curvature operating lever is returned and to a mode in which the curvature operating lever is not returned
US5752244A (en) * 1996-07-15 1998-05-12 Andersen Consulting Llp Computerized multimedia asset management system
US5764276A (en) * 1991-05-13 1998-06-09 Interactive Pictures Corporation Method and apparatus for providing perceived video viewing experiences using still images
US5764980A (en) * 1988-10-24 1998-06-09 The Walt Disney Company Method for coordinating production of an animated feature using a logistics system
US5790124A (en) * 1995-11-20 1998-08-04 Silicon Graphics, Inc. System and method for allowing a performer to control and interact with an on-stage display device
US5852435A (en) * 1996-04-12 1998-12-22 Avid Technology, Inc. Digital multimedia editing and data management system
US5864404A (en) * 1996-12-31 1999-01-26 Datalogic S.P.A. Process and apparatus for measuring the volume of an object by means of a laser scanner and a CCD detector
US5909218A (en) * 1996-04-25 1999-06-01 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
US5921659A (en) * 1993-06-18 1999-07-13 Light & Sound Design, Ltd. Stage lighting lamp unit and stage lighting system including such unit
US6005548A (en) * 1996-08-14 1999-12-21 Latypov; Nurakhmed Nurislamovich Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods
US6058397A (en) * 1997-04-08 2000-05-02 Mitsubishi Electric Information Technology Center America, Inc. 3D virtual environment creation management and delivery system
US6084590A (en) * 1997-04-07 2000-07-04 Synapix, Inc. Media production with correlation of image stream and abstract objects in a three-dimensional virtual stage
US6199082B1 (en) * 1995-07-17 2001-03-06 Microsoft Corporation Method for delivering separate design and content in a multimedia publishing system
US6219045B1 (en) * 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
US6222551B1 (en) * 1999-01-13 2001-04-24 International Business Machines Corporation Methods and apparatus for providing 3D viewpoint selection in a server/client arrangement
US6268864B1 (en) * 1998-06-11 2001-07-31 Presenter.Com, Inc. Linking a video and an animation
US6278466B1 (en) * 1998-06-11 2001-08-21 Presenter.Com, Inc. Creating animation from a video
US20010020943A1 (en) * 2000-02-17 2001-09-13 Toshiki Hijiri Animation data compression apparatus, animation data compression method, network server, and program storage media
US6329994B1 (en) * 1996-03-15 2001-12-11 Zapa Digital Arts Ltd. Programmable computer graphic objects
US20010051535A1 (en) * 2000-06-13 2001-12-13 Minolta Co., Ltd. Communication system and communication method using animation and server as well as terminal device used therefor
US20020024517A1 (en) * 2000-07-14 2002-02-28 Komatsu Ltd. Apparatus and method for three-dimensional image production and presenting real objects in virtual three-dimensional space
US6377257B1 (en) * 1999-10-04 2002-04-23 International Business Machines Corporation Methods and apparatus for delivering 3D graphics in a networked environment
US6414684B1 (en) * 1996-04-25 2002-07-02 Matsushita Electric Industrial Co., Ltd. Method for communicating and generating computer graphics animation data, and recording media
US20020089506A1 (en) * 2001-01-05 2002-07-11 Templeman James N. User control of simulated locomotion
US20020138843A1 (en) * 2000-05-19 2002-09-26 Andrew Samaan Video distribution method and system
US6463444B1 (en) * 1997-08-14 2002-10-08 Virage, Inc. Video cataloger system with extensibility
US20020167518A1 (en) * 1996-10-16 2002-11-14 Alexander Migdal System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
US20020175994A1 (en) * 2001-05-25 2002-11-28 Kuniteru Sakakibara Image pickup system
US20020186221A1 (en) * 2001-06-05 2002-12-12 Reactrix Systems, Inc. Interactive video display system
US20030047602A1 (en) * 1997-10-16 2003-03-13 Takahito Iida System for granting permission of user's personal information to third party
US6538651B1 (en) * 1999-03-19 2003-03-25 John Hayman Parametric geometric element definition and generation system and method
US6546405B2 (en) * 1997-10-23 2003-04-08 Microsoft Corporation Annotating temporally-dimensioned multimedia content
US6557041B2 (en) * 1998-08-24 2003-04-29 Koninklijke Philips Electronics N.V. Real time video game uses emulation of streaming over the internet in a broadcast event
US20030080978A1 (en) * 2001-10-04 2003-05-01 Nassir Navab Augmented reality system
US6559845B1 (en) * 1999-06-11 2003-05-06 Pulse Entertainment Three dimensional animation system and method
US20030090523A1 (en) * 2001-05-14 2003-05-15 Toru Hayashi Information distribution system and information distribution method
US6609451B1 (en) * 1998-10-21 2003-08-26 Omron Corporation Mine detector and inspection apparatus
US20030195853A1 (en) * 2002-03-25 2003-10-16 Mitchell Cyndi L. Interaction system and method
US20040001064A1 (en) * 2002-06-28 2004-01-01 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US6714200B1 (en) * 2000-03-06 2004-03-30 Microsoft Corporation Method and system for efficiently streaming 3D animation across a wide area network
US20040061781A1 (en) * 2002-09-17 2004-04-01 Eastman Kodak Company Method of digital video surveillance utilizing threshold detection and coordinate tracking
US6738065B1 (en) * 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US20040114786A1 (en) * 2002-12-06 2004-06-17 Cross Match Technologies, Inc. System and method for capturing print information using a coordinate conversion method
US20040119716A1 (en) * 2002-12-20 2004-06-24 Chang Joon Park Apparatus and method for high-speed marker-free motion capture
US6757432B2 (en) * 1998-07-17 2004-06-29 Matsushita Electric Industrial Co., Ltd. Apparatus for transmitting and/or receiving stream data and method for producing the same
US6760010B1 (en) * 2000-03-15 2004-07-06 Figaro Systems, Inc. Wireless electronic libretto display apparatus and method
US20040131241A1 (en) * 2002-10-15 2004-07-08 Curry Douglas N. Method of converting rare cell scanner image coordinates to microscope coordinates using reticle marks on a sample media
US20040167924A1 (en) * 2003-02-21 2004-08-26 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and distributed processing system
US20040179013A1 (en) * 2003-03-13 2004-09-16 Sony Corporation System and method for animating a digital facial model
US20040181548A1 (en) * 2003-03-12 2004-09-16 Thomas Mark Ivan Digital asset server and asset management system
US20040189702A1 (en) * 2002-09-09 2004-09-30 Michal Hlavac Artificial intelligence platform
US6806864B2 (en) * 1999-12-03 2004-10-19 Siemens Aktiengesellschaft Operating device for influencing displayed information
US6820112B1 (en) * 1999-03-11 2004-11-16 Sony Corporation Information processing system, information processing method and apparatus, and information serving medium
US20040252123A1 (en) * 1999-12-28 2004-12-16 International Business Machines Corporation System and method for presentation of room navigation
US20040263476A1 (en) * 2003-06-24 2004-12-30 In-Keon Lim Virtual joystick system for controlling the operation of security cameras and controlling method thereof
US6898484B2 (en) * 2002-05-01 2005-05-24 Dorothy Lemelson Robotic manufacturing and assembly with relative radio positioning using radio based location determination
US6947044B1 (en) * 1999-05-21 2005-09-20 Kulas Charles J Creation and playback of computer-generated productions using script-controlled rendering engines
US20050225552A1 (en) * 2004-04-09 2005-10-13 Vital Idea, Inc. Method and system for intelligent scalable animation with intelligent parallel processing engine and intelligent animation engine
US20050248577A1 (en) * 2004-05-07 2005-11-10 Valve Corporation Method for separately blending low frequency and high frequency information for animation of a character in a virtual environment
US20060022983A1 (en) * 2004-07-27 2006-02-02 Alias Systems Corp. Processing three-dimensional data
US20060036162A1 (en) * 2004-02-02 2006-02-16 Ramin Shahidi Method and apparatus for guiding a medical instrument to a subsurface target site in a patient
US20060074517A1 (en) * 2003-05-30 2006-04-06 Liebherr-Werk Nenzing Gmbh Crane or excavator for handling a cable-suspended load provided with optimised motion guidance
US20060106494A1 (en) * 2004-10-28 2006-05-18 Accelerated Pictures, Llc Camera and animation controller, systems and methods
US20060140507A1 (en) * 2003-06-23 2006-06-29 Mitsuharu Ohki Image processing method and device, and program
US7216299B2 (en) * 1998-10-16 2007-05-08 Maquis Techtrix Llc Interface and program using visual data arrangements for expressing user preferences concerning an action or transaction
US7226425B2 (en) * 2002-08-26 2007-06-05 Kensey Nash Corporation Crimp and cut tool for sealing and unsealing guide wires and tubular instruments
US7245741B1 (en) * 2000-11-14 2007-07-17 Siemens Aktiengesellschaft Method and device for determining whether the interior of a vehicle is occupied
US7246322B2 (en) * 2002-07-09 2007-07-17 Kaleidescope, Inc. Grid-like guided user interface for video selection and display
US20080028312A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Scene organization in computer-assisted filmmaking
US20080024615A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Camera control
US7347780B1 (en) * 2001-05-10 2008-03-25 Best Robert M Game system and game programs

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5043844B2 (en) 2004-08-30 2012-10-10 Trace Optic Technologies Pty Ltd Camera control method and apparatus
US7266425B2 (en) * 2004-09-30 2007-09-04 Rockwell Automation Technologies, Inc. Systems and methods that facilitate motion control through coordinate system transformations

Patent Citations (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3574954A (en) * 1968-02-09 1971-04-13 Franckh Sche Verlagshandlung W Optical educational toy
US5129044A (en) * 1988-03-01 1992-07-07 Hitachi Construction Machinery Co., Ltd. Position/force controlling apparatus for working machine with multiple degrees of freedom
US5764980A (en) * 1988-10-24 1998-06-09 The Walt Disney Company Method for coordinating production of an animated feature using a logistics system
US5032842A (en) * 1989-02-07 1991-07-16 Furuno Electric Co., Ltd. Detection system with adjustable target area
US5268996A (en) * 1990-12-20 1993-12-07 General Electric Company Computer image generation method for determination of total pixel illumination due to plural light sources
US5764276A (en) * 1991-05-13 1998-06-09 Interactive Pictures Corporation Method and apparatus for providing perceived video viewing experiences using still images
US5658238A (en) * 1992-02-25 1997-08-19 Olympus Optical Co., Ltd. Endoscope apparatus capable of being switched to a mode in which a curvature operating lever is returned and to a mode in which the curvature operating lever is not returned
US5921659A (en) * 1993-06-18 1999-07-13 Light & Sound Design, Ltd. Stage lighting lamp unit and stage lighting system including such unit
US5617515A (en) * 1994-07-11 1997-04-01 Dynetics, Inc. Method and apparatus for controlling and programming a robot or other moveable object
US5644722A (en) * 1994-11-17 1997-07-01 Sharp Kabushiki Kaisha Schedule-managing apparatus being capable of moving or copying a schedule of a date to another date
US6199082B1 (en) * 1995-07-17 2001-03-06 Microsoft Corporation Method for delivering separate design and content in a multimedia publishing system
US6219045B1 (en) * 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
US5790124A (en) * 1995-11-20 1998-08-04 Silicon Graphics, Inc. System and method for allowing a performer to control and interact with an on-stage display device
US6329994B1 (en) * 1996-03-15 2001-12-11 Zapa Digital Arts Ltd. Programmable computer graphic objects
US5852435A (en) * 1996-04-12 1998-12-22 Avid Technology, Inc. Digital multimedia editing and data management system
US6414684B1 (en) * 1996-04-25 2002-07-02 Matsushita Electric Industrial Co., Ltd. Method for communicating and generating computer graphics animation data, and recording media
US5909218A (en) * 1996-04-25 1999-06-01 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
US20010007452A1 (en) * 1996-04-25 2001-07-12 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
US6222560B1 (en) * 1996-04-25 2001-04-24 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
US5752244A (en) * 1996-07-15 1998-05-12 Andersen Consulting Llp Computerized multimedia asset management system
US6005548A (en) * 1996-08-14 1999-12-21 Latypov; Nurakhmed Nurislamovich Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods
US20020167518A1 (en) * 1996-10-16 2002-11-14 Alexander Migdal System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
US5864404A (en) * 1996-12-31 1999-01-26 Datalogic S.P.A. Process and apparatus for measuring the volume of an object by means of a laser scanner and a CCD detector
US6084590A (en) * 1997-04-07 2000-07-04 Synapix, Inc. Media production with correlation of image stream and abstract objects in a three-dimensional virtual stage
US6058397A (en) * 1997-04-08 2000-05-02 Mitsubishi Electric Information Technology Center America, Inc. 3D virtual environment creation management and delivery system
US6463444B1 (en) * 1997-08-14 2002-10-08 Virage, Inc. Video cataloger system with extensibility
US20030047602A1 (en) * 1997-10-16 2003-03-13 Takahito Iida System for granting permission of user's personal information to third party
US6546405B2 (en) * 1997-10-23 2003-04-08 Microsoft Corporation Annotating temporally-dimensioned multimedia content
US6268864B1 (en) * 1998-06-11 2001-07-31 Presenter.Com, Inc. Linking a video and an animation
US6278466B1 (en) * 1998-06-11 2001-08-21 Presenter.Com, Inc. Creating animation from a video
US6757432B2 (en) * 1998-07-17 2004-06-29 Matsushita Electric Industrial Co., Ltd. Apparatus for transmitting and/or receiving stream data and method for producing the same
US6557041B2 (en) * 1998-08-24 2003-04-29 Koninklijke Philips Electronics N.V. Real time video game uses emulation of streaming over the internet in a broadcast event
US6697869B1 (en) * 1998-08-24 2004-02-24 Koninklijke Philips Electronics N.V. Emulation of streaming over the internet in a broadcast application
US7216299B2 (en) * 1998-10-16 2007-05-08 Maquis Techtrix Llc Interface and program using visual data arrangements for expressing user preferences concerning an action or transaction
US6609451B1 (en) * 1998-10-21 2003-08-26 Omron Corporation Mine detector and inspection apparatus
US6222551B1 (en) * 1999-01-13 2001-04-24 International Business Machines Corporation Methods and apparatus for providing 3D viewpoint selection in a server/client arrangement
US6820112B1 (en) * 1999-03-11 2004-11-16 Sony Corporation Information processing system, information processing method and apparatus, and information serving medium
US6538651B1 (en) * 1999-03-19 2003-03-25 John Hayman Parametric geometric element definition and generation system and method
US6947044B1 (en) * 1999-05-21 2005-09-20 Kulas Charles J Creation and playback of computer-generated productions using script-controlled rendering engines
US6559845B1 (en) * 1999-06-11 2003-05-06 Pulse Entertainment Three dimensional animation system and method
US20030137516A1 (en) * 1999-06-11 2003-07-24 Pulse Entertainment, Inc. Three dimensional animation system and method
US6738065B1 (en) * 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US6377257B1 (en) * 1999-10-04 2002-04-23 International Business Machines Corporation Methods and apparatus for delivering 3D graphics in a networked environment
US6806864B2 (en) * 1999-12-03 2004-10-19 Siemens Aktiengesellschaft Operating device for influencing displayed information
US20040252123A1 (en) * 1999-12-28 2004-12-16 International Business Machines Corporation System and method for presentation of room navigation
US20010020943A1 (en) * 2000-02-17 2001-09-13 Toshiki Hijiri Animation data compression apparatus, animation data compression method, network server, and program storage media
US6714200B1 (en) * 2000-03-06 2004-03-30 Microsoft Corporation Method and system for efficiently streaming 3D animation across a wide area network
US6760010B1 (en) * 2000-03-15 2004-07-06 Figaro Systems, Inc. Wireless electronic libretto display apparatus and method
US20020138843A1 (en) * 2000-05-19 2002-09-26 Andrew Samaan Video distribution method and system
US20010051535A1 (en) * 2000-06-13 2001-12-13 Minolta Co., Ltd. Communication system and communication method using animation and server as well as terminal device used therefor
US20020024517A1 (en) * 2000-07-14 2002-02-28 Komatsu Ltd. Apparatus and method for three-dimensional image production and presenting real objects in virtual three-dimensional space
US7245741B1 (en) * 2000-11-14 2007-07-17 Siemens Aktiengesellschaft Method and device for determining whether the interior of a vehicle is occupied
US6646643B2 (en) * 2001-01-05 2003-11-11 The United States Of America As Represented By The Secretary Of The Navy User control of simulated locomotion
US20020089506A1 (en) * 2001-01-05 2002-07-11 Templeman James N. User control of simulated locomotion
US7347780B1 (en) * 2001-05-10 2008-03-25 Best Robert M Game system and game programs
US20030090523A1 (en) * 2001-05-14 2003-05-15 Toru Hayashi Information distribution system and information distribution method
US20020175994A1 (en) * 2001-05-25 2002-11-28 Kuniteru Sakakibara Image pickup system
US20020186221A1 (en) * 2001-06-05 2002-12-12 Reactrix Systems, Inc. Interactive video display system
US20030080978A1 (en) * 2001-10-04 2003-05-01 Nassir Navab Augmented reality system
US20030195853A1 (en) * 2002-03-25 2003-10-16 Mitchell Cyndi L. Interaction system and method
US6898484B2 (en) * 2002-05-01 2005-05-24 Dorothy Lemelson Robotic manufacturing and assembly with relative radio positioning using radio based location determination
US20040001064A1 (en) * 2002-06-28 2004-01-01 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US7246322B2 (en) * 2002-07-09 2007-07-17 Kaleidescope, Inc. Grid-like guided user interface for video selection and display
US7226425B2 (en) * 2002-08-26 2007-06-05 Kensey Nash Corporation Crimp and cut tool for sealing and unsealing guide wires and tubular instruments
US20040189702A1 (en) * 2002-09-09 2004-09-30 Michal Hlavac Artificial intelligence platform
US20040061781A1 (en) * 2002-09-17 2004-04-01 Eastman Kodak Company Method of digital video surveillance utilizing threshold detection and coordinate tracking
US20040131241A1 (en) * 2002-10-15 2004-07-08 Curry Douglas N. Method of converting rare cell scanner image coordinates to microscope coordinates using reticle marks on a sample media
US20040114786A1 (en) * 2002-12-06 2004-06-17 Cross Match Technologies, Inc. System and method for capturing print information using a coordinate conversion method
US20040119716A1 (en) * 2002-12-20 2004-06-24 Chang Joon Park Apparatus and method for high-speed marker-free motion capture
US20040167924A1 (en) * 2003-02-21 2004-08-26 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and distributed processing system
US20040181548A1 (en) * 2003-03-12 2004-09-16 Thomas Mark Ivan Digital asset server and asset management system
US20040179013A1 (en) * 2003-03-13 2004-09-16 Sony Corporation System and method for animating a digital facial model
US20060074517A1 (en) * 2003-05-30 2006-04-06 Liebherr-Werk Nenzing Gmbh Crane or excavator for handling a cable-suspended load provided with optimised motion guidance
US20060140507A1 (en) * 2003-06-23 2006-06-29 Mitsuharu Ohki Image processing method and device, and program
US20040263476A1 (en) * 2003-06-24 2004-12-30 In-Keon Lim Virtual joystick system for controlling the operation of security cameras and controlling method thereof
US20060036162A1 (en) * 2004-02-02 2006-02-16 Ramin Shahidi Method and apparatus for guiding a medical instrument to a subsurface target site in a patient
US20050225552A1 (en) * 2004-04-09 2005-10-13 Vital Idea, Inc. Method and system for intelligent scalable animation with intelligent parallel processing engine and intelligent animation engine
US20050248577A1 (en) * 2004-05-07 2005-11-10 Valve Corporation Method for separately blending low frequency and high frequency information for animation of a character in a virtual environment
US20060022983A1 (en) * 2004-07-27 2006-02-02 Alias Systems Corp. Processing three-dimensional data
US20060106494A1 (en) * 2004-10-28 2006-05-18 Accelerated Pictures, Llc Camera and animation controller, systems and methods
US20080028312A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Scene organization in computer-assisted filmmaking
US20080024615A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Camera control

Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7433760B2 (en) 2004-10-28 2008-10-07 Accelerated Pictures, Inc. Camera and animation controller, systems and methods
US20070109304A1 (en) * 2005-11-17 2007-05-17 Royi Akavia System and method for producing animations based on drawings
WO2007061929A2 (en) * 2005-11-17 2007-05-31 Little Director, Inc. System and method for producing animations based on drawings
WO2007061929A3 (en) * 2005-11-17 2008-10-02 Little Director Inc System and method for producing animations based on drawings
US8281281B1 (en) * 2006-06-07 2012-10-02 Pixar Setting level of detail transition points
US20080028312A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Scene organization in computer-assisted filmmaking
US20080024615A1 (en) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Camera control
US7880770B2 (en) 2006-07-28 2011-02-01 Accelerated Pictures, Inc. Camera control
US20100029377A1 (en) * 2006-10-03 2010-02-04 Canterbury Stephen A Shared physics engine in a wagering game system
US20090002377A1 (en) * 2007-06-26 2009-01-01 Samsung Electronics Co., Ltd. Apparatus and method for synchronizing and sharing virtual character
US8687005B2 (en) * 2007-06-26 2014-04-01 Samsung Electronics Co., Ltd. Apparatus and method for synchronizing and sharing virtual character
US20090027302A1 (en) * 2007-07-25 2009-01-29 Lenovo (Beijing) Limited Method for operating object between terminals and terminal using the method
US8446337B2 (en) * 2007-07-25 2013-05-21 Lenovo (Beijing) Limited Method for operating object between terminals and terminal using the method
US20090046097A1 (en) * 2007-08-09 2009-02-19 Scott Barrett Franklin Method of making animated video
US8169438B1 (en) * 2008-03-31 2012-05-01 Pixar Temporally coherent hair deformation
US8704832B2 (en) 2008-09-20 2014-04-22 Mixamo, Inc. Interactive design, synthesis and delivery of 3D character motion data through the web
US9373185B2 (en) 2008-09-20 2016-06-21 Adobe Systems Incorporated Interactive design, synthesis and delivery of 3D motion data through the web
US20100073361A1 (en) * 2008-09-20 2010-03-25 Graham Taylor Interactive design, synthesis and delivery of 3d character motion data through the web
US20100073379A1 (en) * 2008-09-24 2010-03-25 Sadan Eray Berger Method and system for rendering real-time sprites
US9460539B2 (en) 2008-10-14 2016-10-04 Adobe Systems Incorporated Data compression for real-time streaming of deformable 3D models for 3D animation
US8749556B2 (en) 2008-10-14 2014-06-10 Mixamo, Inc. Data compression for real-time streaming of deformable 3D models for 3D animation
US20100149179A1 (en) * 2008-10-14 2010-06-17 Edilson De Aguiar Data compression for real-time streaming of deformable 3d models for 3d animation
US20100110081A1 (en) * 2008-10-30 2010-05-06 Microsoft Corporation Software-aided creation of animated stories
US9978175B2 (en) 2008-11-24 2018-05-22 Adobe Systems Incorporated Real time concurrent design of shape, texture, and motion for 3D character animation
US9305387B2 (en) 2008-11-24 2016-04-05 Adobe Systems Incorporated Real time generation of animation-ready 3D character models
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
US20100134490A1 (en) * 2008-11-24 2010-06-03 Mixamo, Inc. Real time generation of animation-ready 3d character models
US8659596B2 (en) 2008-11-24 2014-02-25 Mixamo, Inc. Real time generation of animation-ready 3D character models
US20100146422A1 (en) * 2008-12-08 2010-06-10 Samsung Electronics Co., Ltd. Display apparatus and displaying method thereof
US9619914B2 (en) 2009-02-12 2017-04-11 Facebook, Inc. Web platform for interactive design, synthesis and delivery of 3D character motion data
US20100259547A1 (en) * 2009-02-12 2010-10-14 Mixamo, Inc. Web platform for interactive design, synthesis and delivery of 3d character motion data
US20100231582A1 (en) * 2009-03-10 2010-09-16 Yogurt Bilgi Teknolojileri A.S. Method and system for distributing animation sequences of 3d objects
US20100285877A1 (en) * 2009-05-05 2010-11-11 Mixamo, Inc. Distributed markerless motion capture
US20120188256A1 (en) * 2009-06-25 2012-07-26 Samsung Electronics Co., Ltd. Virtual world processing device and method
US20110081959A1 (en) * 2009-10-01 2011-04-07 Wms Gaming, Inc. Representing physical state in gaming systems
US8928672B2 (en) 2010-04-28 2015-01-06 Mixamo, Inc. Real-time automatic concatenation of 3D animation sequences
US8797328B2 (en) * 2010-07-23 2014-08-05 Mixamo, Inc. Automatic generation of 3D character animation from 3D meshes
US20120019517A1 (en) * 2010-07-23 2012-01-26 Mixamo, Inc. Automatic generation of 3d character animation from 3d meshes
US20150145859A1 (en) * 2010-07-23 2015-05-28 Mixamo, Inc. Automatic Generation of 3D Character Animation from 3D Meshes
US20140289703A1 (en) * 2010-10-01 2014-09-25 Adobe Systems Incorporated Methods and Systems for Physically-Based Runtime Effects
US9652201B2 (en) * 2010-10-01 2017-05-16 Adobe Systems Incorporated Methods and systems for physically-based runtime effects
US8922547B2 (en) * 2010-12-22 2014-12-30 Electronics And Telecommunications Research Institute 3D model shape transformation method and apparatus
US20120162217A1 (en) * 2010-12-22 2012-06-28 Electronics And Telecommunications Research Institute 3d model shape transformation method and apparatus
US9648707B2 (en) 2011-03-04 2017-05-09 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
US10499482B2 (en) 2011-03-04 2019-12-03 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
US10104751B2 (en) 2011-03-04 2018-10-16 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
US9286028B2 (en) 2011-03-04 2016-03-15 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
US9974151B2 (en) 2011-03-04 2018-05-15 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
WO2012119215A1 (en) * 2011-03-04 2012-09-13 Eski Inc. Devices and methods for providing a distributed manifestation in an environment
EP2538330A1 (en) * 2011-06-21 2012-12-26 Unified Computing Limited A method of rendering a scene file in a cloud-based render farm
US10049482B2 (en) 2011-07-22 2018-08-14 Adobe Systems Incorporated Systems and methods for animation recommendations
US10565768B2 (en) 2011-07-22 2020-02-18 Adobe Inc. Generating smooth animation sequences
US11170558B2 (en) 2011-11-17 2021-11-09 Adobe Inc. Automatic rigging of three dimensional characters for animation
US10748325B2 (en) 2011-11-17 2020-08-18 Adobe Inc. System and method for automatic rigging of three dimensional characters for facial animation
US9626788B2 (en) 2012-03-06 2017-04-18 Adobe Systems Incorporated Systems and methods for creating animations using human faces
US9747495B2 (en) 2012-03-06 2017-08-29 Adobe Systems Incorporated Systems and methods for creating and distributing modifiable animated video messages
US20130314749A1 (en) * 2012-05-28 2013-11-28 Ian A. R. Boyd System and method for the creation of an e-enhanced multi-dimensional pictokids presentation using pictooverlay technology
US20140122984A1 (en) * 2012-05-28 2014-05-01 Ian A. R. Boyd System and method for the creation of an e-enhanced multi-dimensional pictostory using pictooverlay technology
US20150269855A1 (en) * 2014-03-20 2015-09-24 Digizyme, Inc. Systems and methods for interacting with a visual cell
US20150269870A1 (en) * 2014-03-20 2015-09-24 Digizyme, Inc. Visual cell
WO2015143303A1 (en) 2014-03-20 2015-09-24 Digizyme, Inc. Systems and methods for providing a visualization product
US20190068791A1 (en) * 2016-02-25 2019-02-28 Kddi Corporation Device controller, communication terminal, device control method, compensation calculation method, and device control system
US10447863B2 (en) * 2016-02-25 2019-10-15 Kddi Corporation Device controller, communication terminal, device control method, compensation calculation method, and device control system
US10742820B2 (en) 2016-02-25 2020-08-11 Kddi Corporation Device controller, communication terminal, device control method, compensation calculation method, and device control system
US11032429B2 (en) 2016-02-25 2021-06-08 Kddi Corporation Device controller, communication terminal, device control method, compensation calculation method, and device control system
US11509774B2 (en) 2016-02-25 2022-11-22 Kddi Corporation Device controller, communication terminal, device control method, compensation calculation method, and device control system
US10559111B2 (en) 2016-06-23 2020-02-11 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10169905B2 (en) 2016-06-23 2019-01-01 LoomAi, Inc. Systems and methods for animating models from audio data
US10062198B2 (en) 2016-06-23 2018-08-28 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US9786084B1 (en) 2016-06-23 2017-10-10 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10863607B2 (en) 2016-09-07 2020-12-08 Eski Inc. Projection systems for distributed manifestation and related methods
US10198845B1 (en) 2018-05-29 2019-02-05 LoomAi, Inc. Methods and systems for animating facial expressions
US11551393B2 (en) 2019-07-23 2023-01-10 LoomAi, Inc. Systems and methods for animation generation

Also Published As

Publication number Publication date
US7433760B2 (en) 2008-10-07
US20060106494A1 (en) 2006-05-18
US20080312770A1 (en) 2008-12-18
WO2006050197A2 (en) 2006-05-11
WO2006050198A2 (en) 2006-05-11
WO2006050197A3 (en) 2007-12-21
WO2006050198A3 (en) 2009-04-16

Similar Documents

Publication Publication Date Title
US20060109274A1 (en) Client/server-based animation software, systems and methods
US7116330B2 (en) Approximating motion using a three-dimensional model
Latoschik et al. FakeMi: A fake mirror system for avatar embodiment studies
US7460130B2 (en) Method and system for generation, storage and distribution of omni-directional object views
Hornung et al. Character animation from 2d pictures and 3d motion data
US11024098B1 (en) Augmenting a physical object with virtual components
Hauswiesner et al. Free viewpoint virtual try-on with commodity depth cameras
EP3980974A1 (en) Single image-based real-time body animation
CN109151540A (en) Interaction processing method and device for video images
Dutreve et al. Feature points based facial animation retargeting
Ponton et al. Combining Motion Matching and Orientation Prediction to Animate Avatars for Consumerā€Grade VR Devices
WO2001063560A1 (en) 3d game avatar using physical characteristics
Balcisoy et al. Interaction between real and virtual humans in augmented reality
Apostolakis et al. Natural user interfaces for virtual character full body and facial animation in immersive virtual worlds
Li et al. Collaborative distributed virtual sculpting
KR101859318B1 (en) Video content production methods using 360 degree virtual camera
US11450054B2 (en) Method for operating a character rig in an image-generation system using constraints on reference nodes
US11074738B1 (en) System for creating animations using component stress indication
JP7459199B1 (en) Image Processing System
US9128516B1 (en) Computer-generated imagery using hierarchical models and rigging
Huynh et al. A framework for cost-effective communication system for 3D data streaming and real-time 3D reconstruction
Magnenat-Thalmann et al. Applications of interactive virtual humans in mobile augmented reality
Huynh Development of a standardized framework for cost-effective communication system based on 3D data streaming and real-time 3D reconstruction
Lai et al. Extra detail addition based on existing texture for animated news production
Mashalkar et al. Creating Personalized Avatars

Legal Events

Date Code Title Description
AS Assignment

Owner name: ACCELERATED PICTURES, LLC, WEST VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ALVAREZ, DONALD;PARRY, MARK;REEL/FRAME:017119/0268;SIGNING DATES FROM 20051219 TO 20051228

AS Assignment

Owner name: ACCELERATED PICTURES, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ACCELERATED PICTURES, LLC;REEL/FRAME:021429/0209

Effective date: 20080821

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION