WO2000008853A1 - Interactive movie creation from one or more still images in a digital imaging device - Google Patents

Interactive movie creation from one or more still images in a digital imaging device Download PDF

Info

Publication number
WO2000008853A1
WO2000008853A1 (PCT/US1999/017636)
Authority
WO
WIPO (PCT)
Prior art keywords
image
path
movie
panning
image frames
Prior art date
Application number
PCT/US1999/017636
Other languages
French (fr)
Inventor
Carl J. Alsing
Eric C. Anderson
Original Assignee
Flashpoint Technology, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Flashpoint Technology, Inc. filed Critical Flashpoint Technology, Inc.
Priority to AU52545/99A priority Critical patent/AU5254599A/en
Publication of WO2000008853A1 publication Critical patent/WO2000008853A1/en

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Definitions

  • the present invention relates generally to digital imaging devices, and more particularly to a method and system for interactively creating a movie from one or more images in such a device.
  • Digital video cameras differ from digital still cameras in a number of respects. Digital video cameras capture approximately thirty frames per second and are optimized to capture a large amount of moving images, but sacrifice image quality. That is, digital video cameras typically capture thirty low-resolution 640 x 480 images per second.
  • the uncompressed digital video signals from all those low-resolution images require huge amounts of memory storage, and high-ratio real-time compression schemes, such as MPEG, are essential for providing digital video for today's computers.
  • the hardware to support such processing is expensive, placing most digital video cameras outside the reach of most consumers.
  • Still digital cameras offer a less expensive alternative to digital video cameras, but are used primarily for capturing high quality static photographs. Still digital cameras are less expensive because they have far less processing power and memory capacity than digital video cameras. Even with these limitations, some still digital cameras are also capable of capturing sequential images, such as a burst image.
  • a burst image is a series of images captured in rapid succession, such as 3 images per second, for instance.
  • a typical still digital camera equipped with an LCD screen operates in two modes, capture or record mode for capturing images, and play mode for playing back the captured images on the LCD screen. Unfortunately, even still digital cameras capable of capturing burst images are incapable of displaying the images comprising the burst in play mode as a movie.
  • a burst image usually includes only 3-8 images and therefore does not have a sufficient number of images to display as a movie. And even if there were enough images to play as a movie, the camera would be incapable of displaying the images at the high frame rate required for a movie presentation. This is because the camera would have to retrieve each image from memory, decompress, resize and then display each image. Due to the limited resources of today's still digital cameras, the display of a burst image resembles more of a slide show than a movie.
  • the present invention provides a method and system for interactively creating a movie from a still image in a digital imaging device that includes a display screen.
  • the method and system include determining a path of panning across the still image, generating a plurality of image frames along the path of panning, and then displaying the plurality of image frames for a predetermined time interval on the display screen to present the movie.
  • the path of panning is determined by the user placing several key frames on the still images. By varying the sizes of the key frames, the user may also control the amount of zoom between the key frames.
  • the camera automatically interpolates a series of image frames between each pair of key frames to provide a sufficient number of frames to play during the duration of the movie.
  • a movie is created from a group of images by cross fading the images as a function of time.
  • the resulting movie appears much smoother than it would otherwise because abrupt scene changes between each of the images in the groups are minimized by the cross fading transitions between the group of images.
  • high-frame-rate movie presentations are created from a low-frame-rate still digital camera having limited storage capacity.
  • FIG. 1 is a block diagram of a digital camera that operates in accordance with the present invention.
  • FIGS. 2A and 2B are diagrams depicting exemplary hardware components of the camera's user interface.
  • FIG. 3 is a flow chart illustrating the basic process of creating a movie from a still image in accordance with the present invention.
  • FIG. 4A is a diagram showing an example image displayed on the LCD screen in play mode.
  • FIGS. 5A - 5F are diagrams illustrating the process of defining a panning and zooming path by placing a sequence of key image frames within an image.
  • FIG. 6 is a diagram illustrating several image frames between an adjacent pair of key image frames.
  • FIGS. 7 and 8 are diagrams illustrating the creation of a movie from a group of still images in accordance with the present invention.
  • FIG. 9 is a flow chart of an improved process for creating a movie from a group of images that minimizes abrupt scene changes and creates a smoother playing movie.
  • FIG. 10A is a diagram illustrating an example panning and zooming path across the composite image, where the "X's" indicate the path and the location of key image frames.
  • FIG. 10B is a diagram illustrating a portion of the image frames that will be generated along the panning and zooming path.
  • FIG. 10C is a timing diagram showing the percentage of images A, B and C used to generate a cross fade movie in accordance with the present invention.
  • the present invention relates to a method and system for generating a digital movie from digital still images.
  • the following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements.
  • although the present invention will be described in the context of a still digital camera, various modifications to the preferred embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. That is, any digital image capture device which captures, stores, or displays digital images could incorporate the features described hereinbelow, and that device would be within the spirit and scope of the present invention.
  • the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein. Referring now to FIG.
  • Camera 110 preferably comprises an imaging device 114, a system bus 116 and a computer 118.
  • Imaging device 114 includes an image sensor, such as a charge-coupled device (CCD) or a CMOS sensor, for generating a set of raw image data representing a captured image.
  • system bus 116 provides connection paths between imaging device 114, an optional power manager 342, central processing unit (CPU) 344, dynamic random-access memory (DRAM) 346, input/output interface (I/O) 348, nonvolatile memory 350, and buffers/connector 352 that connect an optional removable memory 354 to system bus 116.
  • CPU 344 may include a conventional microprocessor device for controlling the operation of camera 110.
  • CPU 344 is capable of concurrently running multiple software routines to control the various processes of camera 110 within a multithreaded environment. For example, images may be captured at the same time that previously captured images are processed in the background to effectively increase the capture rate of the camera.
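The concurrent capture-and-processing behavior described above can be sketched as a producer/consumer pair. This is an illustrative model only; the queue, thread structure, and function names below are assumptions for the example, not the camera's actual firmware.

```python
import queue
import threading

def capture_and_process(raw_images, process):
    """Capture images while earlier captures are processed in the background.

    Illustrative producer/consumer sketch of the multithreaded behavior
    described in the text; all names here are invented for the example.
    """
    pending = queue.Queue()
    results = []

    def worker():
        # Background thread: process images as they arrive.
        while True:
            img = pending.get()
            if img is None:      # sentinel: capture has finished
                break
            results.append(process(img))

    t = threading.Thread(target=worker)
    t.start()
    for img in raw_images:       # "capture" continues without waiting
        pending.put(img)
    pending.put(None)
    t.join()
    return results

processed = capture_and_process([1, 2, 3], lambda x: x * 10)
assert processed == [10, 20, 30]
```

Decoupling capture from processing through a queue is what lets the capture rate exceed the processing rate for short bursts.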
  • CPU 344 runs an operating system that includes a menu-driven GUI and provides image processing through software, rather than hardware.
  • An example of such software is the DigitaTM Operating Environment by FlashPoint Technology of San Jose, California.
  • although CPU 344 is preferably a microprocessor, one or more DSPs (digital signal processors) or ASICs (application-specific integrated circuits) could also be used.
  • I/O 348 is an interface device allowing communications to and from computer 118.
  • I/O 348 permits an external host computer (not shown) to connect to and communicate with computer 118.
  • I/O 348 also interfaces with a plurality of buttons and/or dials 404, and an optional status LCD
  • Non-volatile memory 350, which may typically comprise a conventional read-only memory or flash memory, stores a set of computer-readable program instructions to control the operation of camera 110.
  • Removable memory 354 serves as an additional image data storage area and is preferably a non-volatile device, such as a flash disk, readily removable and replaceable by a camera 110 user via buffers/connector 352.
  • Power supply 356 supplies operating power to the various components of camera 110.
  • Power manager 342 communicates via line 366 with power supply 356.
  • power supply 356 provides operating power to a main power bus 362 and also to a secondary power bus 364.
  • the main power bus 362 provides power to imaging device 114, I/O 348, non-volatile memory 350 and removable memory 354.
  • the secondary power bus 364 provides power to power manager 342, CPU 344 and DRAM 346.
  • Power supply 356 is connected to main batteries 358 and also to backup batteries 360.
  • a camera 110 user may also connect power supply 356 to an external power source.
  • the main batteries 358 provide operating power to power supply 356 which then provides the operating power to camera 110 via both main power bus 362 and secondary power bus 364.
  • the backup batteries 360 provide operating power to power supply 356 which then provides the operating power only to the secondary power bus 364 of camera 110.
  • Dynamic Random-Access-Memory (DRAM) 346 is a contiguous block of dynamic memory that may be selectively allocated for various storage functions. DRAM 346 stores both raw and compressed image data and is also used by CPU 344 while executing the software routines used within computer 118.
  • the raw image data received from imaging device 114 is temporarily stored in several input buffers (not shown) within DRAM 346. Once the raw image data is processed, it is stored in a frame buffer (not shown) for display on the LCD screen 402.
  • the input buffers and the frame buffer are split into two ping-pong buffers to improve the display speed of the digital camera and to prevent the tearing of the image in the display 402.
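The ping-pong buffering scheme mentioned above can be sketched as a pair of alternating buffers: one is written while the other is displayed, and the roles flip on each swap. The class and method names are invented for illustration, not the camera's implementation.

```python
class PingPongBuffer:
    """Two alternating buffers: one is filled while the other is displayed.

    Illustrative sketch of double (ping-pong) buffering; swapping whole
    buffers instead of writing into the visible one prevents image tearing.
    """

    def __init__(self):
        self.buffers = [None, None]  # two frame buffers
        self.write_index = 0         # buffer currently being filled

    def write(self, frame_data):
        # Fill the back buffer while the front buffer is on screen.
        self.buffers[self.write_index] = frame_data

    def swap(self):
        # Flip roles: the freshly written buffer becomes visible.
        self.write_index = 1 - self.write_index

    def read(self):
        # The display reads the front buffer (the one not being written).
        return self.buffers[1 - self.write_index]


buf = PingPongBuffer()
buf.write("frame-1")
buf.swap()
assert buf.read() == "frame-1"  # frame-1 is now displayed
buf.write("frame-2")            # frame-2 fills the back buffer meanwhile
buf.swap()
assert buf.read() == "frame-2"
```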
  • LCD controller 390 transfers the image data to LCD screen 402 for display.
  • FIGS. 2A and 2B are diagrams depicting exemplary hardware components of the camera's user interface 408.
  • FIG. 2A is a back view of the camera 110 showing the LCD screen 402, a four-way navigation control button 409, an overlay button 412, a menu button 414, and a set of programmable soft keys 416.
  • FIG. 2B is a top view of the camera 110 showing a shutter button 418, and a mode dial 420.
  • the camera may optionally include status LCD 406, status LCD scroll and select buttons 422 and 424, a sound record button 426, and zoom-in, zoom-out buttons 426a and 426b.
  • the camera operates in at least two modes, capture mode for capturing images, and play mode for playing back the captured images on the LCD screen 402.
  • the camera is capable of capturing single still images as well as sequential images, which are objects comprising multiple images.
  • Examples of a sequential image include a panorama image, a burst image, and a time lapse image.
  • a panorama image comprises several overlapping images of a larger scene.
  • a burst image is a series of images captured in rapid succession, such as 3 images per second, for instance.
  • a time lapse image is a series of images automatically captured by the camera at predefined time intervals for a defined duration (e.g. capturing a picture every five minutes for an hour).
  • the camera is capable of capturing sequential images, such as burst images and time lapse images.
  • the camera can only play the objects back at a relatively slow frame rate because each image must be retrieved from memory, decompressed, resized and then displayed.
  • the play back of a burst image or a time lapse resembles more of a slide show than a movie.
  • the present invention is a method and system for interactively creating a movie from one or more images in a digital camera.
  • a panning and zooming path of a set of small frames is defined across a still image.
  • a number of image frames is generated between the key image frames, such that when the sequence of frames is displayed at regular time intervals, it appears similar to a panning motion picture.
  • a movie can also be created from several still images taken at different times and of adjacent (or the same) view using a similar technique.
  • the resulting frames from each of the still images are then appended together to create a longer presentation.
  • the user may control the placements of the frames, zooms, time intervals, and other photographic parameters to create the desired motion effects.
  • FIG. 3 is a flow chart illustrating the basic process of creating a movie from a still image in accordance with the present invention. Initially, the user places the camera in capture mode and captures a desired image in step 450. The user then switches the camera to play mode to display the captured image or a previously captured image on the LCD screen 402 in step 452.
  • FIG. 4A is a diagram showing an example image 432 displayed on the LCD screen 402 in play mode.
  • a path of panning and zooming across the image is determined in step 454.
  • the path of panning defines how the finished movie will appear to the user to sweep across the scene depicted in the image.
  • FIG. 4B is a diagram illustrating an example path of panning across image 432 shown by the arrows.
  • a path of zooming may also be determined along the path of panning. Whereas the path of panning simulates the motion of a movie camera across a scene, the zoom path simulates the amount by which the movie zooms in and out of the scene.
  • the path of panning is manually determined by the user through the placement of a sequence of icons in the image that identifies key image frames.
  • the location of the key image frames in relation to each other determines the panning path.
  • the size of the key image frames may be controlled by the user, which in turn determines the zoom path; a larger key image frame followed by a smaller key image frame simulates zooming in on a scene.
  • a zoom function is not provided to simplify the process, in which case the size of the key image frames is made constant.
  • FIGS. 5A - 5F are diagrams illustrating the process of a user defining the panning and zooming path shown in FIG. 4B by placing a sequence of key image frames within image 432.
  • the user begins by defining the starting point of the movie by placing the first key image frame 475 in image 432. Key image frames two through five are then placed as shown in FIGS. 5B - 5E, respectively, and the user places the last key image frame 475' that will end the movie in FIG. 5F. Notice the last key image frame 475' is larger than the previous frame. This will cause a zoom-out effect near the end of the movie when it is played.
  • after the path of panning and zooming has been determined, the camera generates image frames along the path in step 456.
  • FIG. 6 is a diagram illustrating several image frames 477 generated between an adjacent pair of key image frames 475. Generating the image frames 477 requires calculating: 1) the number of frames to generate between each pair of key image frames 475, 2) the position of each image frame 477 on the image, and optionally, 3) the size of each image frame 477, which determines the zoom factor.
  • the number of image frames 477 to generate between the key image frames 475 will be dependent upon the following combination of variables: 1) the frame rate at which the movie will be displayed and the duration of the movie; or 2) the frame rate at which the movie will be displayed and a time stamp associated with each key image frame.
  • Each of the above variables may be either preset or entered by the user. The actual equations for calculating the number of image frames 477 from these variables would be readily apparent to those skilled in the art.
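As a hedged illustration of the arithmetic left to the reader, the per-interval frame count can be derived either from the frame rate and total duration (split evenly among the key-frame intervals) or from per-key-frame time stamps. The function names and the even split are assumptions, not the patent's equations.

```python
def frames_per_interval(frame_rate, duration, num_key_frames):
    """Split (frame rate x duration) evenly among the key-frame intervals.

    Sketch of option 1 above; assumes an even split across intervals.
    """
    total_frames = int(frame_rate * duration)
    intervals = num_key_frames - 1
    return total_frames // intervals

def frames_from_timestamps(frame_rate, t_start, t_end):
    """Frames for one interval from its key frames' time stamps (seconds).

    Sketch of option 2 above.
    """
    return int(frame_rate * (t_end - t_start))

# A 30 fps movie lasting 6 seconds over 6 key frames: 36 frames per interval.
assert frames_per_interval(30, 6, 6) == 36
# Key frames stamped 2.0 seconds apart at 30 fps: 60 frames in between.
assert frames_from_timestamps(30, 10.0, 12.0) == 60
```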
  • the X, Y position of the Nth image frame 477 is then derived by adding ΔX and ΔY to the position of the N-1th image frame.
  • the equations described above simulate panning between key image frames 475 as a linear function.
  • the panning path may be simulated more naturally using a curve fitting function, such as a Bezier or b-spline function, for example.
  • the size of the key image frames 475 may be used to simulate a camera zooming in or out of the original image.
  • the size of the image frames 477 between a pair of adjacent key image frames 475 can be determined by calculating the slopes between the four corners of the adjacent key image frames 475, and then using the slopes to determine the boundaries of the image frames 477.
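A minimal sketch of this step, assuming straight linear interpolation of each frame rectangle's position and size (for axis-aligned rectangular frames this is equivalent to the corner-slope calculation described above); the function signature and coordinate convention are illustrative assumptions.

```python
def interpolate_frames(kf_a, kf_b, n):
    """Linearly interpolate n intermediate frame rectangles between two key
    frames, each given as (x, y, width, height) in image coordinates.

    Interpolating position and size together handles panning (position
    change) and zooming (size change) in a single step; a sketch only.
    """
    ax, ay, aw, ah = kf_a
    bx, by, bw, bh = kf_b
    frames = []
    for i in range(1, n + 1):
        t = i / (n + 1)          # fraction of the way from kf_a to kf_b
        frames.append((
            ax + t * (bx - ax),  # pan in X
            ay + t * (by - ay),  # pan in Y
            aw + t * (bw - aw),  # zoom: width change
            ah + t * (bh - ah),  # zoom: height change
        ))
    return frames

# Pan right and zoom out between two key frames, with 3 in-between frames.
mid = interpolate_frames((0, 0, 100, 75), (200, 0, 200, 150), 3)
assert mid[1] == (100.0, 0.0, 150.0, 112.5)  # midpoint frame
```

Replacing the linear `t` with a Bezier or b-spline evaluation, as the text suggests, would curve the panning path without changing the rest of the loop.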
  • each image frame 477 is displayed on the LCD screen 402 for a predetermined time interval (e.g. 1/30 of a second for 30 fps) in step 458.
  • each frame is discarded after it is displayed to reduce storage requirements.
  • the frames may be displayed after they have all been processed and stored.
  • the image data within the frames from the original image may be resized to fit the LCD screen 402. Displaying the image frames 477 full-sized on the LCD screen 402 at a high frame rate appears to the user as a smooth panning movie.
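The playback loop of step 458 might be sketched as below; `render` stands in for the resize-and-draw step on the LCD screen, and the loop structure is an assumption rather than the device's firmware.

```python
import time

def play_movie(frames, fps=30, render=print):
    """Show each generated frame for a fixed interval (1/fps seconds).

    Sketch of the playback loop in step 458; frames are discarded after
    display to reduce storage, as the text describes. `render` is a
    stand-in for resizing the frame and drawing it on the LCD screen.
    """
    interval = 1.0 / fps
    while frames:
        frame = frames.pop(0)    # take the next frame and discard it
        render(frame)            # resize + display (stand-in)
        time.sleep(interval)     # hold for the predetermined interval

shown = []
play_movie(["frame-1", "frame-2"], fps=1000, render=shown.append)
assert shown == ["frame-1", "frame-2"]
```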
  • the GUI may be implemented in a variety of ways and a detailed discussion of the GUI is outside the scope of this discussion.
  • icons may be used to mark the path of panning and zooming across the image, as shown in FIG. 4B and FIG. 7.
  • the key image frames 475 may be transparently generated along the prescribed panning path.
  • the image processing can be performed each time the movie is being presented for viewing, thus requiring less storage memory, as described above.
  • the storage memory need only contain the original still image(s) and the processing parameters.
  • the image processing can be carried out to create the movie, and the movie can be stored as a multimedia object for subsequent playback.
  • Referring to FIGS. 7 and 8, diagrams illustrating the creation of a movie from a group of still images in accordance with the present invention are shown.
  • FIG. 7 illustrates the creation of a movie from a panorama image comprising several partially overlapping images of a larger scene.
  • the user can create a panning motion that traverses the entire scene. If the panorama included enough images to form a 360° pattern, then a panning motion could be created that continuously spirals across the panorama.
  • FIG. 8 is a diagram illustrating the creation of a movie from a group of images, which may comprise a sequential image or represent several still images taken of substantially the same subject matter, but with different zoom settings and/or camera orientations.
  • the group may also represent totally unrelated still images that the user has manually grouped for some reason.
  • the images in the group shown here as images A, B, and C, were taken at different times along time line 480. Thus, each image in the group is associated with a different time. In the case where the images are unrelated, the user may manually associate the images with a time stamp.
  • images A, B and C represent a burst image of a marching band, where the first image shows the band member's left leg up and the second and third images show the band member's right leg up.
  • a movie could be created from images A, B and C by defining a key image frame 475 on each of the three images and then generating image frames 477 between each pair of key image frames 475.
  • the image frames 477 between the first and second key image frames 475 would be created from the pixels from the first image, and the image frames 477 between the second and third key image frames 475 would be created from the pixels from the second image.
  • the movie will not appear smooth and continuous even if played at 30 fps because of the abrupt scene changes that occur between key image frames 475 one and two.
  • FIG. 9 is a flow chart of a preferred embodiment for creating a movie from a group of images that minimizes abrupt scene changes and creates a smoother playing movie.
  • the process begins by selecting a group of images to create a movie from in step 490.
  • the group may be selected by capturing individual still images, capturing a sequential image, or selecting from previously stored images.
  • a composite image 482 may optionally be generated from the group in step 492.
  • Composite image generation is a well-known process in which the common points in two or more images are compared and then made to align to create a blended image.
  • FIG. 10A is a diagram illustrating an example panning and zooming path across the composite image 482, where the "X's" indicate the panning path and the location of key image frames 475. In this example, the area of selection extends through each image comprising the composite image 482.
  • FIG. 10B is a diagram illustrating a portion of the image frames 477 that will be generated along the panning and zooming path.
  • the image frames 477 are generated by cross fading the original images as a function of the time associated with each image. Referring again to FIG. 9, the steps performed during the cross fading process are shown. For each pixel in an image frame, a percentage weight is assigned to each corresponding pixel from each image in the group based on the time associated with each image in step 498. Because the percentages are based on time, pixels in different parts of the image may be assigned different percentages. After the percentage weights are assigned to the pixels, the percentage weights are summed to derive the corresponding pixels in the image frame in step 500.
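The per-pixel weighting just described can be sketched as below. The triangular (linear ramp) weighting that peaks at each image's own time stamp is my assumption, chosen to match the linear fades of FIG. 10C; a real implementation would apply this per color channel and only over the images covering that pixel.

```python
def crossfade_pixel(pixel_values, image_times, t):
    """Blend corresponding pixels from overlapping images at movie time t.

    Each image's weight ramps linearly up to 1 at its own time stamp and
    back down over one time unit (an assumed ramp width); the weights are
    then normalized and used in a weighted sum. Sketch of the cross-fading
    step only.
    """
    weights = []
    for t_img in image_times:
        # Triangular weight peaking at the image's own time stamp.
        w = max(0.0, 1.0 - abs(t - t_img))
        weights.append(w)
    total = sum(weights)
    if total == 0:
        return pixel_values[0]   # outside every ramp: fall back to first image
    return sum(v * w for v, w in zip(pixel_values, weights)) / total

# Halfway between images A (t=0) and B (t=1), the pixel is an even blend.
blended = crossfade_pixel([100, 200], [0.0, 1.0], 0.5)
assert abs(blended - 150.0) < 1e-9
```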
  • pixel X which will comprise a portion of a particular image frame, lies within a section of the composite image 482 made up of images A, B, and C, while pixel Y lies within a section of the composite image 482 made up of images A and C only.
  • FIG. 10C is a timing diagram showing the percentage of images A, B and C used to generate a cross fade movie in accordance with the present invention.
  • at time TA, the time associated with image A, 100% of each pixel from image A will be used to generate corresponding image frame pixels, up until the time picture B was taken.
  • at time TB, the time associated with image B, A's contribution to the image frames 477 is gradually decreased, while the percentage of B is increased over time to 100%.
  • at time TC, the time associated with image C, the percentage of C is gradually increased.
  • image A has almost completely dissolved, and image C begins to take its place.
  • the image frame 477 corresponding to pixel "X" will include almost no percentage of image A, approximately 75% of image B, and approximately 25% of image C. As the percentage of image C approaches 100%, image B's contribution is decreased to 0%.
  • the image frame 477 corresponding to pixel "Y" will include approximately 5% of image A, and approximately 95% of image C. The image frames 477 from this point on are generated from 100% of image C.
  • the percentages of cross fading are shown to vary in FIG. 10C as a linear function. However, the percentages of cross fading can also be computed using other functions, such as a square or cosine function, or a perceptual curve used to create a linear effect, for example.
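As a hedged illustration of the alternative fade functions, a cosine ramp can replace the linear one: it starts and ends with zero slope, so the transition eases in and out instead of changing abruptly. The function names are illustrative.

```python
import math

def linear_fade(t):
    """Linear cross-fade ramp on t in [0, 1]."""
    return t

def cosine_fade(t):
    """Cosine (smooth) ramp: zero slope at both ends, which reads as a
    gentler transition than the linear ramp."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

# Both ramps agree at the endpoints and the midpoint...
assert linear_fade(0.0) == cosine_fade(0.0) == 0.0
assert abs(cosine_fade(0.5) - 0.5) < 1e-9
assert abs(cosine_fade(1.0) - 1.0) < 1e-9
# ...but the cosine ramp moves more slowly near the ends.
assert cosine_fade(0.1) < linear_fade(0.1)
```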
  • the number of image frames 477 generated by cross fading is a function of the number of frames per second to be displayed and the duration of the movie.
  • although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments, and those variations would be within the spirit and scope of the present invention.
  • the present invention may be implemented in other types of digital imaging devices, such as an electronic device for archiving images that displays stored images on a television, for instance.
  • software written according to the present invention may be stored on a computer-readable medium, such as a removable memory, or transmitted over a network, and loaded into the digital camera for execution. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.

Abstract

A method and system for interactively creating a movie from a still image in a digital imaging device that includes a display screen. The method includes steps for determining a path of panning across the still image (454), for generating image frames along the path of panning (456), and for displaying the image frames for a predetermined time interval on the display screen to play the movie (458). The system includes structures for performing the steps listed above.

Description

INTERACTIVE MOVIE CREATION FROM ONE OR MORE STILL IMAGES IN A DIGITAL IMAGING DEVICE
FIELD OF THE INVENTION
The present invention relates generally to digital imaging devices, and more particularly to a method and system for interactively creating a movie from one or more images in such a device.
BACKGROUND OF THE INVENTION
The use of digital video cameras and digital still cameras is becoming widespread. Digital video cameras differ from digital still cameras in a number of respects. Digital video cameras capture approximately thirty frames per second and are optimized to capture a large amount of moving images, but sacrifice image quality. That is, digital video cameras typically capture thirty low-resolution 640 x 480 images per second. However, the uncompressed digital video signals from all those low-resolution images require huge amounts of memory storage, and high-ratio real-time compression schemes, such as MPEG, are essential for providing digital video for today's computers. Unfortunately, the hardware to support such processing is expensive, placing most digital video cameras outside the reach of most consumers.
Still digital cameras offer a less expensive alternative to digital video cameras, but are used primarily for capturing high quality static photographs. Still digital cameras are less expensive because they have far less processing power and memory capacity than digital video cameras. Even with these limitations, some still digital cameras are also capable of capturing sequential images, such as a burst image. A burst image is a series of images captured in rapid succession, such as 3 images per second, for instance. A typical still digital camera equipped with an LCD screen operates in two modes, capture or record mode for capturing images, and play mode for playing back the captured images on the LCD screen. Unfortunately, even still digital cameras capable of capturing burst images are incapable of displaying the images comprising the burst in play mode as a movie. One reason is that a burst image usually includes only 3-8 images and therefore does not have a sufficient number of images to display as a movie. And even if there were enough images to play as a movie, the camera would be incapable of displaying the images at the high frame rate required for a movie presentation. This is because the camera would have to retrieve each image from memory, decompress, resize and then display each image. Due to the limited resources of today's still digital cameras, the display of a burst image resembles more of a slide show than a movie.
Accordingly, what is needed is a method and system for interactively creating a movie from one or more still images in a digital camera. The present invention addresses such a need.
SUMMARY OF THE INVENTION
The present invention provides a method and system for interactively creating a movie from a still image in a digital imaging device that includes a display screen. The method and system include determining a path of panning across the still image, generating a plurality of image frames along the path of panning, and then displaying the plurality of image frames for a predetermined time interval on the display screen to present the movie. In a preferred embodiment of the present invention, the path of panning is determined by the user placing several key frames on the still images. By varying the sizes of the key frames, the user may also control the amount of zoom between the key frames. After the key frames are defined, the camera automatically interpolates a series of image frames between each pair of key frames to provide a sufficient number of frames to play during the duration of the movie. In another aspect of the present invention, a movie is created from a group of images by cross fading the images as a function of time. The resulting movie appears much smoother than it would otherwise because abrupt scene changes between each of the images in the groups are minimized by the cross fading transitions between the group of images. According to the present invention, high-frame-rate movie presentations are created from a low-frame-rate still digital camera having limited storage capacity.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a digital camera that operates in accordance with the present invention.
FIGS. 2A and 2B are diagrams depicting exemplary hardware components of the camera's user interface.
FIG. 3 is a flow chart illustrating the basic process of creating a movie from a still image in accordance with the present invention.
FIG. 4A is a diagram showing an example image displayed on the LCD screen in play mode. FIG. 4B is a diagram illustrating an example path of panning across the image.

FIGS. 5A - 5F are diagrams illustrating the process of defining a panning and zooming path by placing a sequence of key image frames within an image.
FIG. 6 is a diagram illustrating several image frames between an adjacent pair of key image frames.

FIGS. 7 and 8 are diagrams illustrating the creation of a movie from a group of still images in accordance with the present invention.
FIG. 9 is a flow chart of an improved process for creating a movie from a group of images that minimizes abrupt scene changes and creates a smoother playing movie.

FIG. 10A is a diagram illustrating an example panning and zooming path across the composite image, where the "X's" indicate the path and the location of key image frames.
FIG. 10B is a diagram illustrating a portion of the image frames that will be generated along the panning and zooming path.

FIG. 10C is a timing diagram showing the percentage of images A, B and C used to generate a cross fade movie in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The present invention relates to a method and system for generating a digital movie from digital still images. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Although the present invention will be described in the context of a still digital camera, various modifications to the preferred embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. That is, any digital image capture device which captures, stores, or displays digital images could incorporate the features described hereinbelow, and that device would be within the spirit and scope of the present invention. Thus, the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein.

Referring now to FIG. 1, a block diagram of one preferred embodiment of a digital camera 110 is shown for use in accordance with the present invention. Camera 110 preferably comprises an imaging device 114, a system bus 116 and a computer 118. Imaging device 114 includes an image sensor, such as a charge-coupled device (CCD) or a CMOS sensor, for generating a set of raw image data representing a captured image. In a preferred embodiment, system bus 116 provides connection paths between imaging device 114, an optional power manager 342, central processing unit (CPU) 344, dynamic random-access memory (DRAM) 346, input/output interface (I/O) 348, nonvolatile memory 350, and buffers/connector 352 that connect an optional removable memory 354 to system bus 116.
CPU 344 may include a conventional microprocessor device for controlling the operation of camera 110. In the preferred embodiment, CPU 344 is capable of concurrently running multiple software routines to control the various processes of camera 110 within a multithreaded environment. For example, images may be captured at the same time that previously captured images are processed in the background to effectively increase the capture rate of the camera. In a preferred embodiment, CPU 344 runs an operating system that includes a menu-driven GUI and provides image processing through software, rather than hardware. An example of such software is the Digita™ Operating Environment by FlashPoint Technology of San Jose, California.
Although CPU 344 is preferably a microprocessor, one or more DSP's (digital signal processor) or ASIC's (Application Specific Integrated Circuit) could also be used.
I/O 348 is an interface device allowing communications to and from computer 118. For example, I/O 348 permits an external host computer (not shown) to connect to and communicate with computer 118. I/O 348 also interfaces with a plurality of buttons and/or dials 404, and an optional status LCD 406, which in addition to the LCD screen 402, are the hardware elements of the camera's user interface 408.
Non-volatile memory 350, which may typically comprise a conventional read-only memory or flash memory, stores a set of computer-readable program instructions to control the operation of camera 110. Removable memory 354 serves as an additional image data storage area and is preferably a non-volatile device, such as a flash disk, readily removable and replaceable by a camera 110 user via buffers/connector 352.
Power supply 356 supplies operating power to the various components of camera 110. Power manager 342 communicates via line 366 with power supply 356 and coordinates power management operations for camera 110. In the preferred embodiment, power supply 356 provides operating power to a main power bus 362 and also to a secondary power bus 364. The main power bus 362 provides power to imaging device 114, I/O 348, non-volatile memory 350 and removable memory 354. The secondary power bus 364 provides power to power manager 342, CPU 344 and DRAM 346.
Power supply 356 is connected to main batteries 358 and also to backup batteries 360. In the preferred embodiment, a camera 110 user may also connect power supply 356 to an external power source. During normal operation of power supply 356, the main batteries 358 provide operating power to power supply 356, which then provides the operating power to camera 110 via both main power bus 362 and secondary power bus 364. During a power failure mode in which the main batteries 358 have failed (when their output voltage has fallen below a minimum operational voltage level), the backup batteries 360 provide operating power to power supply 356, which then provides the operating power only to the secondary power bus 364 of camera 110.
Dynamic Random-Access-Memory (DRAM) 346 is a contiguous block of dynamic memory that may be selectively allocated for various storage functions. DRAM 346 stores both raw and compressed image data and is also used by CPU 344 while executing the software routines used within computer 118. The raw image data received from imaging device 114 is temporarily stored in several input buffers (not shown) within DRAM 346. Once the raw image data is processed, it is stored in a frame buffer (not shown) for display on the LCD screen 402. In a preferred embodiment, the input buffers and the frame buffer are split into two ping-pong buffers to improve the display speed of the digital camera and to prevent the tearing of the image in the display 402. After processed image data has been stored in DRAM 346, LCD controller 390 transfers the image data to LCD screen 402 for display.
FIGS. 2A and 2B are diagrams depicting exemplary hardware components of the camera's user interface 408. FIG. 2A is a back view of the camera 110 showing the LCD screen 402, a four-way navigation control button 409, an overlay button 412, a menu button 414, and a set of programmable soft keys 416. FIG. 2B is a top view of the camera 110 showing a shutter button 418, and a mode dial 420. The camera may optionally include status LCD 406, status LCD scroll and select buttons 422 and 424, a sound record button 426, and zoom-in, zoom-out buttons 426a and 426b. The camera operates in at least two modes: capture mode for capturing images, and play mode for playing back the captured images on the LCD screen 402. In a preferred embodiment, the camera is capable of capturing single still images as well as sequential images, which are objects comprising multiple images. Examples of a sequential image include a panorama image, a burst image, and a time lapse image. A panorama image comprises several overlapping images of a larger scene. A burst image is a series of images captured in rapid succession, such as three images per second. And a time lapse image is a series of images automatically captured by the camera at predefined time intervals for a defined duration (e.g. capturing a picture every five minutes for an hour).
Although the camera is capable of capturing sequential images, such as burst images and time lapse images, the camera can only play the objects back at a relatively slow frame rate because each image must be retrieved from memory, decompressed, resized and then displayed. The play back of a burst image or a time lapse image resembles a slide show more than a movie. And since the images are taken at such disparate times, there is insufficient continuity between the images to provide smooth video play back even if the camera had the capability of displaying the images at a high frame rate.
The present invention is a method and system for interactively creating a movie from one or more images in a digital camera. According to the present invention, a panning and zooming path is defined across a still image by a set of small key image frames. Thereafter, a number of image frames is generated between the key image frames, such that when the sequence of frames is displayed at regular time intervals, it appears similar to a panning motion picture. A movie can also be created from several still images taken at different times and of adjacent (or the same) views using a similar technique. The resulting frames from each of the still images are then appended together to create a longer presentation. According to the present invention, the user may control the placement of the frames, zooms, time intervals, and other photographic parameters to create the desired motion effects.
FIG. 3 is a flow chart illustrating the basic process of creating a movie from a still image in accordance with the present invention. Initially, the user places the camera in capture mode and captures a desired image in step 450. The user then switches the camera to play mode to display the captured image or a previously captured image on the LCD screen 402 in step 452.
FIG. 4A is a diagram showing an example image 432 displayed on the LCD screen 402 in play mode. Referring again to FIG. 3, once the image is displayed, a path of panning and zooming across the image is determined in step 454. The path of panning defines how the finished movie will appear to the user to sweep across the scene depicted in the image. FIG. 4B is a diagram illustrating an example path of panning across image 432 shown by the arrows. According to the present invention, a path of zooming may also be determined along the path of panning. Whereas the path of panning simulates the motion of a movie camera across a scene, the zoom path simulates the amount by which the movie zooms in and out of the scene.
In a preferred embodiment, the path of panning is manually determined by the user through the placement of a sequence of icons in the image that identify key image frames. The location of the key image frames in relation to each other determines the panning path. In one preferred embodiment, the size of the key image frames may be controlled by the user, which in turn determines the zoom path; a larger key image frame followed by a smaller key image frame simulates zooming in on a scene. In a second preferred embodiment, a zoom function is not provided to simplify the process, in which case the size of the key image frames is made constant.
FIGS. 5A - 5F are diagrams illustrating the process of a user defining the panning and zooming path shown in FIG. 4B by placing a sequence of key image frames within image 432. In FIG. 5A, the user begins by defining the starting point of the movie by placing the first key image frame 475 in image 432. Key image frames two through five are then placed as shown in FIGS. 5B - 5E, respectively, and the user places the last key image frame 475' that will end the movie in FIG. 5F. Notice the last key image frame 475' is larger than the previous frame. This will cause a zoom-out effect near the end of the movie when it is played.
Referring again to FIG. 3, after the path of panning and zooming has been determined, the camera generates image frames along the path in step 456. This is accomplished by generating image frames between each adjacent pair of key image frames.
FIG. 6 is a diagram illustrating several image frames 477 generated between an adjacent pair of key image frames 475. Generating the image frames 477 requires calculating: 1) the number of frames to generate between each pair of key image frames 475, 2) the position of each image frame 477 on the image, and optionally, 3) the size of each image frame 477, which determines the zoom factor.
The number of image frames 477 to generate between the key image frames 475 depends on one of the following combinations of variables: 1) the frame rate at which the movie will be displayed and the duration of the movie; or 2) the frame rate at which the movie will be displayed and a time stamp associated with each key image frame. Each of these variables may be either preset or entered by the user. The actual equations for calculating the number of image frames 477 from these variables would be readily apparent to those skilled in the art. After the number of image frames 477 to generate between each pair of adjacent key image frames 475 is calculated, the position of each of the image frames 477 on the image 432 is determined using pixel coordinates. Referring still to FIG. 6, the X, Y position of the Nth image frame 477 out of M total image frames 477 can be calculated by:

X = A + N(C - A) / (M - 1)
Y = B + N(D - B) / (M - 1)

However, a computationally faster method calculates the positional X, Y offset (Δ) between each of the frames:

ΔX = (C - A) / (M + 1)
ΔY = (D - B) / (M + 1)

The X, Y position of the Nth image frame 477 is then derived by adding ΔX and ΔY to the position of the (N-1)th image frame.
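As an illustrative sketch (Python, not part of the patent; the function name and the derivation of the frame count M from frame rate and segment duration are assumptions), the faster incremental-offset method above can be expressed as:

```python
def frame_positions(a, b, c, d, fps, seconds):
    """Positions of the image frames generated between key frames at
    (A, B) and (C, D), using the incremental-offset method."""
    m = int(fps * seconds)        # number of frames for this segment
    dx = (c - a) / (m + 1)        # per-frame X offset, ΔX
    dy = (d - b) / (m + 1)        # per-frame Y offset, ΔY
    positions = []
    x, y = float(a), float(b)
    for _ in range(m):            # each frame adds the offset to the previous one
        x += dx
        y += dy
        positions.append((x, y))
    return positions
```

With fps = 5 and a one-second segment panning from (0, 0) to (12, 6), this yields five frame positions stepping by (2, 1) each, avoiding a multiply and divide per frame.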
It should be noted that the equations described above simulate panning between key image frames 475 as a linear function. However, the panning path may be simulated more naturally using a curve fitting function, such as a Bezier or B-spline function, for example.
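For example, a quadratic Bezier evaluation (a generic curve formula, not taken from the patent; the control-point interpretation is an assumption) bends the path between two key-frame positions p0 and p2 toward a control point p1:

```python
def quadratic_bezier(p0, p1, p2, t):
    """Point on a quadratic Bezier curve at parameter t in [0, 1].
    p0 and p2 are adjacent key-frame positions; p1 is a control point
    pulling the panning path away from the straight line."""
    u = 1.0 - t
    x = u * u * p0[0] + 2.0 * u * t * p1[0] + t * t * p2[0]
    y = u * u * p0[1] + 2.0 * u * t * p1[1] + t * t * p2[1]
    return (x, y)
```

Sampling t at the same M intermediate positions used for linear panning gives a curved path through the same endpoints.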
As stated previously, the size of the key image frames 475 may be used to simulate a camera zooming in or out of the original image. The size of the image frames 477 between a pair of adjacent key image frames 475 can be determined by calculating the slopes between the four corners of the adjacent key image frames 475, and then using the slopes to determine the boundaries of the image frames 477.
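One way to realize this (a sketch only; for axis-aligned rectangular frames, following the slopes between corresponding corners is equivalent to linearly interpolating each corner coordinate) is:

```python
def frame_rects(rect0, rect1, m):
    """m intermediate frame rectangles between key frames rect0 and rect1.
    rect = (left, top, right, bottom); interpolating all four corners
    makes the frame size change gradually, producing the zoom effect.
    Equal-size key frames reduce to pure panning."""
    rects = []
    for n in range(1, m + 1):
        t = n / (m + 1)
        rects.append(tuple(c0 + t * (c1 - c0) for c0, c1 in zip(rect0, rect1)))
    return rects
```

A small key frame followed by a large one yields steadily growing rectangles, i.e. a zoom-out when each rectangle is scaled up to the full screen.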
Referring again to FIG. 3, after generating each image frame 477, the image frame 477 is displayed on the LCD screen 402 for a predetermined time interval (e.g. 1/30 of a second for 30 fps) in step 458. In a preferred embodiment, each frame is discarded after it is displayed to reduce storage requirements. Alternatively, the frames may be displayed after they have all been processed and stored.
Just before displaying an image frame 477, the image data within the frame from the original image may be resized to fit the LCD screen 402. Displaying the image frames 477 full-sized on the LCD screen 402 at a high frame rate appears to the user as a smooth panning movie.
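A minimal nearest-neighbour crop-and-resize sketch (Python on a plain 2-D list of pixel values; a real camera would do this step in hardware or with an imaging library, so this is purely illustrative) shows the per-frame resize:

```python
def crop_resize(image, rect, out_w, out_h):
    """Crop rect = (left, top, right, bottom) out of a 2-D pixel grid
    and resize it to out_w x out_h using nearest-neighbour sampling."""
    left, top, right, bottom = rect
    w, h = right - left, bottom - top
    # Map each output pixel back to its nearest source pixel in the crop.
    return [[image[top + (y * h) // out_h][left + (x * w) // out_w]
             for x in range(out_w)]
            for y in range(out_h)]
```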
A graphical user interface (GUI) may be used to assist the user in defining desired processing parameters, automate some or all of the desired processes, and record the prescribed processing parameters. The GUI may be implemented in a variety of ways and a detailed discussion of the GUI is outside the scope of this discussion. However, it should be noted that different types of icons may be used to mark the path of panning and zooming across the image, as shown in FIG. 4B and FIG. 7. In the case where the panning and zooming path is preset by the camera, or chosen by the user through the placement of icons, the key image frames 475 may be transparently generated along the prescribed panning path.
Once the specified processing parameters and sequences are recorded, the image processing can be performed each time the movie is being presented for viewing, thus requiring less storage memory, as described above. The storage memory need only contain the original still image(s) and the processing parameters. Alternatively, the image processing can be carried out to create the movie, and the movie can be stored as a multimedia object for subsequent playback.
Referring now to FIGS. 7 and 8, diagrams illustrating the creation of a movie from a group of still images in accordance with the present invention are shown. FIG. 7 illustrates the creation of a movie from a panorama image comprising several partially overlapping images of a larger scene. In this case, the user can create a panning motion that traverses the entire scene. If the panorama included enough images to form a 360° pattern, then a panning motion could be created that continuously spirals across the panorama.
FIG. 8 is a diagram illustrating the creation of a movie from a group of images, which may comprise a sequential image or represent several still images taken of substantially the same subject matter, but with different zoom settings and/or camera orientations. The group may also represent totally unrelated still images that the user has manually grouped for some reason. The images in the group, shown here as images A, B, and C, were taken at different times along time line 480. Thus, each image in the group is associated with a different time. In the case where the images are unrelated, the user may manually associate the images with a time stamp.
The problem with creating a movie from such a group of images is that abrupt scene changes occur due to the time difference between the images. For example, assume images A, B and C represent a burst image of a marching band, where the first image shows the band member's left leg up and the second and third images show the band member's right leg up.
In one embodiment, a movie could be created from images A, B and C by defining a key image frame 475 on each of the three images and then generating image frames 477 between each pair of key image frames 475. The image frames 477 between the first and second key image frames 475 would be created from the pixels of the first image, and the image frames 477 between the second and third key image frames 475 would be created from the pixels of the second image. When the image frames 477 are then played, the movie will not appear smooth and continuous even if played at 30 fps because of the abrupt scene change that occurs between key image frames 475 one and two.
FIG. 9 is a flow chart of a preferred embodiment for creating a movie from a group of images that minimizes abrupt scene changes and creates a smoother playing movie. The process begins by selecting a group of images from which to create a movie in step 490. The group may be selected by capturing individual still images, capturing a sequential image, or selecting from previously stored images.
After selecting a group of images, a composite image 482 may optionally be generated from the group in step 492. Composite image generation is a well-known process in which the common points in two or more images are compared and then made to align to create a blended image.
Next, a panning and zooming path across the composite image 482 is determined in step 494. The reason a composite image 482 is created is to simplify the selection of the panning and zooming path by the user. If no composite image 482 is generated, then the panning and zooming path must be determined across the individual images in the group. FIG. 10A is a diagram illustrating an example panning and zooming path across the composite image 482, where the "X's" indicate the panning path and the location of key image frames 475. In this example, the area of selection extends through each image comprising the composite image 482.
Referring again to FIG. 9, after the panning and zooming path has been determined, the individual image frames 477 are generated by cross fading the original images in step 496. FIG. 10B is a diagram illustrating a portion of the image frames 477 that will be generated along the panning and zooming path.
According to the present invention, the image frames 477 are generated by cross fading the original images as a function of the time associated with each image. Referring again to FIG. 9, the steps performed during the cross fading process are shown. For each pixel in an image frame, a percentage weight is assigned to each corresponding pixel from each image in the group based on the time associated with each image in step 498. Because the percentages are based on time, pixels in different parts of the image may be assigned different percentages. After the percentage weights are assigned to the pixels, the percentage weights are summed to derive the corresponding pixels in the image frame in step 500.
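A simplified sketch of steps 498 and 500 follows (Python; all names are assumptions). It implements the plain linear case in which at most two images contribute at any instant, whereas the ramps in FIG. 10C can briefly overlap three images:

```python
def crossfade_weights(t, times):
    """Step 498: percentage weight of each source image at frame time t.
    times holds the sorted capture time of each image in the group;
    the returned weights always sum to 1."""
    w = [0.0] * len(times)
    if t <= times[0]:
        w[0] = 1.0
    elif t >= times[-1]:
        w[-1] = 1.0
    else:
        for i in range(len(times) - 1):
            if times[i] <= t <= times[i + 1]:
                frac = (t - times[i]) / (times[i + 1] - times[i])
                w[i] = 1.0 - frac      # outgoing image fades down
                w[i + 1] = frac        # incoming image fades up
                break
    return w

def blend_pixel(values, weights):
    """Step 500: sum of corresponding pixel values scaled by their weights."""
    return sum(v * k for v, k in zip(values, weights))
```

Halfway between the capture times of images A and B, for instance, a frame pixel is an even 50/50 blend of the two corresponding source pixels.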
Referring to both FIGS. 8 and 10B for example, pixel X, which will comprise a portion of a particular image frame, lies within a section of the composite image 482 made up of images A, B, and C, while pixel Y lies within a section of the composite image 482 made up of images A and C only.
FIG. 10C is a timing diagram showing the percentage of images A, B and C used to generate a cross fade movie in accordance with the present invention. At time TA, the time associated with image A, 100% of each pixel from image A is used to generate corresponding image frame pixels, up until the time picture B was taken. At time TB, the time associated with image B, A's contribution to the image frames 477 is gradually decreased, while the percentage of B is increased over time to 100%. At time TC, the time associated with image C, the percentage of C is gradually increased.
At this point, image A has almost completely dissolved, and image C begins to take its place. The image frame 477 corresponding to pixel "X" will include almost no percentage of image A, approximately 75% of image B, and approximately 25% of image C. As the percentage of image C approaches 100%, image B's contribution is decreased to 0%. The image frame 477 corresponding to pixel "Y" will include approximately 5% of image A, and approximately 95% of image C. The image frames 477 from this point on are generated from 100% of image C.
It should be noted that the percentages of cross fading are shown to vary in FIG. 10C as a linear function. However, the percentages of cross fading can also be computed using other functions, such as a square or cosine function, or a perceptual curve used to create a linear effect, for example. The number of image frames 477 generated by cross fading is a function of the number of frames per second to be displayed and the duration of the movie.
When the resulting series of image frames 477 created using cross fading is displayed at 30 fps, there is a smooth continuous panning motion of the background, while moving objects in the foreground appear to be in motion, although not as smoothly as the background. The movie appears much smoother than it would otherwise because abrupt scene changes between each of the images in the group are minimized by the cross fading transitions. Thus, a high frame-rate movie presentation is created from a low frame-rate still digital camera having limited storage capacity.

In summary, a method and system for interactively creating a movie from one or more still images in a digital camera has been disclosed. Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments and those variations would be within the spirit and scope of the present invention. For example, the present invention may be implemented in other types of digital imaging devices, such as an electronic device for archiving images that displays stored images on a television, for instance. In addition, software written according to the present invention may be stored on a computer-readable medium, such as a removable memory, or transmitted over a network, and loaded into the digital camera for execution. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.

Claims

What is claimed is:
1. A method for interactively creating a movie from a still image in a digital imaging device, the method comprising the steps of:
a) Determining a path of panning across the still image;
b) Generating image frames along the path of panning; and
c) Displaying the image frames for a predetermined time interval on a display screen to present the movie.
2. The method of claim 1 wherein step 1a) further includes the step of:
i) Automatically determining the path of panning.
3. The method of claim 1 wherein step 1a) further includes the step of:
i) Enabling a user to manually determine the path of panning.
4. The method of claim 3 wherein step 1a) further includes the step of:
ii) Defining a plurality of key image frames along the path of panning.
5. The method of claim 4 wherein step 1a) further includes the step of:
iii) Generating a set of the image frames between each pair of adjacent key image frames.
6. The method of claim 5 wherein step 1c) further includes the step of:
i) Playing each key image frame and image frame for the predetermined time interval.

7. The method of claim 3 wherein step 1a)i) further includes the step of:
(1) Determining a zoom path across the still image.
8. The method of claim 7 wherein step 1a)i) further includes the step of:
(2) Determining the zoom path by varying the sizes of the plurality of key frames.
9. The method of claim 1 further including the step of:
e) Storing the movie as a multimedia object for subsequent play back.
10. The method of claim 1 further including the steps of:
e) Recording processing parameters for controlling the creation of the movie; and
f) Generating the image frames each time the movie is being presented for viewing, to thereby reduce storage space.
11. A system for interactively creating a movie from a still image comprising:
a display screen;
a memory for storing a still image; and
processing means coupled to the display screen and to the memory, wherein the processing means includes means to determine a path of panning across the still image, generate image frames along the path of panning, and display the image frames for a predetermined time interval on the display screen to present the movie.
12. The system of claim 11 wherein the processing means automatically determines the path of panning.

13. The system of claim 11 further including a user interface means for enabling a user to manually determine the path of panning across the still image.
14. The system of claim 13 wherein the processing means defines a plurality of key image frames along the path of panning.
15. The system of claim 14 wherein the processing means generates a set of the image frames between each pair of adjacent key image frames.
16. The system of claim 15 wherein the processing means plays each key image frame and image frame for the predetermined time interval.
17. The system of claim 13 wherein the user determines a zoom path across the still image.
18. The system of claim 17 wherein the zoom path is determined by varying the sizes of the plurality of key frames.
19. The system of claim 11 further including means for storing the movie as a multimedia object for subsequent play back.
20. The system of claim 11 wherein the processor means records processing parameters for controlling the creation of the movie, and generates the image frames each time the movie is being presented for viewing, to thereby reduce storage space.
21. A computer readable medium containing program instructions for interactively creating a movie from a still image in a digital imaging device, the program instructions for:
a) Determining a path of panning across the still image;
b) Generating image frames along the path of panning; and
c) Displaying the image frames for a predetermined time interval to play the movie.
22. The computer readable medium of claim 21 wherein instruction 21a) further includes the instruction of:
i) Automatically determining the path of panning.
23. The computer readable medium of claim 21 wherein instruction 21a) further includes the instruction of:
i) Enabling a user to manually determine the path of panning.
24. The computer readable medium of claim 23 wherein instruction 21a) further includes the instruction of:
ii) Defining a plurality of key image frames along the path of panning.
25. The computer readable medium of claim 24 wherein instruction 21a) further includes the instruction of:
iii) Generating a set of the image frames between each pair of adjacent key image frames.

26. The computer readable medium of claim 25 wherein instruction 21c) further includes the instruction of:
i) Playing each key image frame and image frame for the predetermined time interval.
27. The computer readable medium of claim 23 wherein instruction 21a)i) further includes the instruction of:
(1) Determining a zoom path across the still image.
28. The computer readable medium of claim 27 wherein instruction 21a)i) further includes the instruction of:
(2) Determining the zoom path by varying the sizes of the plurality of key frames.
29. The computer readable medium of claim 21 further including the instruction of:
e) Storing the movie as a multimedia object for subsequent play back.
30. The computer readable medium of claim 21 further including the instructions of:
e) Recording processing parameters for controlling the creation of the movie; and
f) Generating the image frames each time the movie is being presented for viewing, to thereby reduce storage space.
31. A method for interactively creating a movie from a group of still images in a digital imaging device, the method comprising the steps of:
a) Selecting a group of images, each of the images taken at a different time and comprising a plurality of pixels;
b) Determining a path of panning across the group of still images;
c) Generating a set of image frames along the path of panning by cross-fading the group of images as a function of time; and
d) Playing the movie by displaying the image frames in sequence for a predetermined time on a display screen.
32. The method of claim 31 wherein step 31c) further includes the steps of:
i) For each pixel in each one of the image frames, assigning a percentage weight to each corresponding pixel from each one of the group of images based on the time associated with the image; and
ii) Summing the assigned percentage weights to derive the pixels in the image frame.
33. The method of claim 32 wherein step 31b) further includes the step of:
i) Defining a plurality of key image frames along the path of panning.
34. The method of claim 33 wherein step 31b) further includes the step of:
ii) Generating the set of the image frames between each pair of adjacent key image frames.
35. The method of claim 31 wherein step 31a) further includes the step of:
i) Defining a zoom path across the still image.
36. The method of claim 35 further including the step of:
ii) Defining a zoom path across the still image by varying the sizes of the plurality of key image frames.
PCT/US1999/017636 1998-08-04 1999-08-03 Interactive movie creation from one or more still images in a digital imaging device WO2000008853A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU52545/99A AU5254599A (en) 1998-08-04 1999-08-03 Interactive movie creation from one or more still images in a digital imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/129,140 US6362850B1 (en) 1998-08-04 1998-08-04 Interactive movie creation from one or more still images in a digital imaging device
US09/129,140 1998-08-04

Publications (1)

Publication Number Publication Date
WO2000008853A1 true WO2000008853A1 (en) 2000-02-17

Family

ID=22438627

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/017636 WO2000008853A1 (en) 1998-08-04 1999-08-03 Interactive movie creation from one or more still images in a digital imaging device

Country Status (3)

Country Link
US (2) US6362850B1 (en)
AU (1) AU5254599A (en)
WO (1) WO2000008853A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1235182A2 (en) * 2001-02-23 2002-08-28 Hewlett-Packard Company Motion picture generation from a static digital image
EP1463001A2 (en) 2003-03-27 2004-09-29 Victor Company of Japan, Ltd. Computer method for generating a display picture
EP1648173A2 (en) 2004-10-06 2006-04-19 Microsoft Corporation Creation of image based video using step-images
WO2010064236A1 (en) * 2008-12-01 2010-06-10 Visual Domains Ltd. Method and system for browsing visual content displayed in a virtual three-dimensional space
EP2283642A1 (en) * 2008-06-11 2011-02-16 Nokia Corporation Method, apparatus, and computer program product for presenting burst images
JP2011090258A (en) * 2009-10-26 2011-05-06 Fujifilm Corp Wide-angle image display control method, device for the same, and wide-angle image taking device
CN102316266A (en) * 2010-07-07 2012-01-11 索尼公司 Display control device, display control method, and program
JP2012050050A (en) * 2010-08-27 2012-03-08 Fuji Mach Mfg Co Ltd Image display system and image display method
WO2012031767A1 (en) * 2010-09-10 2012-03-15 Deutsche Telekom Ag Method and system for obtaining a control information related to a digital image
US8363058B2 (en) 2003-04-30 2013-01-29 Hewlett-Packard Development Company, L.P. Producing video and audio-photos from a static digital image
CN104781779A (en) * 2012-11-06 2015-07-15 诺基亚技术有限公司 Method and apparatus for creating motion effect for image
CN112819927A (en) * 2021-02-04 2021-05-18 上海哔哩哔哩科技有限公司 Video generation method and device based on pictures

Families Citing this family (121)

Publication number Priority date Publication date Assignee Title
US5973734A (en) 1997-07-09 1999-10-26 Flashpoint Technology, Inc. Method and apparatus for correcting aspect ratio in a camera graphical user interface
US6317141B1 (en) 1998-12-31 2001-11-13 Flashpoint Technology, Inc. Method and apparatus for editing heterogeneous media objects in a digital imaging device
US6388877B1 (en) 1999-02-04 2002-05-14 Palm, Inc. Handheld computer with open accessory slot
GB2350005B (en) * 1999-03-17 2003-01-08 Canon Kk Image processing apparatus
US20050057500A1 (en) * 1999-12-17 2005-03-17 Bohn David D. Display and pointer manipulation using relative movement to a device
TW592755B (en) * 2000-01-14 2004-06-21 Sony Computer Entertainment Inc Electronic equipment, recording medium and method for changing parameter settings of the electronic equipment or computer
JP4127750B2 (en) * 2000-05-30 2008-07-30 富士フイルム株式会社 Digital camera with music playback function
US6549207B1 (en) * 2000-06-05 2003-04-15 Kenzo Matsumoto Method and apparatus for dissolving image on display screen
US20060064716A1 (en) * 2000-07-24 2006-03-23 Vivcom, Inc. Techniques for navigating multiple video streams
US7015976B1 (en) * 2000-08-07 2006-03-21 Ati International Srl Automatic panning of digital content while zoomed
US6801662B1 (en) * 2000-10-10 2004-10-05 Hrl Laboratories, Llc Sensor fusion architecture for vision-based occupant detection
US20020080274A1 (en) * 2000-12-21 2002-06-27 Gubernick Franklin L. Photograph display system
JP3889233B2 (en) * 2001-03-08 2007-03-07 株式会社モノリス Image encoding method and apparatus, and image decoding method and apparatus
CN1156783C (en) * 2001-03-27 2004-07-07 国际商业机器公司 Server and method for loading advertisement on web-pages, web-pages display device and method
GB2378341B (en) * 2001-07-31 2005-08-24 Hewlett Packard Co Improvements in and relating to dislaying digital images
GB2378342A (en) * 2001-07-31 2003-02-05 Hewlett Packard Co Selecting images based upon the similarity between images
JP2003061052A (en) * 2001-08-09 2003-02-28 Canon Inc Image-reproducing device and image-reproducing method
US20030063102A1 (en) * 2001-10-01 2003-04-03 Gilles Rubinstenn Body image enhancement
US7634103B2 (en) * 2001-10-01 2009-12-15 L'oreal S.A. Analysis using a three-dimensional facial image
US20030065255A1 (en) * 2001-10-01 2003-04-03 Daniela Giacchetti Simulation of an aesthetic feature on a facial image
US7343052B2 (en) * 2002-04-09 2008-03-11 Sonic Solutions End-user-navigable set of zoomed-in images derived from a high-resolution master image
FR2849950B1 (en) * 2003-01-15 2005-03-18 Eastman Kodak Co METHOD FOR DISPLAYING AN IMAGE ENTERED BY A DIGITAL VIEWING APPARATUS
US7882258B1 (en) * 2003-02-05 2011-02-01 Silver Screen Tele-Reality, Inc. System, method, and computer readable medium for creating a video clip
US20040196286A1 (en) * 2003-04-01 2004-10-07 Microsoft Corporation Progressive scale graph
JP4864295B2 (en) * 2003-06-02 2012-02-01 富士フイルム株式会社 Image display system, image display apparatus, and program
US8682097B2 (en) 2006-02-14 2014-03-25 DigitalOptics Corporation Europe Limited Digital image enhancement with reference images
US8494286B2 (en) 2008-02-05 2013-07-23 DigitalOptics Corporation Europe Limited Face detection in mid-shot digital images
US7269292B2 (en) 2003-06-26 2007-09-11 Fotonation Vision Limited Digital image adjustable compression and resolution using face detection information
US7620218B2 (en) 2006-08-11 2009-11-17 Fotonation Ireland Limited Real-time face tracking with reference images
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7844076B2 (en) 2003-06-26 2010-11-30 Fotonation Vision Limited Digital image processing using face detection and skin tone information
US7565030B2 (en) 2003-06-26 2009-07-21 Fotonation Vision Limited Detecting orientation of digital images using face detection information
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US8155397B2 (en) 2007-09-26 2012-04-10 DigitalOptics Corporation Europe Limited Face tracking in a camera processor
US8593542B2 (en) 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US7574016B2 (en) 2003-06-26 2009-08-11 Fotonation Vision Limited Digital image processing using face detection information
US9129381B2 (en) 2003-06-26 2015-09-08 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7792970B2 (en) 2005-06-17 2010-09-07 Fotonation Vision Limited Method for establishing a paired connection between media devices
US7440593B1 (en) 2003-06-26 2008-10-21 Fotonation Vision Limited Method of improving orientation and color balance of digital images using face detection information
US8498452B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
US7471846B2 (en) 2003-06-26 2008-12-30 Fotonation Vision Limited Perfecting the effect of flash within an image acquisition devices using face detection
US8989453B2 (en) 2003-06-26 2015-03-24 Fotonation Limited Digital image processing using face detection information
US8330831B2 (en) 2003-08-05 2012-12-11 DigitalOptics Corporation Europe Limited Method of gathering visual meta data using a reference image
US7209601B2 (en) * 2003-07-22 2007-04-24 Omnivision Technologies, Inc. CMOS image sensor using high frame rate with frame addition and movement compensation
US7532753B2 (en) * 2003-09-29 2009-05-12 Lipsky Scott E Method and system for specifying color of a fill area
US8739060B2 (en) * 2003-09-29 2014-05-27 Eqapez Foundation, L.L.C. Method and system for displaying multiple aspect ratios of a viewport
US20050081247A1 (en) * 2003-09-29 2005-04-14 Lipsky Scott E. Method and system for generating image display plans
US6987522B2 (en) * 2003-09-29 2006-01-17 Beon Media Inc. Method and system for specifying pan speed
US7050072B2 (en) * 2003-09-29 2006-05-23 Galleryplayer, Inc. Method and system for specifying a pan path
US6956589B2 (en) * 2003-09-29 2005-10-18 Beon Media Inc. Method and system for specifying zoom size
US6989848B2 (en) * 2003-09-29 2006-01-24 Beon Media Inc. Method and system for specifying zoom speed
FR2860665B1 (en) * 2003-10-07 2006-04-07 Canon Kk SEQUENCE DECODING OF DIGITAL IMAGES
JP4458811B2 (en) * 2003-10-29 2010-04-28 キヤノン株式会社 Imaging apparatus, display control method, and display control program
WO2005055190A1 (en) * 2003-12-05 2005-06-16 Sharp Kabushiki Kaisha Display data creation device, display automatic operation data creation device, display data creation method, display automatic operation data creation method, display data creation program, display automatic operation data creation program, and computer-readable recording medium containing these programs
JP2005182196A (en) * 2003-12-16 2005-07-07 Canon Inc Image display method and image display device
FR2867343B1 (en) * 2004-03-04 2006-06-30 Eastman Kodak Co METHOD AND APPARATUS FOR SHOOTING FOR THE PRODUCTION OF DYNAMIC EVENTS
US20050206751A1 (en) * 2004-03-19 2005-09-22 Eastman Kodak Company Digital video system for assembling video sequences
US20060004697A1 (en) * 2004-06-09 2006-01-05 Lipsky Scott E Method and system for restricting the display of images
US20060041632A1 (en) * 2004-08-23 2006-02-23 Microsoft Corporation System and method to associate content types in a portable communication device
JP2006094053A (en) * 2004-09-22 2006-04-06 Fuji Photo Film Co Ltd Photo movie creating method and apparatus thereof, and photo movie creating program
JP4581924B2 (en) * 2004-09-29 2010-11-17 株式会社ニコン Image reproducing apparatus and image reproducing program
US8320641B2 (en) 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US7315631B1 (en) 2006-08-11 2008-01-01 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US8503800B2 (en) 2007-03-05 2013-08-06 DigitalOptics Corporation Europe Limited Illumination detection using classifier chains
US20060204214A1 (en) * 2005-03-14 2006-09-14 Microsoft Corporation Picture line audio augmentation
US7444597B2 (en) * 2005-03-18 2008-10-28 Microsoft Corporation Organizing elements on a web page via drag and drop operations
US20060212806A1 (en) * 2005-03-18 2006-09-21 Microsoft Corporation Application of presentation styles to items on a web page
US20060212792A1 (en) * 2005-03-18 2006-09-21 Microsoft Corporation Synchronously publishing a web page and corresponding web page resources
US20060224778A1 (en) * 2005-04-04 2006-10-05 Microsoft Corporation Linked wizards
US9648281B2 (en) * 2005-05-23 2017-05-09 Open Text Sa Ulc System and method for movie segment bookmarking and sharing
US8724969B2 (en) 2005-05-23 2014-05-13 Open Text S.A. Method, system and computer program product for editing movies in distributed scalable media environment
US8141111B2 (en) 2005-05-23 2012-03-20 Open Text S.A. Movie advertising playback techniques
US8145528B2 (en) 2005-05-23 2012-03-27 Open Text S.A. Movie advertising placement optimization based on behavior and content analysis
US8339420B2 (en) * 2005-06-30 2012-12-25 Panasonic Corporation Method and apparatus for producing size-appropriate images to be displayed by an electronic device with a small display area
US8803886B2 (en) * 2005-08-12 2014-08-12 Sony Corporation Face image display, face image display method, and face image display program
US20070081303A1 (en) * 2005-10-11 2007-04-12 Lawrence Lam Recess housing feature for computing devices
US7911482B1 (en) * 2006-01-06 2011-03-22 Videomining Corporation Method and system for efficient annotation of object trajectories in image sequences
CA2654960A1 (en) 2006-04-10 2008-12-24 Avaworks Incorporated Do-it-yourself photo realistic talking head creation system and method
WO2007124138A2 (en) * 2006-04-18 2007-11-01 Sorensen Associates Inc Still image queue analysis system and method
US20070256023A1 (en) * 2006-04-28 2007-11-01 Microsoft Corporation Demonstration scripting using random-access frame presentation
DE602007012246D1 (en) 2006-06-12 2011-03-10 Tessera Tech Ireland Ltd PROGRESS IN EXTENDING THE AAM TECHNIQUES FROM GRAY CALENDAR TO COLOR PICTURES
KR101310823B1 (en) * 2006-06-20 2013-09-25 삼성전자주식회사 Method for controlling digital photographing apparatus, and digital photographing apparatus adopting the method
US7916897B2 (en) 2006-08-11 2011-03-29 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
US7403643B2 (en) 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
JP4958499B2 (en) * 2006-08-18 2012-06-20 株式会社ソニー・コンピュータエンタテインメント Image display control device, image display method, and program
US9224145B1 (en) 2006-08-30 2015-12-29 Qurio Holdings, Inc. Venue based digital rights using capture device with digital watermarking capability
US20080056548A1 (en) * 2006-09-05 2008-03-06 Pablo Irarrazaval Enhancement of visual perception through dynamic cues
US20080055315A1 (en) * 2006-09-05 2008-03-06 Dale Ducharme Method and System to Establish and Animate a Coordinate System for Content on a Display
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
JP4854539B2 (en) * 2007-02-21 2012-01-18 キヤノン株式会社 Image processing apparatus, control method thereof, and program
JP5049356B2 (en) 2007-02-28 2012-10-17 デジタルオプティックス・コーポレイション・ヨーロッパ・リミテッド Separation of directional lighting variability in statistical face modeling based on texture space decomposition
WO2008107002A1 (en) 2007-03-05 2008-09-12 Fotonation Vision Limited Face searching and detection in a digital image acquisition device
US7734161B2 (en) * 2007-04-19 2010-06-08 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Image stabilization with adaptive shutter control
US7916971B2 (en) 2007-05-24 2011-03-29 Tessera Technologies Ireland Limited Image processing method and apparatus
US8054310B2 (en) * 2007-06-18 2011-11-08 International Business Machines Corporation Recasting a legacy web page as a motion picture with audio
US20090006965A1 (en) * 2007-06-26 2009-01-01 Bodin William K Assisting A User In Editing A Motion Picture With Audio Recast Of A Legacy Web Page
US7945847B2 (en) * 2007-06-26 2011-05-17 International Business Machines Corporation Recasting search engine results as a motion picture with audio
JP4780053B2 (en) * 2007-07-23 2011-09-28 ソニー株式会社 Image processing apparatus, image processing method, and program
US20090141436A1 (en) * 2007-11-30 2009-06-04 Yoshimichi Matsuoka Trim element for housing of computing device
US7855737B2 (en) 2008-03-26 2010-12-21 Fotonation Ireland Limited Method of making a digital camera image of a scene including the camera user
US20090295787A1 (en) * 2008-06-02 2009-12-03 Amlogic, Inc. Methods for Displaying Objects of Interest on a Digital Display Device
CN102027505A (en) 2008-07-30 2011-04-20 泰塞拉技术爱尔兰公司 Automatic face and skin beautification using face detection
US8074181B2 (en) * 2008-09-15 2011-12-06 Microsoft Corporation Screen magnifier panning model with dynamically resizable panning regions
CN101783129B (en) * 2009-01-04 2012-12-05 虹软(杭州)科技有限公司 Image scaling device and image scaling method
US8379917B2 (en) 2009-10-02 2013-02-19 DigitalOptics Corporation Europe Limited Face recognition performance using additional image features
JP4868075B2 (en) * 2009-10-09 2012-02-01 株式会社ニコン Imaging device
JP2013003596A (en) * 2011-06-10 2013-01-07 Sony Corp Information processing apparatus, program, and information processing method
US20130002858A1 (en) * 2011-06-28 2013-01-03 Bridge Robert F Mechanisms for Conserving Power in a Compressive Imaging System
KR101910110B1 (en) * 2011-09-26 2018-12-31 삼성디스플레이 주식회사 Display device and driving method thereof
WO2015130270A1 (en) * 2014-02-26 2015-09-03 Empire Technology Development Llc Photo and document integration
US20150277715A1 (en) * 2014-04-01 2015-10-01 Microsoft Corporation Content display with contextual zoom focus
US10042529B2 (en) * 2014-04-01 2018-08-07 Microsoft Technology Licensing, Llc Content display with dynamic zoom focus
JP5768172B2 (en) * 2014-08-04 2015-08-26 富士フイルム株式会社 Image display control method and apparatus, and image pickup apparatus
US9607243B1 (en) 2014-10-10 2017-03-28 Google Inc. Time-lapsed image sequence generation
JP6372696B2 (en) * 2014-10-14 2018-08-15 ソニー株式会社 Information processing apparatus, information processing method, and program
RU2698158C1 (en) * 2016-06-30 2019-08-22 Абракадабра Реклам Ве Яйинджылык Лимитед Сыркеты Digital multimedia platform for converting video objects into multimedia objects presented in a game form
US10609284B2 (en) * 2016-10-22 2020-03-31 Microsoft Technology Licensing, Llc Controlling generation of hyperlapse from wide-angled, panoramic videos
US11012748B2 (en) 2018-09-19 2021-05-18 International Business Machines Corporation Dynamically providing customized versions of video content
CN112995536A (en) * 2021-02-04 2021-06-18 上海哔哩哔哩科技有限公司 Video synthesis method and system
CN112995533A (en) * 2021-02-04 2021-06-18 上海哔哩哔哩科技有限公司 Video production method and device

Citations (4)

Publication number Priority date Publication date Assignee Title
GB2160748A (en) * 1984-06-22 1985-12-24 Micro Consultants Ltd Graphic simulation system
US4750888A (en) * 1983-12-15 1988-06-14 Giravions Dorand Method and device for training in the operation of moving vehicles
JPH05207502A (en) * 1992-01-29 1993-08-13 Nippon Hoso Kyokai <Nhk> Video image synthesis system
JPH06133221A (en) * 1992-10-14 1994-05-13 Sony Corp Image pickup device

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US4694345A (en) * 1985-04-11 1987-09-15 Rank Cintel Limited Video signals special effects generator with variable pixel size
US5109278A (en) * 1990-07-06 1992-04-28 Commonwealth Edison Company Auto freeze frame display for intrusion monitoring system
US5249056A (en) * 1991-07-16 1993-09-28 Sony Corporation Of America Apparatus for generating video signals from film
US5657402A (en) * 1991-11-01 1997-08-12 Massachusetts Institute Of Technology Method of creating a high resolution still image using a plurality of images and apparatus for practice of the method
US5542037A (en) * 1992-08-24 1996-07-30 Casio Computer Co., Ltd. Image displaying apparatus wherein selected stored image data is combined and the combined image data is displayed
JP2813728B2 (en) * 1993-11-01 1998-10-22 インターナショナル・ビジネス・マシーンズ・コーポレイション Personal communication device with zoom / pan function
US5767845A (en) * 1994-08-10 1998-06-16 Matsushita Electric Industrial Co. Multi-media information record device, and a multi-media information playback device
ATE280978T1 (en) * 1997-01-30 2004-11-15 Yissum Res Dev Co MOSAIC IMAGE PROCESSING SYSTEM
US6232973B1 (en) * 1998-08-07 2001-05-15 Hewlett-Packard Company Appliance and method for navigating among multiple captured images and functional menus

Cited By (29)

Publication number Priority date Publication date Assignee Title
GB2372658A (en) * 2001-02-23 2002-08-28 Hewlett Packard Co A method of creating moving video data from a static image
US6930687B2 (en) 2001-02-23 2005-08-16 Hewlett-Packard Development Company, L.P. Method of displaying a digital image
EP1235182A2 (en) * 2001-02-23 2002-08-28 Hewlett-Packard Company Motion picture generation from a static digital image
EP1235182A3 (en) * 2001-02-23 2011-03-30 Hewlett-Packard Company Motion picture generation from a static digital image
EP1463001A2 (en) 2003-03-27 2004-09-29 Victor Company of Japan, Ltd. Computer method for generating a display picture
US7436408B2 (en) 2003-03-27 2008-10-14 Victor Company Of Japan, Ltd. Computer program for generating pictures
EP1463001A3 (en) * 2003-03-27 2009-01-07 Victor Company of Japan, Ltd. Computer method for generating a display picture
US8363058B2 (en) 2003-04-30 2013-01-29 Hewlett-Packard Development Company, L.P. Producing video and audio-photos from a static digital image
EP1648173A2 (en) 2004-10-06 2006-04-19 Microsoft Corporation Creation of image based video using step-images
EP1648173A3 (en) * 2004-10-06 2009-04-01 Microsoft Corporation Creation of image based video using step-images
KR101203247B1 (en) 2004-10-06 2012-11-20 마이크로소프트 코포레이션 Creation of image based video using step-images
EP2283642A4 (en) * 2008-06-11 2011-06-15 Nokia Corp Method, apparatus, and computer program product for presenting burst images
EP2283642A1 (en) * 2008-06-11 2011-02-16 Nokia Corporation Method, apparatus, and computer program product for presenting burst images
US9013592B2 (en) 2008-06-11 2015-04-21 Nokia Corporation Method, apparatus, and computer program product for presenting burst images
US8497920B2 (en) 2008-06-11 2013-07-30 Nokia Corporation Method, apparatus, and computer program product for presenting burst images
CN102301400B (en) * 2008-12-01 2016-01-20 维卓多密有限公司 Browse the method and system of the vision content shown in virtual three-dimensional space
WO2010064236A1 (en) * 2008-12-01 2010-06-10 Visual Domains Ltd. Method and system for browsing visual content displayed in a virtual three-dimensional space
JP2011090258A (en) * 2009-10-26 2011-05-06 Fujifilm Corp Wide-angle image display control method, device for the same, and wide-angle image taking device
EP2405639A1 (en) * 2010-07-07 2012-01-11 Sony Corporation Display control device, display control method, and program
CN102316266A (en) * 2010-07-07 2012-01-11 索尼公司 Display control device, display control method, and program
US10447874B2 (en) 2010-07-07 2019-10-15 Sony Corporation Display control device and display control method for automatic display of an image
JP2012050050A (en) * 2010-08-27 2012-03-08 Fuji Mach Mfg Co Ltd Image display system and image display method
WO2012031767A1 (en) * 2010-09-10 2012-03-15 Deutsche Telekom Ag Method and system for obtaining a control information related to a digital image
CN104781779A (en) * 2012-11-06 2015-07-15 诺基亚技术有限公司 Method and apparatus for creating motion effect for image
EP2917820A4 (en) * 2012-11-06 2016-07-20 Nokia Technologies Oy Method and apparatus for creating motion effect for image
US9883117B2 (en) 2012-11-06 2018-01-30 Nokia Technologies Oy Method and apparatus for creating motion effect for image
CN104781779B (en) * 2012-11-06 2018-06-15 诺基亚技术有限公司 For creating the method and apparatus of the movement effects for image
US10212365B2 (en) 2012-11-06 2019-02-19 Nokia Technologies Oy Method and apparatus for creating motion effect for image
CN112819927A (en) * 2021-02-04 2021-05-18 上海哔哩哔哩科技有限公司 Video generation method and device based on pictures

Also Published As

Publication number Publication date
US6362850B1 (en) 2002-03-26
US6587119B1 (en) 2003-07-01
AU5254599A (en) 2000-02-28

Similar Documents

Publication Publication Date Title
US6362850B1 (en) Interactive movie creation from one or more still images in a digital imaging device
US6683649B1 (en) Method and apparatus for creating a multimedia presentation from heterogeneous media objects in a digital imaging device
US7337403B2 (en) Method and apparatus for editing heterogeneous media objects in a digital imaging device
US6738075B1 (en) Method and apparatus for creating an interactive slide show in a digital imaging device
JP4250543B2 (en) Imaging apparatus, information processing apparatus, and control method thereof
JP4799009B2 (en) Image processing apparatus and method
US8436920B2 (en) Camera apparatus with magnified playback features
KR100478612B1 (en) Moving image playback apparatus and method thereof having multi-picture playback function
JP2012242821A (en) Display image generation method
US9025936B2 (en) Video processing apparatus, method of adding time code, and methode of preparing editing list
JP4802524B2 (en) Image processing apparatus, camera system, video system, network data system, and image processing method
JP2006303896A (en) Camera system for changing over display of reduced guide image in reproducing image by magnification, and image display method
JP2009147824A (en) Imaging apparatus and imaging method
JP2003324678A (en) Image processing unit, image processing system, image processing method, storage medium, and program
JP2006287596A (en) Camera apparatus with animation reproduction function in a plurality of image selection screens
JP2006203334A (en) Image recording apparatus, control method thereof and program
JP5663920B2 (en) Electronic device and panorama image display program
JP5066878B2 (en) Camera and display system
JPH11168685A (en) Image processing method
US8774603B2 (en) Information processing apparatus, information processing method, program, and recording medium
JP5055685B2 (en) Imaging apparatus, screen display method, and user interface
JP2002281382A (en) Image processor and image processing method
JP4408397B2 (en) Imaging apparatus, imaging method, and program
JP2001238154A (en) Moving picture display device
JP4710767B2 (en) Information processing apparatus and method, and program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA CN IL JP KR MX

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase