US20070206024A1 - System and method for smooth pointing of objects during a presentation - Google Patents


Info

Publication number
US20070206024A1
Authority
US
United States
Prior art keywords
displayed image
color
line
identified portion
identifying
Prior art date
Legal status
Abandoned
Application number
US11/367,518
Inventor
Ravishankar Rao
Current Assignee
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date
Filing date
Publication date
Application filed by International Business Machines Corp
Priority to US11/367,518
Assigned to IBM Corporation (assignor: RAO, RAVISHANKAR)
Publication of US20070206024A1
Priority to US12/172,222 (published as US8159501B2)
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • the present invention is directed to the field of interface devices for computer users, and more particularly to the use of pointing devices during presentations.
  • images are displayed on a monitor or projected on a screen for audience viewing.
  • a speaker calls attention to specific portions of the displayed images using pointing devices such as illuminated arrows and laser pointers.
  • Although laser pointers are generally the preferred pointing devices, there are various limitations associated with their use.
  • the size of the projected image is typically large, especially in auditorium settings, and the speaker is positioned relatively close to the projected image. At this relatively close proximity to the projected image, the speaker cannot easily view the entire image, and there is significant perspective distortion when viewing projected images up close. Therefore, the speaker cannot easily view the presentation and must use a small computer screen on which the images are simultaneously displayed, alternating attention between the computer screen and the projected image.
  • Because laser pointers are typically held in the hand of the speaker and manually moved to the desired locations on the projected images, they are susceptible to motion jitter, causing inaccurate pointing and viewing problems for the audience.
  • the speaker has to face the projected image to ensure that the correct areas of the screen are selected using the laser pointer, preventing eye contact between the speaker and the audience.
  • these physical highlighting systems are only applicable to presentations given at a single location and not to presentations made simultaneously to multiple distributed locations, for example across a computer network or through a video conference.
  • U.S. Pat. No. 6,747,636 is directed to a display system for overlaying graphics applied to a display surface onto the graphics output of an application running in the foreground, and to allow user input to the application program by means of a touch sensitive screen.
  • a completely interactive display system is provided in which the user can highlight and edit information generated by the computer program by simply drawing on the touch sensitive screen.
  • when a user selectively applies pressure in the form of a point, line or script to the surface of a touch screen, information is conveyed by the touch screen to a computer, which in response updates the image projected by the computer screen with the user script and reproduces such information at a sufficient rate that user drawings on the board become interactive.
  • the computer can interpret the user's touch input, and in response emulate operation of the mouse, a light pen or a digital command, for conveying information to the application program being executed.
  • User script is overlaid onto the image, causing the image to be updated to include the user drawing, for example, points, lines, or script.
  • the system does not attempt to interpret the drawings made by the user or to correct errors or waviness in the points, lines or script entered by the user on the touch screen.
  • the system does not provide for enhancing contrast between the drawings entered by the user and the information displayed by the computer on the LCD screen in order to enhance the ability of the viewing audience to identify the highlighted areas of a projected image.
  • U.S. Pat. No. 6,538,643 is directed to a method and system for presenting slides in which touch pad portions of a touch pad are mapped to pre-selected parts of a presented slide such that a pre-selected part of the present slide is highlighted in response to the corresponding touchpad portion being touched.
  • to highlight portions of the slide, color change, outlining or other ways of distinguishing the selected portions of the slide can be used.
  • An operator generates slides to be displayed on a display screen and associates highlighting techniques, such as color change, blinking or brightening, animated builds, check marks or underlining with a pre-selected part, point, portion or attribute of the slides for drawing audience attention to the pre-selected parts of the slides.
  • the operator may then touch a touch pad portion of touch pad to execute the highlighting of a corresponding pre-selected part of the slide.
  • This system requires pre-selection of areas of a presentation slide, association of a pre-determined highlighting technique with each pre-selected area and mapping each pre-selected area to a specific portion of the touchpad device.
  • the system does not provide for random, real-time user input of highlighting during a presentation or for the selection of random areas of the presentation slide outside of the pre-selected areas.
  • the system does not maximize the highlighting contrast between the pre-selected areas of the presentation slide and the non-selected areas of the presentation slide.
  • U.S. Pat. No. 5,428,417 is directed to a visual lecture aid that provides a lecturer with a remote-controlled touch panel assembly, which makes available any one of a group of stored graphic icon pointers and highlighters that can be seen by the lecturer on a display panel. These icon pointers and highlighters include highlighting overstrikes and highlighting squares.
  • the lecturer utilizes the remote-controlled touch panel to select each of the pointing and highlighting icons by touch selection from a menu of displayed icons on the display panel and sequentially positions each icon and overstrike highlighting line by a simple finger movement over the display panel and subsequently freezes the final position of each pointing icon by touching a displayed freeze button.
  • both projected color selection and brightness control are available to the lecturer from a remote controlled display panel.
  • This system only provides for the selection of specific predefined graphics and does not allow for entry or recognition of user-defined graphics or shapes on the touch screen.
  • the color selections and brightness are predefined within the system, and the user must select from one of these predefined colors. The colors, however, are not created or selected to maximize the contrast of the display screen over which the selected graphics are displayed.
  • Suitable systems will provide for random and real-time user selection of highlighted regions of a selected image during the course of the presentation while permitting the speaker to generally maintain focus and eye-contact with the audience. Color contrast of the highlighted area will be selected to maximize contrast with the portions of image to be highlighted.
  • the system can work with computer generated highlights and with conventional laser pointers.
  • Exemplary embodiments in accordance with the present invention are directed to systems and methods that allow a speaker or presenter, while facing the audience and during a presentation, to select areas of a projected image for highlighting and to highlight the selected images on the projected image. Highlighting is accomplished by using lines, areas or figures that are superimposed onto the projected image or by controlling the movement of a pointing device such as a laser pointer.
  • the exemplary systems and methods of the present invention reduce or eliminate random jitter or jagged lines in the displayed highlight or controlled pointing device using, for example, motion smoothing applied to indications made by the presenter. For example when the projected highlight is a traced path, the trajectory of the path is rendered with a smooth curve before projection onto the displayed image.
  • a system in accordance with the present invention includes a touch-sensitive display monitor, i.e. touch screen, that displays the projected image to the speaker and that accepts entry of user-selected indication of the areas of the projected image to be highlighted.
  • the speaker can contact regions of the touch screen corresponding to the areas of the projected image that are to be highlighted, can draw boxes, circles or other geometric shapes around the area of the image to be highlighted or can trace paths between any two objects within the projected image.
  • the regions of the touch screen monitor that are contacted by the speaker are formulated as lines, areas or cursors that are projected on the displayed image.
  • the color, brightness and opacity of the projected lines, areas or cursors are adapted to contrast with the existing color of the displayed image in the region of the line, area or cursor while still providing for an unobstructed view of the information contained in the projected image.
  • the area of the displayed image that is touched by the speaker is rendered through a color that is contrasting to the average background color around the area of the image that is touched or selected by the speaker.
  • FIG. 1 is a schematic representation of an embodiment of a presentation system for use with methods for smooth pointing of objects during a presentation in accordance with the present invention
  • FIG. 2 is a schematic representation of an embodiment of a method for smooth pointing of objects during a presentation in accordance with the present invention
  • FIG. 3 is a schematic representation of another embodiment of a method for smooth pointing of objects during a presentation in accordance with the present invention
  • FIG. 4 is a representation of a display screen in accordance with an embodiment of the present invention.
  • FIG. 5 is an embodiment of a non-rectilinear line entered in accordance with the present invention.
  • FIG. 6 is an embodiment of a corrected non-rectilinear line corresponding to the non-rectilinear line of FIG. 5 ;
  • FIG. 7 is an embodiment of a perimeter of a two-dimensional space entered in accordance with the present invention.
  • FIG. 8 is an embodiment of a corrected perimeter corresponding to the perimeter of FIG. 7 ;
  • FIG. 9 is an example of one embodiment of the method in accordance with the present invention.
  • FIG. 10 is an example of another embodiment of the method in accordance with the present invention.
  • Referring to FIG. 1 , an exemplary embodiment of a presentation system 10 for use with exemplary methods in accordance with the present invention is illustrated.
  • Systems and methods in accordance with the present invention can be used in any arrangement where one or more speakers are presenting information or data to multiple recipients located in one or more physical locations. Suitable arrangements include, but are not limited to, video conferences, lectures, distance learning programs, Internet based programs, conference lectures and classroom lectures.
  • the presentation system 10 includes at least one control system 26 for controlling or conducting the presentation.
  • the control system 26 is capable of generating and storing data or information to be displayed during the course of the presentation.
  • control systems include, but are not limited to, computer systems such as laptop computers, desktop computers and servers, programmable logic controllers, EEPROM's, single function control systems specifically created to conduct presentations and combinations thereof.
  • the system 10 also includes a plurality of presentation display platforms 30 in communication with the control system 26 .
  • the control system 26 controls the display of the presentation data on these display platforms.
  • the display platforms 30 are in direct communication with the control system 26 and controlled thereby without any intermediate controllers or processors or are in contact with the control system across one or more networks 28 or through additional nodes (not shown).
  • Suitable networks include any type of local area network or wide area network known and available in the art.
  • Suitable display platforms are capable of displaying images 22 stored in electronic, machine-readable format and communicated to the display platform 30 by the presentation control system 26 .
  • Suitable display platforms include, but are not limited to, computer monitors 34 , including cathode ray tubes (CRT's), plasma displays and liquid crystal displays (LCD's), and projection type displays that include a projection screen 36 and a projection mechanism 24 , for example projectors or LCD's used in combination with overhead projectors.
  • a suitable projection type display is the Epson EMP-732, commercially available from the Seiko Epson Corporation of 3-3-5 Owa, Suwa, Nagano, Japan.
  • the system 10 includes an interaction mechanism 12 that allows the speaker to control the presentation, for example to control the selection of images to be displayed on the display platforms.
  • the interaction mechanism facilitates user-defined input into the displayed images during the presentation, for example highlighting selected portions of the displayed images in real-time during the presentation. These selected highlights are shown or projected on the displayed images for viewing by the audience.
  • the presentation interaction mechanism 12 is in communication with the presentation control system 26 and includes a presentation monitor 14 .
  • the presentation interaction mechanism 12 is included in the presentation control system 26 .
  • the presentation monitor 14 is independent of and separate from the display platforms, although in one embodiment the presentation monitor and a display platform can be combined.
  • the presentation monitor is capable of displaying the images 22 provided by the control system. Any monitor capable of displaying electronic or computer-generated images can be used. Suitable presentation monitors are known and available in the art and include the same types of devices that can be used as display platforms.
  • the presentation monitor 14 is positioned so that it is viewable by the presenter or speaker, and in particular is viewable by the speaker such that the speaker maintains eye contact with the audience during the presentation.
  • the presentation control system simultaneously displays the images on one or more presentation platforms and the presentation monitor.
  • the presentation interaction mechanism 12 also includes at least one pointing device or input mechanism 16 that allows the speaker or user to manually annotate a displayed image in real time by indicating or drawing generally straight lines, non-rectilinear lines, i.e. curves, two-dimensional objects, for example the perimeter of a geographic shape, alpha-numeric annotations and combinations thereof on the image displayed on the presentation monitor and the presentation platforms.
  • a plurality of input mechanisms 16 can also be provided, for example where each input mechanism is arranged to facilitate a specific type of user-defined input.
  • the input mechanism 16 is in communication with the control system 26 .
  • the input mechanism 16 is in communication with but separate from the presentation monitor 14 .
  • the input mechanism 16 and presentation monitor 14 are integrated into a single device, e.g. a touch-sensitive display screen. In one embodiment, this single device is the control system.
  • Suitable user interaction mechanisms include, but are not limited to, any mechanism known and available in the art that permits manual entry of user input into a computer-generated display field, for example a point-and-click device such as a computer mouse or trackball, a pressure pad, a touch screen and a pressure tablet used in combination with a stylus 18 .
  • the input mechanism 16 is a touch screen, for example one that is part of a general-purpose computer system. Suitable touch screens are commercially available as add-on touch screens, for example from Mass Multimedia Inc. (touchscreens.com) of Colorado Springs, Colo.
  • the input mechanism 16 facilitates manual selection of a user-identified portion 20 of the displayed image.
  • This user-identified portion 20 is then displayed as a highlighted portion 21 of the displayed image on all the display platforms.
  • the highlighted portion 21 is displayed on the image by the control system using computer-generated graphics.
  • the control system is in communication with an optical pointer, e.g. a laser pointer 36 , through a motorized or mechanical control mechanism 32 .
  • the control system 26 through the mechanical control mechanism 32 moves the laser pointer 36 to trace the user-defined highlighted area 21 on the displayed image 22 .
  • Suitable mechanical control mechanisms, laser pointers and control software are known and available in the art.
  • An image or slide from a presentation containing at least one, and preferably a plurality of, images is displayed 52 on at least one display platform.
  • the image is displayed simultaneously on a plurality of display platforms.
  • These display platforms can be located in a single physical or geographic location, for example an auditorium, conference room or lecture hall, or at a plurality of distributed geographic locations, for example a plurality of offices located at various locations across a country or throughout the world.
  • Suitable methods for displaying the image include any method for displaying an electronically generated or stored image to be viewed by either multiple persons in one location or by multiple people located at multiple locations.
  • the images are displayed by a control system, e.g. a computer, on a CRT, LCD, plasma or projection display in communication with the control system and of sufficient size to be viewed by the presentation audience.
  • the image is displayed simultaneously on a monitoring screen 54 that is separate from and independent of the display platform.
  • the monitoring screen is in communication with the control system and is capable of receiving electronically generated images from the control system for display.
  • Suitable monitoring screens include, but are not limited to, any type of computer monitor known and available in the art including desktop, laptop and handheld monitors.
  • the monitoring screen is positioned to face the speaker such that the speaker can maintain eye contact with the audience during the presentation.
  • the monitoring screen is suitably sized for viewing by the speaker.
  • the monitoring screen is also one of the display platforms.
  • the monitoring screen is a touch sensitive screen that provides for both speaker monitoring of the presentation and speaker input. Any touch sensitive screens known and available in the art can be used as the monitoring screen.
  • the speaker identifies at least one portion of the displayed image to be highlighted 56 .
  • the speaker can identify a plurality of portions of the displayed image to be highlighted. By identifying portions of the displayed image to be highlighted on the display platforms during a presentation, the speaker draws attention to specific portions of each displayed image and highlights aspects of the presentation that correlate the speaker's comments with the information and data provided on the displayed image. Suitable identified portions include, but are not limited to, single points, lines 80 ( FIG. 4 ), non-rectilinear lines ( FIGS. 5 and 6 ) and two-dimensional shapes or areas ( FIGS. 7 and 8 ). These two-dimensional shapes include geometric shapes and objects, e.g. arrows.
  • the identified portion can be viewed as a line drawn on or across the displayed image, for example a straight line, a non-rectilinear line or a line defining a perimeter of a geometric or two-dimensional shape.
  • a non-rectilinear line includes any curved line, including simple curves, compound curves and curves that form objects embedded in text, such as brackets and parentheses.
  • identifying the portion of the displayed image includes drawing a line 80 , i.e. generally straight line, between a first object 78 on the displayed image 22 and one of a plurality of second objects 79 .
  • identifying the portion of the displayed image includes drawing a non-rectilinear line 82 on the displayed image 22 .
  • identifying the portion of the displayed image includes drawing a two-dimensional shape 86 on the displayed image 22 .
  • the two-dimensional shape can be a geometric shape and the step of drawing the two-dimensional shape includes manually drawing at least a portion of the perimeter of that shape.
  • the perimeter is indicated substantially around the two-dimensional shape on the displayed image so that the desired two-dimensional shape is adequately indicated.
  • any method for providing user-defined input into a computer-based system can be used by the speaker to identify the desired portion of the displayed image.
  • the speaker uses a pointing or input mechanism in communication with both the monitoring screen and the control system to identify the desired portion of the displayed image.
  • Suitable pointing mechanisms include, but are not limited to, point and click mechanisms such as a computer mouse, trackball or a touchpad. These point and click devices can be wired or wireless devices.
  • Other suitable pointing mechanisms include touch sensitive screens, wherein a touch sensitive plate is placed on a display screen such that points on the touch sensitive plate correspond to locations on the display screen and therefore to images displayed on the display screen. These points can be touched directly by using a finger or by using another device such as a stylus.
  • the pointing mechanism allows the speaker to manually identify the desired portion of the displayed image in real-time during a presentation.
  • the user-defined portion is displayed on the monitoring screen as entered by the speaker.
  • the speaker interaction mechanism 12 includes a touch sensitive screen acting as both the monitoring screen and the pointing mechanism
  • the speaker touches the touch sensitive screen at one or more locations corresponding to the desired portion of the displayed image. For example, the speaker touches the touch screen at a location corresponding to a particular object within the displayed image or draws a line, non-rectilinear line, or shape on the touch screen in a location that corresponds to the desired location on the displayed image.
  • the identified portion of the displayed image is manually entered by hand or by a hand-held device, the line, curve or shape entered will often contain imperfections or variations, for example waviness in the lines. However, these variations in the initial trajectory of the identified portion are undesirable. Waviness in the entered line results in blurriness when that line is shown on one of the display platforms. In addition, undesirable variations reduce the clarity of the identified portion. Therefore, in one embodiment the identified portion is analyzed to determine if undesirable variations in that identified portion need to be corrected 58 . In one embodiment, identification of whether or not undesirable variations need to be corrected is conducted by the control system. The undesirable variations can be predefined, for example by the speaker. For example, a sudden jump in the coordinates of a line that is drawn can be considered to be an undesirable variation. If undesirable variations exist in the identified portion or, alternatively, variations exist in the identified portion that exceed a predefined limit, then the variations are corrected in the identified portion 60 , resulting in a modified portion. This modified portion is used to highlight the displayed image.
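The "sudden jump in the coordinates" check described above can be sketched as a simple distance threshold between consecutive touch samples. The function name and the 25-pixel threshold are illustrative assumptions, not values taken from the patent.

```python
import math

def needs_correction(points, max_jump=25.0):
    """Return True if any two consecutive touch samples are farther
    apart than max_jump pixels, a crude stand-in for the patent's
    'sudden jump in the coordinates' test.  points is a list of
    (x, y) tuples in screen coordinates."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if math.hypot(x1 - x0, y1 - y0) > max_jump:
            return True
    return False
```

A stroke such as `[(0, 0), (1, 1), (2, 2), (60, 60)]` would be flagged for correction, while a smooth, closely sampled stroke would pass through unmodified.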
  • the identified portion is a substantially straight line across the displayed image that is manually drawn by the speaker across the displayed image using the interaction mechanisms as defined herein.
  • the undesirable variations, i.e. waviness, motion jitter and unintended curves, are corrected in the manually drawn line, creating a modified line.
  • the identified portion is a non-rectilinear line 82 , and in particular a compound curve. Correction of undesirable variations involves applying a curve smoothing algorithm to as-drawn non-rectilinear line 82 resulting in a modified non-rectilinear line 84 ( FIG. 6 ).
  • the shape is entered by manually indicating at least a portion of the perimeter of a two-dimensional shape on the displayed image. Undesirable variations are corrected by correcting the perimeter around the two-dimensional shape.
  • the two-dimensional shape is a geometric shape, for example a circle, ellipse, square, rectangle or triangle. Referring to FIGS. 7 and 8 , the perimeter 86 ( FIG. 7 ) is entered manually by the speaker resulting in variations including a wavy line and an incomplete perimeter, i.e. the circle is not completely closed. These variations are removed from the entered perimeter to create a modified perimeter 88 ( FIG. 8 ).
  • the entered perimeter is substantially the perimeter of a regular geometric shape, e.g. a circle.
  • correction of undesirable variations in the identified portion includes selecting a geometric shape from a predetermined list of predetermined shapes that matches the shape entered by the speaker. In one embodiment, this is performed by using template matching methods, which return the closest match to the drawn object.
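The patent does not specify a particular template-matching method. As a toy illustration only, a drawn closed stroke can be matched against a small shape library by comparing its isoperimetric ratio 4πA/P², which is 1 for a circle and π/4 for a square; every name below is hypothetical.

```python
import math

def circularity(points):
    """Isoperimetric ratio 4*pi*A / P^2 of a closed polygon:
    1.0 for a circle, pi/4 (~0.785) for a square."""
    area, perim = 0.0, 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        area += x0 * y1 - x1 * y0          # shoelace formula
        perim += math.hypot(x1 - x0, y1 - y0)
    area = abs(area) / 2.0
    return 4.0 * math.pi * area / (perim * perim)

def match_template(points):
    """Return the template whose ideal circularity is closest to
    that of the drawn stroke (hypothetical two-shape library)."""
    templates = {"circle": 1.0, "square": math.pi / 4.0}
    c = circularity(points)
    return min(templates, key=lambda name: abs(templates[name] - c))
```

A real system would use a richer descriptor (e.g. more templates and rotation-invariant features), but the structure of returning the closest match to the drawn object is the same.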
  • a suitable curve-smoothing algorithm employs piecewise parametric cubics and is described in M. Plass and M. Stone, “Curve-Fitting with Piecewise Parametric Cubics”, SIGGRAPH 1983: Proceedings of the 10th Annual Conference on Computer Graphics and Interactive Techniques, pages 229-239, ACM Press.
  • the curve-fitting or curve-smoothing algorithm can be applied after the presenter specifies a trajectory on the touch screen, and uses all the points that recorded the presenter's touch. Alternately, the curve-fitting algorithm is applied at fixed time intervals, e.g. every second, to the points that record the presenter's touch during this interval.
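The Plass and Stone piecewise-cubic fit is fairly involved; as a lightweight stand-in, the recorded touch points can be smoothed with a centered moving average, which likewise damps high-frequency jitter in the trajectory. The window size here is an arbitrary assumption.

```python
def smooth_stroke(points, window=5):
    """Smooth a recorded touch trajectory with a centered moving
    average over `window` samples.  This is a simple stand-in for
    the piecewise parametric cubic fit cited in the patent, not an
    implementation of it."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out
```

As the text notes, this can be applied once to the full stroke after the presenter lifts the stylus, or repeatedly to the points recorded during each fixed time interval.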
  • the determined color is the color of the displayed image in an area covered or occupied by the identified portion.
  • the determined color is any color of the displayed image in the region or portion covered by that line.
  • the determined color is the color of the displayed image in a region immediately adjacent to the identified portion, e.g. on either side of the line. This determination of a color in the displayed image can be used for any type of identified portion, including lines, non-rectilinear lines and the perimeters of two-dimensional shapes.
  • the determined color is selected from any color of the displayed image located within the area of the two-dimensional shape, within the portion or region of the displayed image located under the perimeter, within a portion or region of the displayed image adjacent to the perimeter and combinations thereof.
  • two or more colors are identified in the relevant portions of the displayed image.
  • the average color of the displayed image in portions that are covered by or adjacent to the identified portions is identified. Suitable methods for identifying the average color of the displayed image are known and available in the literature and are made, for example, on a pixel-by-pixel basis.
  • the hue saturation value (HSV) is determined for the displayed image in the relevant portions or regions of the displayed image, yielding an average HSV or a predominant HSV in the desired regions of the displayed image indicated by the identified portion. This determination of color is made in real-time and is based on the image currently displayed at the time that the identified portion is selected by the speaker.
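The average HSV determination described above might look like the following sketch, which averages the pixels in RGB and converts the result once with the standard-library colorsys module. Averaging in RGB first is a deliberate choice: hue is circular, so averaging hue values directly would be wrong (red at hue 0.99 and red at hue 0.01 would average to cyan).

```python
import colorsys

def average_hsv(pixels):
    """Average the RGB values of the pixels covered by (or adjacent
    to) the identified portion, then convert to HSV.  `pixels` is a
    list of (r, g, b) tuples with components in 0..255; the return
    value is (h, s, v) with components in 0..1."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
```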
  • a contrasting color to the determined color is identified 64 .
  • the Commission Internationale De L'Eclairage (CIE) Delta E metric can be used to define the concept of contrasting colors.
  • A discussion of the CIE Delta E metric can be found in Color Science by G. Wyszecki and W. Stiles, p. 165, John Wiley, 2nd Edition.
  • the CIE Delta E metric defines differences between two colors. If the CIE Delta E metric between two colors is sufficiently large, those two colors are considered to be contrasting.
  • two colors are considered to be contrasting when the CIE Delta E metric between them is sufficiently large, while the CIE Delta E metric is about 1 for two colors that are just noticeably different.
  • a color that yields a large CIE Delta E with respect to the original color is chosen, for example randomly.
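The random selection of a contrasting color can be sketched as follows, using the CIE76 form of Delta E (Euclidean distance in L*a*b*) and a plain sRGB-to-Lab conversion under the D65 white point. The trial count and seed are arbitrary assumptions; the patent itself does not prescribe them.

```python
import math
import random

def _srgb_to_lab(rgb):
    """sRGB (0..255) -> linear RGB -> XYZ (D65) -> CIE L*a*b*."""
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(c) for c in rgb)
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) / 0.95047
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) / 1.08883
    def f(t):
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = f(x), f(y), f(z)
    return (116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz))

def delta_e(c1, c2):
    """CIE76 Delta E: Euclidean distance between two colors in Lab."""
    return math.dist(_srgb_to_lab(c1), _srgb_to_lab(c2))

def pick_contrasting(background, trials=200, seed=0):
    """Sample random candidate colors and keep the one with the
    largest Delta E from the background color, mirroring the
    random selection suggested in the text."""
    rng = random.Random(seed)
    best, best_de = None, -1.0
    for _ in range(trials):
        cand = tuple(rng.randrange(256) for _ in range(3))
        de = delta_e(background, cand)
        if de > best_de:
            best, best_de = cand, de
    return best
```

Black against white gives a Delta E of 100 under this conversion, while two colors about 1 apart are just noticeably different, consistent with the interpretation above.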
  • the identified portion is highlighted in the displayed image on the display platforms by drawing or displaying the identified portion on the display platforms using the contrasting color 66 .
  • the identified portion can also be displayed on the monitoring screen using the contrasting color.
  • the appearance of the highlighted identified portion against the displayed image is optimized, making it easier for the audience to follow the presentation and to see the highlight.
  • the contrasting color can be varied throughout the identified portion to maintain an optimum appearance. Highlighting the identified portion using the contrasting color can be undertaken on a single display platform or on a plurality of display platforms.
  • the selected portion of the identified portion is a line or non-rectilinear line
  • the entire line or non-rectilinear line is displayed in the contrasting color.
  • the identified portion is a perimeter of a two-dimensional space
  • the perimeter, the entire two-dimensional area or both the perimeter and the entire two-dimensional area are displayed using the contrasting color.
  • the average color in this identified portion is determined.
  • a square area, for example of a size of about 96×96 pixels, is used to calculate the average color.
  • the nominal display size in one embodiment is about 800×600 pixels.
  • Other square areas can be used, for example about 32×32 pixels.
  • the identified portion of the displayed image is a two-dimensional space or area, and the area is defined as the interior of the perimeter of the two-dimensional shape drawn by the user.
  • the perimeter drawn by the system on the displayed image is a smooth curve that is fitted to the points indicated by the user on the presentation monitor using the pointing mechanism.
  • the area within the perimeter is filled with the contrasting average color of the initial pixels contained within the smooth two-dimensional curve perimeter.
  • the pixels within the perimeter used to calculate an average color are augmented by selecting additional pixels that fall within a predetermined distance outside the perimeter.
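The curve fitting described above can be approximated in many ways; the sketch below uses a simple moving-average smoother over the hand-drawn (x, y) points, standing in for whatever spline or least-squares fit an implementation might actually use:

```python
def smooth_path(points, window=5):
    """Smooth a hand-drawn sequence of (x, y) points with a moving
    average to remove jitter. Each output point averages up to
    `window` neighboring input points; the window narrows at the
    ends of the path so the endpoints stay close to where the user
    started and finished drawing."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out
```

For a closed perimeter, the same smoother could be applied with wraparound indexing so the start and end of the curve join cleanly.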
  • display of the contrasting color is achieved through a linear combination of the selected contrasting color with the pre-existing pixel color at a given location.
  • the new color is fC+(1−f)P.
  • f is a number between about 0 and about 1.
  • the value of f is less than about 0.5 so that the original pixel value is visible.
  • the new color that is rendered and displayed is a weighted average of the original pixel color and the contrasting color.
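The linear combination fC + (1−f)P can be applied per color channel; a minimal sketch, with f = 0.4 as an illustrative value below 0.5 so the original pixel remains visible:

```python
def blend(contrast, original, f=0.4):
    """Render fC + (1 - f)P channel by channel, where `contrast` is
    the chosen contrasting color C and `original` is the pre-existing
    pixel color P, both as (r, g, b) tuples in 0-255."""
    return tuple(round(f * c + (1 - f) * p)
                 for c, p in zip(contrast, original))
```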
  • the complementary color of a given color is used to enhance the visibility of a highlight.
  • a complementary color is a color that is the most opposite, i.e. 180 degrees opposite, of a given color.
  • there is only one complementary color as there is only one point that is 180 degrees opposite the given color.
  • An algorithm to calculate a complementary color is described in “Computer Graphics, Principles and Practice”, by J. D. Foley, A. Van Dam, S. K. Feiner and J. F. Hughes, Second Edition, 1997, pg. 590, Addison Wesley Publishing Company.
  • the identified color within the displayed image is represented in an HSV color space, and a complementary color is identified that is 180 degrees opposite the given color in the HSV hexcone.
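A sketch of the HSV complement computation described above, using Python's standard colorsys module; rotating the hue by 180 degrees corresponds to adding 0.5 in colorsys's 0-to-1 hue scale:

```python
import colorsys

def complement(r, g, b):
    """Return the complementary color of an RGB triple (0-255) by
    rotating its hue 180 degrees in HSV space while keeping the
    saturation and value unchanged."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    r2, g2, b2 = colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)
    return (round(r2 * 255), round(g2 * 255), round(b2 * 255))
```

For example, the complement of pure red computed this way is cyan.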
  • the visibility of the selected area and the contents of that area are enhanced by toggling the colors of the pixels within the area in a cyclical manner.
  • C′ is the complement of the contrasting color, C.
  • P′ is the complement of the original pixel color, P.
  • the pixel at the specified location, L, is rendered with a color that is a linear combination of the contrasting color, C, and the given original pixel color, P, and is computed by the equation given above, i.e. fC+(1−f)P.
  • the pixel at the specified location, L, is rendered with a linear combination of the complement of the contrasting color, C′, and the complement of the original pixel color, P′, given by the equation fC′+(1−f)P′.
  • This toggling of colors makes the contents of the selected area more visible.
  • the period of the cycle is about 2 seconds.
  • the constant f is not a set value but is represented by a quantity that changes or decays over time, causing the color modifications and toggling effect to disappear over time.
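The toggling-and-decay behavior described above might be sketched as follows, assuming a 2-second cycle, an exponential decay of f over time, and precomputed complements C′ and P′; the initial f and the decay constant are illustrative values:

```python
import math

def toggled_color(t, C, P, C_comp, P_comp, f0=0.4, decay=0.2, period=2.0):
    """At elapsed time t (seconds), render fC + (1-f)P during the
    first half of each cycle and fC' + (1-f)P' during the second
    half. f decays exponentially from f0 so the color modifications
    and the toggling effect fade away over time."""
    f = f0 * math.exp(-decay * t)
    use_complement = (t % period) >= period / 2
    c, p = (C_comp, P_comp) if use_complement else (C, P)
    return tuple(round(f * a + (1 - f) * b) for a, b in zip(c, p))
```

At t = 0 the full highlight strength f0 applies; as t grows, the rendered color converges back to the underlying pixel color (or its complement, during the second half of each cycle).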
  • an alternative exemplary embodiment of a method for correcting the undesirable variations in the manually indicated highlight portion of the displayed image 68 in accordance with the present invention is illustrated.
  • an image is displayed on at least one and possibly a plurality of display platforms 70 .
  • a portion of the displayed image is identified 72 , for example by manually drawing a line across the displayed image. Suitable systems and methods for identifying the desired portion of the displayed image are as defined above.
  • the identified portion includes straight lines, non-rectilinear or curved lines and perimeters of two-dimensional areas or objects on the displayed image. Undesirable variations or imperfections in the identified area are corrected 74 , yielding a corrected identified portion.
  • the corrected identified portion is drawn on the displayed image 76 .
  • A first example of an exemplary embodiment of a method for highlighting a visual presentation in real time in accordance with the present invention is illustrated in FIG. 9 .
  • a presentation image 22 is displayed simultaneously on a display platform 34 and a monitoring display 14 .
  • the monitoring display 14 is the screen of a laptop computer, which is serving as the monitoring mechanism 12 .
  • the user input device 16 is also in communication with the laptop computer and is in the form of a computer mouse.
  • the speaker identifies a portion 20 of the displayed image using the input device, and this identified portion is displayed on the monitoring screen 14 just as it is entered by the speaker, including any waviness or imperfections that result from manual entry using the input device.
  • the laptop computer also serves as the control system that controls the presentation of images for the presentation and that executes methods in accordance with the present invention.
  • A second example of an exemplary embodiment of a method in accordance with the present invention is illustrated in FIG. 10 .
  • the monitoring mechanism 12 is a laptop computer that also serves as the control system for the presentation.
  • the user input device 16 is a touch pad integrated into the laptop computer.
  • the desired portion 20 of the displayed image is identified by the speaker using the touch pad and displayed on the monitoring screen 14 .
  • the identified line is corrected; however, the corrected line 21 is not drawn over the displayed image on the display platform 34 using computer-generated graphics.
  • a motorized and controllable laser light pointer 36 in communication with the interaction mechanism is used to trace the corrected identified portion 21 across the displayed image 22 .
  • a controllable, motorized laser pointer alleviates the need for creating an adaptively varying color for the cursor.
  • since the laser pointer is mounted on a motorized stage, the presenter does not have to face the projection screen at any time and can always face the audience directly.
  • the motion of the motorized stage is calibrated such that the laser pointer points to the corners of the projection screen whenever the corners of the monitor used to preview the presentation are touched. The precise coordinates can be obtained through simple bicubic interpolation.
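The corner calibration described above can be sketched as follows. With only the four corner correspondences, bilinear interpolation is the natural fit (the text mentions bicubic interpolation, which would require additional calibration points); the coordinate conventions below are assumptions:

```python
def monitor_to_stage(x, y, mon_w, mon_h, corners):
    """Map a touch point (x, y) on the preview monitor to laser-stage
    coordinates by bilinear interpolation between the four calibrated
    corner positions. `corners` holds the stage coordinates recorded
    when the top-left, top-right, bottom-left and bottom-right monitor
    corners were touched, in that order."""
    u, v = x / mon_w, y / mon_h  # normalize to [0, 1]
    (tlx, tly), (trx, trya), (blx, bly), (brx, bry) = corners
    sx = (1-u)*(1-v)*tlx + u*(1-v)*trx + (1-u)*v*blx + u*v*brx
    sy = (1-u)*(1-v)*tly + u*(1-v)*trya + (1-u)*v*bly + u*v*bry
    return sx, sy
```

Touching the center of an 800×600 monitor with corners calibrated to a 10×10 stage range maps to the center of that range.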
  • Methods and systems in accordance with the present invention enable the speaker giving a presentation in front of a live audience to face the audience at all times while speaking.
  • the speaker merely contacts a touch-screen monitor that previews the presentation, and this interaction is transformed into a color-adaptive highlight or cursor that is displayed on the final projection platform.
  • the displayed highlight is suitably smoothed so that jittery hand motion is eliminated. This allows the presenter easier access to both his presentation materials and the audience, enabling him to deliver a more effective presentation.
  • Methods and systems in accordance with exemplary embodiments of the present invention can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements.
  • the invention is implemented in software, which includes but is not limited to firmware, resident software and microcode.
  • exemplary methods and systems can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer, logical processing unit or any instruction execution system.
  • a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Suitable computer-usable or computer readable mediums include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems (or apparatuses or devices) or propagation mediums.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
  • Suitable data processing systems for storing and/or executing program code include, but are not limited to, at least one processor coupled directly or indirectly to memory elements through a system bus.
  • the memory elements include local memory employed during actual execution of the program code, bulk storage, and cache memories, which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
  • I/O devices including but not limited to keyboards, displays and pointing devices, can be coupled to the system either directly or through intervening I/O controllers.
  • Exemplary embodiments of the methods and systems in accordance with the present invention also include network adapters coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Suitable currently available types of network adapters include, but are not limited to, modems, cable modems, DSL modems, Ethernet cards and combinations thereof.
  • the present invention is directed to a machine-readable or computer-readable medium containing a machine-executable or computer-executable code that when read by a machine or computer causes the machine or computer to perform a method for highlighting a portion of a displayed image in accordance with exemplary embodiments of the present invention and to the computer-executable code itself.
  • the machine-readable or computer-readable code can be any type of code or language capable of being read and executed by the machine or computer and can be expressed in any suitable language or syntax known and available in the art including machine languages, assembler languages, higher level languages, object oriented languages and scripting languages.
  • the computer-executable code can be stored on any suitable storage medium or database, including databases disposed within, in communication with and accessible by computer networks utilized by systems in accordance with the present invention and can be executed on any suitable hardware platform as are known and available in the art including the control systems used to control the presentations of the present invention.

Abstract

A system and method are provided that allow a speaker to provide real-time annotations to one or more displayed images during a presentation. The speaker inputs the annotations by manually identifying a portion of the displayed image using an input mechanism, such as a touch screen, that is in communication with a presentation monitor and a control system that is controlling the presentation. The identified portions are annotated onto the displayed images using an adaptive color that is selected to provide optimum contrast with the existing colors in the displayed image. In addition to providing adaptive, contrasting colors for the annotations, imperfections such as waviness are removed from the identified portions to create corrected portions that are then annotated onto the presentation image. Annotation is accomplished through computer-generated graphics or through the use of a motorized, controllable laser pointer.

Description

    FIELD OF THE INVENTION
  • The present invention is directed to the field of interface devices for computer users, and more particularly to the use of pointing devices during presentations.
  • BACKGROUND OF THE INVENTION
  • In conference and meeting presentations, images are displayed on a monitor or projected on a screen for audience viewing. A speaker calls attention to specific portions of the displayed images using pointing devices such as illuminated arrows and laser pointers. Although laser pointers are generally preferred pointing devices, there are various limitations associated with the use of laser pointing devices. For example, the size of the projected image is typically large, especially in auditorium settings, and the speaker is positioned relatively close to the projected image. At this relatively close proximity to the projected image, the speaker cannot easily view the entire image, and there is significant perspective distortion when viewing projected images up close. Therefore, the speaker cannot easily view the presentation and must use a small computer screen on which the images are simultaneously displayed, alternating attention between the computer screen and the projected image.
  • Since laser pointers are typically held in the hand of the speaker and manually moved to the desired locations on the projected images, laser pointers are susceptible to motion jitter, causing inaccurate pointing and viewing problems for the audience. Moreover, the speaker has to face the projected image to ensure that the correct areas of the screen are selected using the laser pointer, preventing eye contact between the speaker and the audience. In addition, the use of these physical highlighting systems is only applicable to presentations given at a single location and not to presentations made simultaneously to multiple distributed locations, for example across a computer network or through a video conference.
  • Systems have been developed in an attempt to overcome the limitations of using physical and optical pointers in group presentations. For example, U.S. Pat. No. 6,747,636 is directed to a display system for overlaying graphics applied to a display surface onto the graphics output of an application running in the foreground, and to allow user input to the application program by means of a touch sensitive screen. In this way, a completely interactive display system is provided in which the user can highlight and edit information generated by the computer program by simply drawing on the touch sensitive screen. In operation, when a user selectively applies pressure in the form of a point, line or script to the surface of a touch screen, information is conveyed by the touch screen to a computer, which in response updates the image projected by the computer screen with the user script and reproduces such information at a sufficient rate that user drawings on the board become interactive. Alternatively, the computer can interpret the user's touch input, and in response emulate operation of the mouse, a light pen or a digital command, for conveying information to the application program being executed. User script is overlaid onto the image, causing the image to be updated to include the user drawing, for example, points, lines, or script. The system however, does not attempt to interpret the drawings made by the user or to correct errors or waviness in the points, lines or script entered by the user on the touch screen. In addition, the system does not provide for enhancing contrast between the drawings entered by the user and the information displayed by the computer on the LCD screen in order to enhance the ability of the viewing audience to identify the highlighted areas of a projected image.
  • U.S. Pat. No. 6,538,643 is directed to a method and system for presenting slides in which touch pad portions of a touch pad are mapped to pre-selected parts of a presented slide such that a pre-selected part of the presented slide is highlighted in response to the corresponding touchpad portion being touched. Instead of highlighting portions of the slide, color change, outlining or other ways of distinguishing the selected portions of the slide can be used. An operator generates slides to be displayed on a display screen and associates highlighting techniques, such as color change, blinking or brightening, animated builds, check marks or underlining with a pre-selected part, point, portion or attribute of the slides for drawing audience attention to the pre-selected parts of the slides. The operator may then touch a touch pad portion of the touch pad to execute the highlighting of a corresponding pre-selected part of the slide. This system, however, requires pre-selection of areas of a presentation slide, association of a pre-determined highlighting technique with each pre-selected area and mapping each pre-selected area to a specific portion of the touchpad device. The system does not provide for random, real-time user input of highlighting during a presentation or for the selection of random areas of the presentation slide outside of the pre-selected areas. In addition, the system does not maximize the highlighting contrast between the pre-selected areas of the presentation slide and the non-selected areas of the presentation slide.
  • U.S. Pat. No. 5,428,417 is directed to a visual lecture aid that provides a lecturer with a remote-controlled touch panel assembly, which makes available any one of a group of stored graphic icon pointers and highlighters that can be seen by the lecturer on a display panel. These icon pointers and highlighters include highlighting overstrikes and highlighting squares. The lecturer utilizes the remote-controlled touch panel to select each of the pointing and highlighting icons by touch selection from a menu of displayed icons on the display panel and sequentially positions each icon and overstrike highlighting line by a simple finger movement over the display panel and subsequently freezes the final position of each pointing icon by touching a displayed freeze button. In addition to the graphics, both projected color selection and brightness control are available to the lecturer from a remote controlled display panel. This system, however, only provides for the selection of specific predefined graphics and does not allow for entry or recognition of user-defined graphics or shapes on the touch screen. In addition, the color selections and brightness are predefined within the system, and the user must select from one of these predefined colors. The colors, however, are not created or selected to maximize the contrast of the display screen over which the selected graphics are displayed.
  • Therefore, a need exists for a pointing system that enables a speaker to maintain eye-contact with the audience, and to point at objects on the displayed screen without introducing motion jitter. Suitable systems will provide for random and real-time user selection of highlighted regions of a selected image during the course of the presentation while permitting the speaker to generally maintain focus and eye-contact with the audience. Color contrast of the highlighted area will be selected to maximize contrast with the portions of image to be highlighted. In addition, the system can work with computer generated highlights and with conventional laser pointers.
  • SUMMARY OF THE INVENTION
  • Exemplary embodiments in accordance with the present invention are directed to systems and methods that allow a speaker or presenter, while facing the audience and during a presentation, to select areas of a projected image for highlighting and to highlight the selected images on the projected image. Highlighting is accomplished by using lines, areas or figures that are superimposed onto the projected image or by controlling the movement of a pointing device such as a laser pointer. In addition to facilitating an indication by the speaker or presenter of an area or areas to be highlighted in the projected image, the exemplary systems and methods of the present invention reduce or eliminate random jitter or jagged lines in the displayed highlight or controlled pointing device using, for example, motion smoothing applied to indications made by the presenter. For example when the projected highlight is a traced path, the trajectory of the path is rendered with a smooth curve before projection onto the displayed image.
  • In one exemplary embodiment, a system in accordance with the present invention includes a touch-sensitive display monitor, i.e. touch screen, that displays the projected image to the speaker and that accepts entry of user-selected indication of the areas of the projected image to be highlighted. For example, the speaker can contact regions of the touch screen corresponding to the areas of the projected image that are to be highlighted, can draw boxes, circles or other geometric shapes around the area of the image to be highlighted or can trace paths between any two objects within the projected image. The regions of the touch screen monitor that are contacted by the speaker are formulated as lines, areas or cursors that are projected on the displayed image. The color, brightness and opacity of the projected lines, areas or cursors are adapted to contrast with the existing color of the displayed image in the region of the line, area or cursor while still providing for an unobstructed view of the information contained in the projected image. In one embodiment, the area of the displayed image that is touched by the speaker is rendered through a color that is contrasting to the average background color around the area of the image that is touched or selected by the speaker.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic representation of an embodiment of a presentation system for use with methods for smooth pointing of objects during a presentation in accordance with the present invention;
  • FIG. 2 is a schematic representation of an embodiment of a method for smooth pointing of objects during a presentation in accordance with the present invention;
  • FIG. 3 is a schematic representation of another embodiment of a method for smooth pointing of objects during a presentation in accordance with the present invention;
  • FIG. 4 is a representation of a display screen in accordance with an embodiment of the present invention;
  • FIG. 5 is an embodiment of a non-rectilinear line entered in accordance with the present invention;
  • FIG. 6 is an embodiment of a corrected non-rectilinear line corresponding to the non-rectilinear line of FIG. 5;
  • FIG. 7 is an embodiment of a perimeter of a two-dimensional space entered in accordance with the present invention;
  • FIG. 8 is an embodiment of a corrected perimeter corresponding to the perimeter of FIG. 7;
  • FIG. 9 is an example of one embodiment of the method in accordance with the present invention;
  • FIG. 10 is an example of another embodiment of the method in accordance with the present invention.
  • DETAILED DESCRIPTION
  • Referring initially to FIG. 1, an exemplary embodiment of a presentation system 10 for use with exemplary methods in accordance with the present invention is illustrated. Systems and methods in accordance with the present invention can be used in any arrangement where one or more speakers are presenting information or data to multiple recipients located in one or more physical locations. Suitable arrangements include, but are not limited to, video conferences, lectures, distance learning programs, Internet based programs, conference lectures and classroom lectures. As illustrated, the presentation system 10 includes at least one control system 26 for controlling or conducting the presentation. The control system 26 is capable of generating and storing data or information to be displayed during the course of the presentation. Typically, these data or information are in the form of graphs, charts, images or slides that are stored in electronic, machine-readable format and are often computer-generated using one or more software programs known and available in the art. Suitable control systems include, but are not limited to, computer systems such as laptop computers, desktop computers and servers, programmable logic controllers, EEPROM's, single function control systems specifically created to conduct presentations and combinations thereof.
  • The system 10 also includes a plurality of presentation display platforms 30 in communication with the control system 26. The control system 26 controls the display of the presentation data on these display platforms. The display platforms 30 are in direct communication with the control system 26 and controlled thereby without any intermediate controllers or processors or are in contact with the control system across one or more networks 28 or through additional nodes (not shown). Suitable networks include any type of local area network or wide area network known and available in the art. Suitable display platforms are capable of displaying images 22 stored in electronic, machine-readable format and communicated to the display platform 30 by the presentation control system 26. Suitable display platforms include, but are not limited to, computer monitors 34, including cathode ray tubes (CRT's), plasma displays and liquid crystal displays (LCD's), and projection type displays that include a projection screen 36 and a projection mechanism 24, for example projectors or LCD's used in combination with overhead projectors. A suitable projection type display is the Epson EMP-732, commercially available from the Seiko Epson Corporation of 3-3-5 Owa, Suwa, Nagano, Japan.
  • To facilitate real-time interaction between the speaker giving the presentation and the displayed images, the system 10 includes an interaction mechanism 12 that allows the speaker to control the presentation, for example to control the selection of images to be displayed on the display platforms. In general, the interaction mechanism facilitates user-defined input into the displayed images during the presentation, for example highlighting selected portions of the displayed images in real-time during the presentation. These selected highlights are shown or projected on the displayed images for viewing by the audience. The presentation interaction mechanism 12 is in communication with the presentation control system 26 and includes a presentation monitor 14. In one embodiment, the presentation interaction mechanism 12 is included in the presentation control system 26. In one embodiment, the presentation monitor 14 is independent of and separate from the display platforms, although in one embodiment the presentation monitor and a display platform can be combined. The presentation monitor is capable of displaying the images 22 provided by the control system. Any monitor capable of displaying electronic or computer-generated images can be used. Suitable presentation monitors are known and available in the art and include the same types of devices that can be used as display platforms. In one embodiment, the presentation monitor 14 is positioned so that it is viewable by the presenter or speaker, and in particular is viewable by the speaker such that the speaker maintains eye contact with the audience during the presentation. In one embodiment, the presentation control system simultaneously displays the images on one or more presentation platforms and the presentation monitor.
  • The presentation interaction mechanism 12 also includes at least one pointing device or input mechanism 16 that allows the speaker or user to manually annotate a displayed image in real time by indicating or drawing generally straight lines, non-rectilinear lines, i.e. curves, two-dimensional objects, for example the perimeter of a geographic shape, alpha-numeric annotations and combinations thereof on the image displayed on the presentation monitor and the presentation platforms. A plurality of input mechanisms 16 can also be provided, for example where each input mechanism is arranged to facilitate a specific type of user-defined input. The input mechanism 16 is in communication with the control system 26. In one embodiment, the input mechanism 16 is in communication with but separate from the presentation monitor 14. In another embodiment, the input mechanism 16 and presentation monitor 14 are integrated into a single device, e.g. a touch sensitive display screen. In one embodiment, this single device is the control system. Suitable user interaction mechanisms include, but are not limited to, any mechanism known and available in the art that permits manual entry of user input into a computer-generated display field, for example a point-and-click device such as a computer mouse or trackball, a pressure pad, a touch screen and a pressure tablet used in combination with a stylus 18. In one embodiment, the input mechanism 16 is a touch screen, for example that is part of a general purpose computer system. Suitable touch screens are commercially available as add-on touch-screens, called touchscreens.com, from Mass Multimedia Inc. of Colorado Springs, Colo.
  • The input mechanism 16 facilitates manual selection of a user-identified portion 20 of the displayed image. This user-identified portion 20 is then displayed as a highlighted portion 21 of the displayed image on all the display platforms. In one embodiment, the highlighted portion 21 is displayed on the image by the control system using computer-generated graphics. In another embodiment, the control system is in communication with an optical pointer, e.g. a laser pointer 36, through a motorized or mechanical control mechanism 32. The control system 26 through the mechanical control mechanism 32 moves the laser pointer 36 to trace the user-defined highlighted area 21 on the displayed image 22. Suitable mechanical control mechanisms, laser pointers and control software are known and available in the art.
  • Referring now to FIG. 2, an exemplary embodiment of a method for highlighting a portion of the displayed image 50 is illustrated. An image or slide from a presentation containing at least one, and preferably a plurality of images is displayed 52 on at least one display platform. In one embodiment, the image is displayed simultaneously on a plurality of display platforms. These display platforms can be located in a single physical or geographic location, for example an auditorium, conference room or lecture hall, or at a plurality of distributed geographic locations, for example a plurality of offices located at various locations across a country or throughout the world. Suitable methods for displaying the image include any method for displaying an electronically generated or stored image to be viewed by either multiple persons in one location or by multiple people located at multiple locations. In one embodiment, the images are displayed by a control system, e.g. a computer, on a CRT, LCD, plasma or projection display in communication with the control system and of sufficient size to be viewed by the presentation audience.
  • In another embodiment, the image is displayed simultaneously on a monitoring screen 54 that is separate from and independent of the display platform. The monitoring screen is in communication with the control system and is capable of receiving electronically generated images from the control system for display. Suitable monitoring screens include, but are not limited to, any type of computer monitor known and available in the art including desktop, laptop and handheld monitors. In order to provide for user input in real-time during the presentation while maintaining eye contact between the speaker and the audience, the monitoring screen is positioned to face the speaker such that the speaker can maintain eye contact with the audience during the presentation. In one embodiment, the monitoring screen is suitably sized for viewing by the speaker. In one embodiment, the monitoring screen is also one of the display platforms. In one embodiment, the monitoring screen is a touch sensitive screen that provides for both speaker monitoring of the presentation and speaker input. Any touch sensitive screens known and available in the art can be used as the monitoring screen.
  • In one embodiment, the speaker identifies at least one portion of the displayed image to be highlighted 56. Alternatively, the speaker can identify a plurality of portions of the displayed image to be highlighted. By identifying portions of the displayed image to be highlighted on the display platforms during a presentation, the speaker draws attention to specific portions of each displayed image and highlights aspects of the presentation that correlate the speaker's comments with the information and data provided on the displayed image. Suitable identified portions include, but are not limited to single points, line 80 (FIG. 4), non-rectilinear lines (FIGS. 5 and 6) and two-dimensional shapes or areas (FIGS. 7 and 8). These two dimensional shapes include geometric shapes and objects, e.g. arrows. In general, the identified portion can be viewed as a line drawn on or across the displayed image, for example a straight line, a non-rectilinear line or a line defining a perimeter of a geometric or two-dimensional shape. As used herein, non-rectilinear line includes any curved line including simple curves, compound curves and curves that form objects imbedded in text such as brackets and parentheses.
  • In one exemplary embodiment as illustrated in FIG. 4, identifying the portion of the displayed image includes drawing a line 80, i.e. generally straight line, between a first object 78 on the displayed image 22 and one of a plurality of second objects 79. In another embodiment as illustrated in FIG. 5, identifying the portion of the displayed image includes drawing a non-rectilinear line 82 on the displayed image 22. In another embodiment as illustrated in FIG. 7, identifying the portion of the displayed image includes drawing a two-dimensional shape 86 on the displayed image 22. For example, the two-dimensional shape can be a geometric shape and the step of drawing the two-dimensional shape includes manually drawing at least a portion of the perimeter of that shape. Preferably, the perimeter is indicated substantially around the two-dimensional shape on the displayed image so that the desired two-dimensional shape is adequately indicated.
  • Any method for providing user-defined input into a computer-based system can be used by the speaker to identify the desired portion of the displayed image. In one embodiment, the speaker uses a pointing or input mechanism in communication with both the monitoring screen and the control system to identify the desired portion of the displayed image. Suitable pointing mechanisms include, but are not limited to, point and click mechanisms such as a computer mouse, trackball or a touchpad. These point and click devices can be wired or wireless devices. Other suitable pointing mechanisms include touch sensitive screens, wherein a touch sensitive plate is placed on a display screen such that points on the touch sensitive plate correspond to locations on the display screen and therefore to images displayed on the display screen. These points can be touched directly by using a finger or by using another device such as a stylus. The pointing mechanism allows the speaker to manually identify the desired portion of the displayed image in real-time during a presentation.
  • Once identified, the user-defined portion is displayed on the monitoring screen as entered by the speaker. In one embodiment, where the speaker interaction mechanism 12 includes a touch sensitive screen acting as both the monitoring screen and the pointing mechanism, the speaker touches the touch sensitive screen at one or more locations corresponding to the desired portion of the displayed image. For example, the speaker touches the touch screen at a location corresponding to a particular object within the displayed image or draws a line, non-rectilinear line, or shape on the touch screen in a location that corresponds to the desired location on the displayed image.
  • Since the identified portion of the displayed image is manually entered by hand or by a hand-held device, the line, curve or shape entered will often contain imperfections or variations, for example waviness in the lines. However, these variations in the initial trajectory of the identified portion are undesirable. Waviness in the entered line results in blurriness when that line is shown on one of the display platforms. In addition, undesirable variations reduce the clarity of the identified portion. Therefore, in one embodiment the identified portion is analyzed to determine if undesirable variations in that identified portion need to be corrected 58. In one embodiment, identification of whether or not undesirable variations need to be corrected is conducted by the control system. The undesirable variations can be predefined, for example by the speaker. For example, a sudden jump in the coordinates of a line that is drawn can be considered to be an undesirable variation. If undesirable variations exist in the identified portion or, alternatively, variations exist in the identified portion that exceed a predefined limit, then the variations are corrected in the identified portion 60, resulting in a modified portion. This modified portion is used to highlight the displayed image.
  • In one embodiment, the identified portion is a substantially straight line that is manually drawn by the speaker across the displayed image using the interaction mechanisms as defined herein. The undesirable variations, i.e. waviness, motion jitter and unintended curves, are corrected in the manually drawn line, creating a modified line. Referring to FIGS. 5 and 6, the identified portion is a non-rectilinear line 82, and in particular a compound curve. Correction of undesirable variations involves applying a curve smoothing algorithm to the as-drawn non-rectilinear line 82, resulting in a modified non-rectilinear line 84 (FIG. 6). In another embodiment where the identified portion of the displayed image is a two-dimensional shape, the shape is entered by manually indicating at least a portion of the perimeter of a two-dimensional shape on the displayed image. Undesirable variations are corrected by correcting the perimeter around the two-dimensional shape. In one embodiment, the two-dimensional shape is a geometric shape, for example a circle, ellipse, square, rectangle or triangle. Referring to FIGS. 7 and 8, the perimeter 86 (FIG. 7) is entered manually by the speaker, resulting in variations including a wavy line and an incomplete perimeter, i.e. the circle is not completely closed. These variations are removed from the entered perimeter to create a modified perimeter 88 (FIG. 8). As illustrated, the entered perimeter is substantially the perimeter of a regular geometric shape, e.g. a circle. In one embodiment, correction of undesirable variations in the identified portion includes selecting a geometric shape from a predetermined list of geometric shapes that matches the shape entered by the speaker. In one embodiment, this is performed by using template matching methods, which return the closest match to the drawn object.
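The shape-matching step above can be sketched as follows. This is a minimal illustration rather than the template matching method of the embodiment: it assumes a hypothetical two-entry shape list (circle, rectangle) and classifies a hand-drawn perimeter by the spread of its point radii about the centroid, which is near zero for a circle.

```python
import math

def classify_shape(points):
    """Crude shape match: decide whether a hand-drawn perimeter is
    closer to a circle or a rectangle (hypothetical two-entry list)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    # Distance of every drawn point from the centroid.
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    # Relative spread of the radii: near zero for a circle, large for
    # a rectangle whose corners lie farther out than its edge midpoints.
    spread = max(radii) / min(radii) - 1.0
    return "circle" if spread < 0.2 else "rectangle"
```

A production implementation would compare the drawn curve against each template in the predetermined list and return the closest match, but the radial-spread test captures the idea in a few lines.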
  • A suitable curve smoothing algorithm employs piecewise parametric cubics and is described in M. Plass and M. Stone, “Curve-Fitting with Piecewise Parametric Cubics”, SIGGRAPH 1983: Proceedings of the 10th Annual Conference on Computer Graphics and Interactive Techniques, pages 229-239, ACM Press. The curve-fitting or curve-smoothing algorithm can be applied after the presenter specifies a trajectory on the touch screen, using all the points that recorded the presenter's touch. Alternatively, the curve-fitting algorithm is applied at fixed time intervals, e.g. every second, to the points that record the presenter's touch during each interval.
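As a lighter-weight stand-in for the piecewise parametric cubic fit cited above, a simple moving-average filter over the recorded touch points illustrates the idea of trajectory smoothing; the window size is an illustrative tuning parameter, not a value from the text.

```python
def smooth_trajectory(points, window=5):
    """Moving-average smoothing of recorded (x, y) touch points.
    A simple substitute for the cited piecewise-cubic curve fit."""
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        # Average over a window centered on point i, clipped at the ends.
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

As the text notes, the same filter can be applied either once the full trajectory is entered or repeatedly to the points gathered in each fixed time interval.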
  • Returning to FIG. 2, after any undesirable variations are corrected, or alternatively if the system and method do not check for or correct undesirable variations, at least one color in the identified portion of the displayed image is determined 62. In one embodiment, the determined color is the color of the displayed image in an area covered or occupied by the identified portion. When the identified portion is a line, the determined color is any color of the displayed image in the region or portion covered by that line. In another embodiment, the determined color is the color of the displayed image in a region immediately adjacent to the identified portion, e.g. on either side of the line. This determination of a color in the displayed image can be used for any type of identified portion including lines, non-rectilinear lines and the perimeters of two-dimensional shapes. When the identified portion is a two-dimensional shape, the determined color is selected from any color of the displayed image located within the area of the two-dimensional shape, within the portion or region of the displayed image located under the perimeter, within a portion or region of the displayed image adjacent to the perimeter and combinations thereof.
  • In one embodiment, two or more colors are identified in the relevant portions of the displayed image. Preferably, the average color of the displayed image in portions that are covered by or adjacent to the identified portions is identified. Suitable methods for identifying the average color of the displayed image are known and available in the literature and are made, for example, on a pixel-by-pixel basis. In one embodiment, the hue saturation value (HSV) is determined for the displayed image in the relevant portions or regions of the displayed image, yielding an average HSV or a predominant HSV in the desired regions of the displayed image indicated by the identified portion. This determination of color is made in real-time and is based on the image currently displayed at the time that the identified portion is selected by the speaker.
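A pixel-by-pixel HSV average as described above might be computed along these lines. This is a sketch: hue is circular, so it is averaged as a unit vector rather than arithmetically, a detail the text leaves open.

```python
import colorsys
import math

def average_hsv(pixels):
    """Average hue-saturation-value over a region of RGB pixels
    (components in 0..1). Hue is averaged as a 2-D unit vector to
    handle its wrap-around at 0/1."""
    hx = hy = s_sum = v_sum = 0.0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        hx += math.cos(2 * math.pi * h)
        hy += math.sin(2 * math.pi * h)
        s_sum += s
        v_sum += v
    n = len(pixels)
    h_avg = (math.atan2(hy, hx) / (2 * math.pi)) % 1.0
    return h_avg, s_sum / n, v_sum / n
```

In use, `pixels` would be the displayed-image pixels covered by or adjacent to the identified portion at the moment the speaker makes the selection.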
  • Having identified and determined a color in the relevant area of the displayed image covered by the identified portion, a contrasting color to the determined color is identified 64. The Commission Internationale de l'Eclairage (CIE) Delta E metric can be used to define the concept of contrasting colors. A definition of the CIE Delta E metric can be found in Color Science, by G. Wyszecki and W. Stiles, pg. 165, John Wiley, 2nd Edition. The CIE Delta E metric defines differences between two colors. If the CIE Delta E metric between two colors is sufficiently large, those two colors are considered to be contrasting. For example, if the CIE Delta E metric is greater than about 15, the colors are considered to be contrasting, while the CIE Delta E metric is about 1 for two colors that are just noticeably different. In one embodiment for selecting a contrasting color, a color that yields a large CIE Delta E with respect to the original color is chosen, for example randomly.
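In CIELAB coordinates, the 1976 Delta E metric is simply Euclidean distance, so the contrast test and random selection described above might be sketched as below. The candidate list and the use of the 1976 formula (rather than a later CIE revision) are illustrative assumptions.

```python
import math
import random

def delta_e(lab1, lab2):
    """CIE Delta E (1976): Euclidean distance between two (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

def pick_contrasting(lab, candidates, threshold=15.0, rng=random):
    """Randomly choose a candidate whose Delta E from `lab` exceeds the
    threshold (~15 per the text; ~1 is just noticeably different).
    Returns None if no candidate contrasts sufficiently."""
    ok = [c for c in candidates if delta_e(lab, c) > threshold]
    return rng.choice(ok) if ok else None
```

Converting the displayed image's RGB pixels into L*a*b* values is assumed to happen upstream of this step.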
  • Having identified the contrasting color, whether the contrasting color of a single color or the contrasting average color of the identified average color, the identified portion is highlighted in the displayed image on the display platforms by drawing or displaying the identified portion on the display platforms using the contrasting color 66. The identified portion can also be displayed on the monitoring screen using the contrasting color. By displaying the identified portion in that contrasting color, the appearance of the highlighted identified portion against the displayed image is optimized, making it easier for the audience to follow the presentation and to see the highlight. In areas of the displayed image where the colors vary significantly, the contrasting color can be varied throughout the identified portion to maintain an optimum appearance. Highlighting the identified portion using the contrasting color can be undertaken on a single display platform or on a plurality of display platforms. In one embodiment where the identified portion is a line or non-rectilinear line, the entire line or non-rectilinear line is displayed in the contrasting color. In an embodiment where the identified portion is a perimeter of a two-dimensional shape, the perimeter, the entire two-dimensional area or both the perimeter and the entire two-dimensional area are displayed using the contrasting color. Thus, the color of the identified portion that is rendered on the displayed image will be distinct from its background, effectively improving its visibility and avoiding the problem that a fixed cursor color suffers from, i.e. if the background color is close to the cursor color, the cursor becomes difficult to see.
  • In one embodiment, where the displayed image color is not constant in the identified portion, the average color in this identified portion is determined. In one embodiment, a square area, for example of a size of about 96×96 pixels, is used to calculate the average color. The nominal display size in one embodiment is about 800×600 pixels. Other square areas can be used, for example about 32×32 pixels.
  • In one embodiment, the identified portion of the displayed image is a two-dimensional space or area, and the area is defined as the interior of the perimeter of the two-dimensional shape drawn by the user. The perimeter drawn by the system on the displayed image is a smooth curve that is fitted to the points indicated by the user on the presentation monitor using the pointing mechanism. The area within the perimeter is filled with the contrasting average color of the initial pixels contained within the smooth two-dimensional curve perimeter. Alternatively, the pixels within the perimeter used to calculate an average color are augmented by selecting additional pixels that fall within a predetermined distance outside the perimeter. In one embodiment, display of the contrasting color is achieved through a linear combination of the selected contrasting color with the pre-existing pixel color at a given location. For example, for a given selected contrasting color, C, and an original pixel color, P, the new color is fC+(1−f)P. In one embodiment, f is a number between about 0 and about 1. Preferably, the value of f is less than about 0.5 so that the original pixel value is visible. In other words, the new color that is rendered and displayed is a weighted average of the original pixel color and the contrasting color.
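The weighted blend fC + (1−f)P above is a one-liner per channel; RGB components are assumed normalized to 0..1, and the default f of 0.4 is an illustrative value below the 0.5 ceiling suggested in the text.

```python
def blend(contrast, original, f=0.4):
    """New pixel color fC + (1-f)P, applied per channel.
    Keeping f below ~0.5 leaves the original content visible."""
    return tuple(f * c + (1 - f) * p for c, p in zip(contrast, original))
```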
  • In another embodiment, the complementary color of a given color is used to enhance the visibility of a highlight. As used herein, a complementary color is a color that is the most opposite, i.e. 180 degrees opposite, of a given color. For a given color, there is only one complementary color, as there is only one point that is 180 degrees opposite the given color. However, there are many contrasting colors for a given color. An algorithm to calculate a complementary color is described in “Computer Graphics, Principles and Practice”, by J. D. Foley, A. Van Dam, S. K. Feiner and J. F. Hughes, Second Edition, 1997, pg. 590, Addison Wesley Publishing Company. As described, the identified color within the displayed image is represented in an HSV color space, and a complementary color is identified that is 180 degrees opposite the given color in the HSV hexcone.
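The HSV-hexcone complement described above amounts to a 180-degree hue rotation with saturation and value unchanged. A minimal sketch using the standard library's colorsys module (RGB components in 0..1):

```python
import colorsys

def complementary(r, g, b):
    """Complement via the HSV hexcone: rotate hue by 180 degrees
    (0.5 in colorsys's 0..1 hue scale), keep saturation and value."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)
```

For example, pure red maps to cyan, its unique complement, whereas many colors would satisfy the looser Delta E contrast criterion.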
  • In one embodiment, the visibility of the selected area and the contents of that area are enhanced by toggling the colors of the pixels within the area in a cyclical manner. For a given contrasting color, C, the complement of that contrasting color is identified, C′. For a given original color, P, of a pixel at a specified location, L, the complement of that original pixel color is identified, P′. During the first half of each color cycle, the pixel at the specified location, L, is rendered with a color that is a linear combination of the contrasting color, C, and the given original pixel color, P, and is computed by the equation given above, i.e. fC+(1−f)P. During the second half of each cycle, the pixel at the specified location, L, is rendered by the complement to the contrasting color, C′, and the complement to the original pixel color, P′, and is given by the equation fC′+(1−f)P′. This toggling of colors makes the contents of the selected area more visible. In one embodiment, the period of the cycle is about 2 seconds. In another embodiment, the constant f is not a set value but is represented by a quantity that changes or decays over time, causing the color modifications and toggling effect to disappear over time.
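The toggling scheme above can be sketched as a function of elapsed time t. For brevity this sketch takes the complement as the simple RGB inverse rather than the HSV-hexcone complement described earlier, and the 2-second period and f value are the illustrative figures from the text.

```python
def toggled_color(contrast, original, t, period=2.0, f=0.4):
    """Cyclic highlight color for one pixel at time t (seconds).
    First half-period: blend contrast C with original P, fC + (1-f)P.
    Second half-period: blend their complements, fC' + (1-f)P'.
    Complement taken as the RGB inverse for illustration only."""
    phase = (t % period) / period
    if phase < 0.5:
        c, p = contrast, original
    else:
        c = tuple(1 - x for x in contrast)
        p = tuple(1 - x for x in original)
    return tuple(f * ci + (1 - f) * pi for ci, pi in zip(c, p))
```

Letting f decay toward zero over successive calls would reproduce the fading variant mentioned at the end of the paragraph.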
  • Referring now to FIG. 3, an alternative exemplary embodiment of a method for correcting the undesirable variations in the manually indicated highlight portion of the displayed image 68 in accordance with the present invention is illustrated. Initially, an image is displayed on at least one and possibly a plurality of display platforms 70. A portion of the displayed image is identified 72, for example by manually drawing a line across the displayed image. Suitable systems and methods for identifying the desired portion of the displayed image are as defined above. The identified portion includes straight lines, non-rectilinear or curved lines and perimeters of two-dimensional areas or objects on the displayed image. Undesirable variations or imperfections in the identified portion are corrected 74, yielding a corrected identified portion. The corrected identified portion is then drawn on the displayed image 76.
  • A first example of an exemplary embodiment of a method for highlighting a visual presentation in real time in accordance with the present invention is illustrated in FIG. 9. A presentation image 22 is displayed simultaneously on a display platform 34 and a monitoring display 14. As illustrated, the monitoring display 14 is the screen of a laptop computer, which is serving as the monitoring mechanism 12. The user input device 16 is also in communication with the laptop computer and is in the form of a computer mouse. The speaker identifies a portion 20 of the displayed image using the input device, and this identified portion is displayed on the monitoring screen 14 just as it is entered by the speaker, including any waviness or imperfections that result from manual entry using the input device. Undesirable variations are recognized and corrected to obtain a corrected identified portion 21 that is drawn over the displayed image on the display platform 34 using computer generated graphics. As illustrated, the laptop computer also serves as the control system that controls the presentation of images for the presentation and that executes methods in accordance with the present invention.
  • A second example of an exemplary embodiment of a method in accordance with the present invention is illustrated in FIG. 10. As with the first example, the monitoring mechanism 12 is a laptop computer that also serves as the control system for the presentation. The user input device 16 is a touch pad integrated into the laptop computer. The desired portion 20 of the displayed image is identified by the speaker using the touch pad and displayed on the monitoring screen 14. The identified line is corrected; however, the corrected line 21 is not drawn over the displayed image on the display platform 34 using computer-generated graphics. Instead, a motorized and controllable laser light pointer 36 in communication with the interaction mechanism is used to trace the corrected identified portion 21 across the displayed image 22.
  • The use of a controllable, motorized laser pointer alleviates the need for creating an adaptively varying color for the cursor. In addition, since the laser pointer is mounted on a motorized stage, the presenter does not have to face the projection screen at any time and can always face the audience directly. The motion of the motorized stage is calibrated such that the laser pointer points to the corners of the projection screen whenever the corners of the monitor used to preview the presentation are touched. The precise coordinates can be obtained through simple bicubic interpolation.
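The corner calibration described above maps a touch point on the preview monitor to laser-stage coordinates. A minimal sketch interpolating between the four calibrated corners follows; the text mentions bicubic interpolation, but bilinear interpolation is shown here for brevity, and the corner layout is an assumed convention.

```python
def monitor_to_stage(u, v, corners):
    """Map a normalized monitor touch point (u, v in 0..1) to laser-stage
    coordinates by bilinear interpolation of the four calibrated corners.
    `corners` = ((top_left, top_right), (bottom_left, bottom_right)),
    each an (x, y) stage coordinate recorded during calibration."""
    (tl, tr), (bl, br) = corners
    # Interpolate along the top and bottom edges, then between them.
    top = tuple(a + u * (b - a) for a, b in zip(tl, tr))
    bot = tuple(a + u * (b - a) for a, b in zip(bl, br))
    return tuple(a + v * (b - a) for a, b in zip(top, bot))
```

Touching a corner of the monitor then reproduces the corresponding calibrated corner of the projection screen exactly, with interior points interpolated between them.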
  • Methods and systems in accordance with the present invention enable the speaker giving a presentation in front of a live audience to face the audience at all times while speaking. In one embodiment, the speaker merely contacts a touch-screen monitor that previews the presentation, and this interaction is transformed into a color-adaptive highlight or cursor that is displayed on the final projection platform. In addition, the displayed highlight is suitably smoothed so that jittery hand motion is eliminated. This allows the presenter easier access to both his presentation materials and the audience, enabling him to deliver a more effective presentation.
  • Methods and systems in accordance with exemplary embodiments of the present invention can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements. In a preferred embodiment, the invention is implemented in software, which includes but is not limited to firmware, resident software and microcode. In addition, exemplary methods and systems can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer, logical processing unit or any instruction execution system. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. Suitable computer-usable or computer readable mediums include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems (or apparatuses or devices) or propagation mediums. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
  • Suitable data processing systems for storing and/or executing program code include, but are not limited to, at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements include local memory employed during actual execution of the program code, bulk storage, and cache memories, which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. Input/output or I/O devices, including but not limited to keyboards, displays and pointing devices, can be coupled to the system either directly or through intervening I/O controllers. Exemplary embodiments of the methods and systems in accordance with the present invention also include network adapters coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Suitable currently available types of network adapters include, but are not limited to, modems, cable modems, DSL modems, Ethernet cards and combinations thereof.
  • In one embodiment, the present invention is directed to a machine-readable or computer-readable medium containing a machine-executable or computer-executable code that when read by a machine or computer causes the machine or computer to perform a method for highlighting a portion of a displayed image in accordance with exemplary embodiments of the present invention and to the computer-executable code itself. The machine-readable or computer-readable code can be any type of code or language capable of being read and executed by the machine or computer and can be expressed in any suitable language or syntax known and available in the art including machine languages, assembler languages, higher level languages, object oriented languages and scripting languages. The computer-executable code can be stored on any suitable storage medium or database, including databases disposed within, in communication with and accessible by computer networks utilized by systems in accordance with the present invention and can be executed on any suitable hardware platform as are known and available in the art including the control systems used to control the presentations of the present invention.
  • While it is apparent that the illustrative embodiments of the invention disclosed herein fulfill the objectives of the present invention, it is appreciated that numerous modifications and other embodiments may be devised by those skilled in the art. Additionally, feature(s) and/or element(s) from any embodiment may be used singly or in combination with other embodiment(s) and steps or elements from methods in accordance with the present invention can be executed or performed in any suitable order. Therefore, it will be understood that the appended claims are intended to cover all such modifications and embodiments, which would come within the spirit and scope of the present invention.

Claims (20)

1. A method for highlighting a portion of a displayed image, the method comprising:
displaying an image on at least one display platform;
identifying a portion of the displayed image to be highlighted;
determining at least one color in the identified portion of the displayed image;
identifying a contrasting color to the determined color of the identified portion; and
highlighting the identified portion of the displayed image on the display screen using the contrasting color.
2. The method of claim 1, further comprising:
identifying a complementary color to the contrasting color; and
identifying a complementary color to the determined color of the identified portion;
wherein the step of highlighting the identified portion further comprises using the contrasting color by toggling the identified portion between a linear combination of the contrasting color and the determined color and a linear combination of the complementary color to the contrasting color and the complementary color to the determined color.
3. The method of claim 1, further comprising displaying the image on a monitoring screen separate from the display platform, wherein the step of identifying a portion of the displayed image to be highlighted comprises using a pointing mechanism in communication with the monitoring screen to identify the portion of the displayed image.
4. The method of claim 3, wherein the monitoring screen comprises a touch sensitive screen and the step of using the pointing mechanism to identify the portion of the displayed image further comprises contacting the touch screen at one or more locations corresponding to the identified portion of the displayed image.
5. The method of claim 1, wherein:
the identified portion of the displayed image comprises a line;
the step of identifying the portion of the displayed image to be highlighted comprises:
manually drawing the line across the displayed image; and
correcting undesirable variations in the manually drawn line to create a modified line; and
the step of highlighting the identified portion further comprises displaying the modified line on the displayed image using the contrasting color.
6. The method of claim 5, wherein the line comprises a non-rectilinear line and the step of correcting undesirable variations comprises applying a curve smoothing algorithm to the non-rectilinear line.
7. The method of claim 1, wherein:
the identified portion of the displayed image comprises a two-dimensional shape;
the step of identifying the portion of the displayed image to be highlighted comprises:
manually indicating a perimeter substantially around the two-dimensional shape on the displayed image; and
correcting undesirable variations in the manually indicated perimeter to create the two-dimensional shape; and
the step of highlighting the identified portion further comprises displaying the contrasting color on the displayed image to fill an area covered by the two-dimensional shape.
8. The method of claim 7, wherein the step of correcting undesirable variations further comprises selecting a matching geometric shape from a pre-determined list of geometric shapes as the two-dimensional shape based on the manually indicated perimeter of the two-dimensional shape.
9. The method of claim 1, wherein the identified portion of the displayed image comprises a line and the step of identifying a contrasting color to the color of the identified portion comprises identifying at least one color of the displayed image along the line, in an area of the displayed image adjacent to the line or combinations thereof.
10. The method of claim 1, wherein the identified portion of the displayed image comprises a two-dimensional shape and the step of determining at least one color comprises identifying at least one color in an area of the displayed image covered by the two-dimensional shape.
11. The method of claim 1, wherein:
the step of determining at least one color in the identified portion of the displayed image further comprises identifying an average color of the displayed image in an area covered by the identified portion;
the step of identifying a contrasting color comprises identifying a contrasting average color to the average color of the identified portion; and
the step of highlighting the identified portion of the image comprises highlighting the identified portion of the image on the display screen using the contrasting average color.
12. A method for highlighting selected portions of a displayed image, the method comprising:
displaying an image on at least one display platform;
identifying a portion of the displayed image by manually drawing a line across the displayed image;
correcting undesirable variations in the manually drawn line to create a modified line; and
highlighting the identified portion on the display platform by displaying the modified line on the displayed image.
13. The method of claim 12, further comprising:
determining an average color in the displayed image in at least one of the area of the line or an adjacent area to the line; and
identifying a contrasting average color to the average color;
wherein the step of highlighting the identified portion further comprises displaying the modified line on the displayed image using the contrasting average color.
14. The method of claim 12, wherein the step of highlighting the identified portion by displaying the modified line on the displayed image further comprises using a mechanically controlled laser pointer to trace the modified line across the displayed image on the display platform.
15. The method of claim 12, wherein:
the step of identifying the line further comprises manually indicating a perimeter substantially around a two-dimensional shape on the displayed image;
the step of correcting undesirable variations comprises correcting undesirable variations in the manually indicated perimeter to create a modified perimeter defining the two-dimensional shape; and
the step of highlighting the line further comprises indicating the two-dimensional image on the displayed image.
16. The method of claim 15, wherein the step of indicating the two-dimensional image on the displayed image further comprises tracing the modified perimeter using a mechanically controlled laser pointer to trace the modified line across the displayed image on the display platform.
17. A computer-readable medium containing a computer-readable code that when read by a computer causes the computer to perform a method for highlighting a portion of a displayed image, the method comprising:
displaying an image on at least one display platform;
identifying a portion of the displayed image to be highlighted;
determining at least one color in the identified portion of the displayed image;
identifying a contrasting color to the determined color of the identified portion; and
highlighting the identified portion of the displayed image on the display platform using the contrasting color.
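The color-selection steps of claim 17 — determining a color in the identified portion, then identifying a contrasting color — can be sketched in a few lines. This is an illustrative example only, not the implementation disclosed in the patent: the mean-RGB measurement, the RGB-complement rule, and all function names here are assumptions chosen for clarity.

```python
def average_color(pixels):
    """Mean RGB over a list of (r, g, b) tuples sampled from the
    identified portion of the displayed image."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def contrasting_color(color):
    """One simple contrasting choice: the 8-bit RGB complement of
    the measured color."""
    return tuple(255 - int(round(c)) for c in color)

# A mostly light-gray region yields a dark highlight color.
region = [(200, 200, 200), (220, 220, 220), (240, 240, 240)]
avg = average_color(region)          # (220.0, 220.0, 220.0)
highlight = contrasting_color(avg)   # (35, 35, 35)
```

In practice a perceptual contrast measure (e.g., relative-luminance contrast) would be preferable to a raw complement, since complements of mid-gray tones contrast poorly.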
18. The computer-readable medium of claim 17, wherein:
the method further comprises displaying the image on a touch sensitive monitoring screen separate from the display platform; and
the step of identifying a portion of the displayed image to be highlighted comprises contacting the touch screen at one or more locations corresponding to the identified portion of the displayed image.
19. The computer-readable medium of claim 17, wherein:
the identified portion of the displayed image comprises a non-rectilinear line;
the step of identifying the portion of the displayed image to be highlighted comprises:
manually drawing the non-rectilinear line in the displayed image; and
applying a curve smoothing algorithm to the non-rectilinear line to create a modified non-rectilinear line; and
the step of highlighting the identified portion further comprises displaying the modified non-rectilinear line on the displayed image using the contrasting color.
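One simple instance of the "curve smoothing algorithm" named in claim 19 is a moving-average filter over the sampled stroke points. The sketch below is hypothetical — the patent does not mandate any particular smoothing method — and the function name and window size are assumptions:

```python
def smooth_polyline(points, window=3):
    """Moving-average smoothing of a hand-drawn stroke: each point is
    replaced by the mean of the points in a small window around it,
    damping hand jitter while preserving the stroke's overall path."""
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo = max(0, i - half)
        hi = min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

# A jittery stroke whose y-values wobble around 10.
stroke = [(0, 10), (1, 12), (2, 8), (3, 11), (4, 9)]
print(smooth_polyline(stroke))
```

With `window=3`, each interior point becomes the centroid of itself and its two neighbors, so the wobble in y is pulled toward 10; heavier smoothing (spline fitting, Savitzky–Golay filtering) trades more jitter rejection for more shape distortion.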
20. The computer-readable medium of claim 17, wherein:
the identified portion of the displayed image comprises a two-dimensional shape;
the step of identifying the portion of the displayed image to be highlighted comprises:
manually indicating a perimeter substantially around the two-dimensional shape on the displayed image; and
correcting undesirable variations in the manually indicated perimeter to create the two-dimensional shape; and
the step of highlighting the identified portion further comprises displaying the contrasting color on the displayed image to fill an area covered by the two-dimensional shape.
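Filling the area enclosed by the corrected perimeter, as in claim 20, amounts to testing each pixel against the closed polygon. A standard way to do this is even-odd ray casting; the sketch below is illustrative only and assumes the modified perimeter has already been reduced to a list of vertices:

```python
def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test: cast a horizontal ray from (x, y)
    and count edge crossings; an odd count means the point lies
    inside the closed perimeter and should receive the fill color."""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True
print(point_in_polygon(5, 2, square))  # False
```

Iterating this test over the bounding box of the shape produces the fill mask; a production renderer would instead use scanline filling or a graphics library's polygon-fill primitive.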
US11/367,518 2006-03-03 2006-03-03 System and method for smooth pointing of objects during a presentation Abandoned US20070206024A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/367,518 US20070206024A1 (en) 2006-03-03 2006-03-03 System and method for smooth pointing of objects during a presentation
US12/172,222 US8159501B2 (en) 2006-03-03 2008-07-12 System and method for smooth pointing of objects during a presentation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/367,518 US20070206024A1 (en) 2006-03-03 2006-03-03 System and method for smooth pointing of objects during a presentation

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/172,222 Continuation US8159501B2 (en) 2006-03-03 2008-07-12 System and method for smooth pointing of objects during a presentation

Publications (1)

Publication Number Publication Date
US20070206024A1 true US20070206024A1 (en) 2007-09-06

Family

ID=38471067

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/367,518 Abandoned US20070206024A1 (en) 2006-03-03 2006-03-03 System and method for smooth pointing of objects during a presentation
US12/172,222 Expired - Fee Related US8159501B2 (en) 2006-03-03 2008-07-12 System and method for smooth pointing of objects during a presentation

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/172,222 Expired - Fee Related US8159501B2 (en) 2006-03-03 2008-07-12 System and method for smooth pointing of objects during a presentation

Country Status (1)

Country Link
US (2) US20070206024A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009009509A (en) * 2007-06-29 2009-01-15 Brother Ind Ltd Printer driver
US8490026B2 (en) * 2008-10-27 2013-07-16 Microsoft Corporation Painting user controls
US8896899B1 (en) 2013-05-15 2014-11-25 Zhen Tang Laser marker
US9524440B2 (en) 2014-04-04 2016-12-20 Myscript System and method for superimposed handwriting recognition technology
US9384403B2 (en) 2014-04-04 2016-07-05 Myscript System and method for superimposed handwriting recognition technology
US9489572B2 (en) 2014-12-02 2016-11-08 Myscript System and method for recognizing geometric shapes
US9507157B2 (en) 2014-12-17 2016-11-29 Zhen Tang Size-adjustable elliptical laser marker
US9658704B2 (en) 2015-06-10 2017-05-23 Apple Inc. Devices and methods for manipulating user interfaces with a stylus
JP7007214B2 (en) * 2018-02-27 2022-01-24 日本電産サンキョー株式会社 Laser pointer with runout correction mechanism and its shake suppression control method

Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4687918A (en) * 1985-05-03 1987-08-18 Hughes Technology Pty Ltd Safe laser pointers with remote directional activation
US4846694A (en) * 1988-06-20 1989-07-11 Image Storage/Retrieval Systems, Inc. Computer controlled, overhead projector display
US5231697A (en) * 1989-04-20 1993-07-27 Kabushiki Kaisha Toshiba Method and system for determining connection states of straight short vectors representing figure in curve fitting
US5287417A (en) * 1992-09-10 1994-02-15 Microsoft Corporation Method and system for recognizing a graphic object's shape, line style, and fill pattern in a pen environment
US5428417A (en) * 1993-08-02 1995-06-27 Lichtenstein; Bernard Visual lecture aid
US5519818A (en) * 1994-09-19 1996-05-21 Taligent, Inc. Object-oriented graphic picking system
US5568279A (en) * 1993-02-11 1996-10-22 Polycom, Inc. Remote interactive projector
US5572651A (en) * 1993-10-15 1996-11-05 Xerox Corporation Table-based user interface for retrieving and manipulating indices between data structures
US5583946A (en) * 1993-09-30 1996-12-10 Apple Computer, Inc. Method and apparatus for recognizing gestures on a computer system
US5734761A (en) * 1994-06-30 1998-03-31 Xerox Corporation Editing scanned document images using simple interpretations
US5861886A (en) * 1996-06-26 1999-01-19 Xerox Corporation Method and apparatus for grouping graphic objects on a computer based system having a graphical user interface
US5880745A (en) * 1993-08-31 1999-03-09 Sharp Kabushiki Kaisha Smooth presentations via remote control of liquid crystal projection apparatus
US5940065A (en) * 1996-03-15 1999-08-17 Elo Touchsystems, Inc. Algorithmic compensation system and method therefor for a touch sensor panel
US6050690A (en) * 1998-01-08 2000-04-18 Siemens Information And Communication Networks, Inc. Apparatus and method for focusing a projected image
US6091408A (en) * 1997-08-13 2000-07-18 Z-Axis Corporation Method for presenting information units on multiple presentation units
US6108001A (en) * 1993-05-21 2000-08-22 International Business Machines Corporation Dynamic control of visual and/or audio presentation
US6229550B1 (en) * 1998-09-04 2001-05-08 Sportvision, Inc. Blending a graphic
US6253164B1 (en) * 1997-12-24 2001-06-26 Silicon Graphics, Inc. Curves and surfaces modeling based on a cloud of points
US6333753B1 (en) * 1998-09-14 2001-12-25 Microsoft Corporation Technique for implementing an on-demand display widget through controlled fading initiated by user contact with a touch sensitive input device
US6538643B2 (en) * 2001-04-25 2003-03-25 Interlink Electronics, Inc. Remote control having a touch pad operable in a pad-to-screen mapping mode for highlighting preselected parts of a slide displayed on a display screen
US20030071850A1 (en) * 2001-10-12 2003-04-17 Microsoft Corporation In-place adaptive handwriting input method and system
US20030142038A1 (en) * 2002-01-31 2003-07-31 General Instrument Corporation Remote markup of a display device using a wireless internet appliance as an electronic canvas
US6642918B2 (en) * 2001-04-23 2003-11-04 Canon Kabushiki Kaisha Control of digital projection system
US6658147B2 (en) * 2001-04-16 2003-12-02 Parascript Llc Reshaping freehand drawn lines and shapes in an electronic document
US6741266B1 (en) * 1999-09-13 2004-05-25 Fujitsu Limited Gui display, and recording medium including a computerized method stored therein for realizing the gui display
US6747636B2 (en) * 1991-10-21 2004-06-08 Smart Technologies, Inc. Projection display and system with pressure sensing at screen, and computer assisted alignment implemented by applying pressure at displayed calibration marks
US6761456B2 (en) * 2002-03-25 2004-07-13 Fuji Photo Optical Co., Ltd. Laser presentation system using a laser pointer
US6802041B1 (en) * 1999-01-20 2004-10-05 Perfectnotes Corporation Multimedia word processor
US20050024387A1 (en) * 2003-07-31 2005-02-03 Viresh Ratnakar LAPE: layered presentation system utilizing compressed-domain image processing
US6944584B1 (en) * 1999-04-16 2005-09-13 Brooks Automation, Inc. System and method for control and simulation
US7071950B2 (en) * 2000-09-01 2006-07-04 Ricoh Co., Ltd. Super imposed image display color selection system and method
US7337389B1 (en) * 1999-12-07 2008-02-26 Microsoft Corporation System and method for annotating an electronic document independently of its content

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4916007A (en) * 1985-10-18 1990-04-10 Tarkett Inc. Underprinted inlaid sheet materials having unique decorative design effects
GB2284734B (en) * 1992-07-17 1997-01-22 Komatsu Mfg Co Ltd Intelligent graphic operation panel
US6100904A (en) * 1997-06-25 2000-08-08 Adobe Systems Incorporated Curvature smoothing
EP0989738A1 (en) * 1998-09-22 2000-03-29 Hewlett-Packard Company Document analysis method to detect BW/color areas and corresponding scanning device
US20020005509A1 (en) * 1999-01-21 2002-01-17 Chia-Chi Teng Dye combinations for image enhancement filters for color video displays
US20040151218A1 (en) * 2002-12-23 2004-08-05 Vlad Branzoi Systems and methods for tremor cancellation in pointers
US7302106B2 (en) * 2003-05-19 2007-11-27 Microsoft Corp. System and method for ink or handwriting compression
US20060225037A1 (en) * 2005-03-30 2006-10-05 Microsoft Corporation Enabling UI template customization and reuse through parameterization
US20060224575A1 (en) * 2005-03-30 2006-10-05 Microsoft Corporation System and method for dynamic creation and management of lists on a distance user interface
US20060224962A1 (en) * 2005-03-30 2006-10-05 Microsoft Corporation Context menu navigational method for accessing contextual and product-wide choices via remote control
US7380722B2 (en) * 2005-07-28 2008-06-03 Avago Technologies Ecbu Ip Pte Ltd Stabilized laser pointer

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7920742B2 (en) * 2006-03-06 2011-04-05 Fuji Xerox Co., Ltd. Image processing apparatus, program and recording medium for document registration
US20070206883A1 (en) * 2006-03-06 2007-09-06 Fuji Xerox Co., Ltd. Image processing apparatus and recording medium recording image processing program
US20070274704A1 (en) * 2006-05-25 2007-11-29 Fujitsu Limited Information processing apparatus, information processing method and program
US7633512B2 (en) * 2006-05-25 2009-12-15 Fujitsu Limited Information processing apparatus, information processing method and program
US20090309954A1 (en) * 2006-06-30 2009-12-17 Kodak Graphic Communications Canada Company Methods and apparatus for selecting and applying non-contiguous features in a pattern
US20110057879A1 (en) * 2009-09-06 2011-03-10 Yang Pan Image Projection System with Adjustable Cursor Brightness
US8292439B2 (en) 2009-09-06 2012-10-23 Yang Pan Image projection system with adjustable cursor brightness
US20110138275A1 (en) * 2009-12-09 2011-06-09 Jo Hai Yu Method for selecting functional icons on touch screen
US9762717B2 (en) 2009-12-22 2017-09-12 At&T Intellectual Property I, L.P. Simplified control input to a mobile device
US20110151929A1 (en) * 2009-12-22 2011-06-23 At&T Intellectual Property I, L.P. Simplified control input to a mobile device
US9992320B2 (en) 2009-12-22 2018-06-05 At&T Intellectual Property I, L.P. Simplified control input to a mobile device
US9047052B2 (en) * 2009-12-22 2015-06-02 At&T Intellectual Property I, L.P. Simplified control input to a mobile device
US9866922B2 (en) * 2010-03-31 2018-01-09 Thomson Licensing Trick playback of video data
US20130004145A1 (en) * 2010-03-31 2013-01-03 Jun Li Trick playback of video data
US11418853B2 (en) 2010-03-31 2022-08-16 Interdigital Madison Patent Holdings, Sas Trick playback of video data
US9438876B2 (en) 2010-09-17 2016-09-06 Thomson Licensing Method for semantics based trick mode play in video system
US20140049493A1 (en) * 2012-08-17 2014-02-20 Konica Minolta, Inc. Information device, and computer-readable storage medium for computer program
EP2927876A1 (en) * 2014-03-31 2015-10-07 Samsung Display Co., Ltd. Generation of display overlay parameters utilizing touch inputs
US10127700B2 (en) 2014-03-31 2018-11-13 Samsung Display Co., Ltd. Generation of display overlay parameters utilizing touch inputs
US20180276898A1 (en) * 2017-03-22 2018-09-27 Seiko Epson Corporation Transmissive display device, display control method, and computer program
US10657722B2 (en) * 2017-03-22 2020-05-19 Seiko Epson Corporation Transmissive display device, display control method, and computer program
US20190132398A1 (en) * 2017-11-02 2019-05-02 Microsoft Technology Licensing, Llc Networked User Interface Back Channel Discovery Via Wired Video Connection
US11113793B2 (en) * 2019-11-20 2021-09-07 Pacific future technology (Shenzhen) Co., Ltd Method and apparatus for smoothing a motion trajectory in a video

Also Published As

Publication number Publication date
US20080259090A1 (en) 2008-10-23
US8159501B2 (en) 2012-04-17

Similar Documents

Publication Publication Date Title
US8159501B2 (en) System and method for smooth pointing of objects during a presentation
US20130055143A1 (en) Method for manipulating a graphical user interface and interactive input system employing the same
CN105493023B (en) Manipulation to the content on surface
US9195345B2 (en) Position aware gestures with visual feedback as input method
EP2498485B1 (en) Automated selection and switching of displayed information
US10839572B2 (en) Contextual virtual reality interaction
AU2011276970B2 (en) Digital whiteboard system
US20090187817A1 (en) Efficient Image Annotation Display and Transmission
KR20100108417A (en) Method of managing applications in a multi-monitor computer system and multi-monitor computer system employing the method
US20090184943A1 (en) Displaying Information Interactively
US20060214911A1 (en) Pointing device for large field of view displays
CN111064999B (en) Method and system for processing virtual reality input
US8872813B2 (en) Parallax image authoring and viewing in digital media
Waldner et al. Importance-driven compositing window management
US11694371B2 (en) Controlling interactivity of digital content overlaid onto displayed data via graphics processing circuitry using a frame buffer
JP2009116727A (en) Image input display
US11137903B2 (en) Gesture-based transitions between modes for mixed mode digital boards
Malik An exploration of multi-finger interaction on multi-touch surfaces
JP6834197B2 (en) Information processing equipment, display system, program
US9927892B2 (en) Multiple touch selection control
JP6699406B2 (en) Information processing device, program, position information creation method, information processing system
US20140365955A1 (en) Window reshaping by selective edge revisions
JP6945345B2 (en) Display device, display method and program
Reibert et al. Multitouch Interaction with Parallel Coordinates on Large Vertical Displays
JP2010205105A (en) Electronic board system, mouse-event pseudo-generating method, and input coordinate correction method

Legal Events

Date Code Title Description
AS Assignment

Owner name: IBM CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RAO, RAVISHANKAR;REEL/FRAME:017802/0336

Effective date: 20060508

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION