US20120218203A1 - Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus - Google Patents


Info

Publication number
US20120218203A1
US20120218203A1 (application No. US 13/370,049)
Authority
US
United States
Prior art keywords
touch
image
page
unit
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/370,049
Inventor
Noriyoshi KANKI
Current Assignee
Sharp Corp
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from JP2011027236A (JP5536690B2)
Priority claimed from JP2011027237A (JP5537458B2)
Application filed by Individual filed Critical Individual
Assigned to SHARP KABUSHIKI KAISHA (assignment of assignors interest; see document for details). Assignor: KANKI, NORIYOSHI
Publication of US20120218203A1
Priority claimed by US 14/823,681 (US10191648B2)
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to an image display apparatus that has a display device and an input device integrated together and allows touch-input. More specifically, the present invention relates to a touch drawing display apparatus that supports both drawing operations and other operations in response to touches, and to an operation method thereof, as well as to an image display apparatus that switches images page by page when the images are touched and slid with a plurality of fingers or the like, and to a controller for the display apparatus.
  • A so-called electronic blackboard, having a large-screen display device that allows drawing of an image or other processes upon detecting that the user has touched the display device, is known.
  • the electronic blackboard is useful when summarizing opinions of participants or finding a preferable solution to a problem, for example, at a meeting.
  • Electronic blackboards having various configurations have come to be practically used, and one configured as a computer system having a combination of a display device with a large display screen and an input device for detecting two-dimensional position coordinates such as a touch-panel has been used.
  • an electronic blackboard apparatus successively reads pieces of information related to position coordinates designated by a pen or a finger and pieces of information related to amount of movement, and displays a track of inputs on the display device based on the read pieces of information. Consequently, the apparatus can realize operations as an electronic blackboard such as handwriting input.
  • Japanese Patent Laying-Open No. 6-175776 discloses a projection type presentation device allowing drawing on a displayed screen image using touch operation. There is a problem that recognition of an original image becomes difficult when writing is repeated on the same screen image.
  • drawing is done while touch operation is continued in an image forming area, and when the touch operation in the image forming area once ends and then a touch operation in the image forming area is detected next, the former image is automatically deleted.
  • When a specific function other than drawing is allocated to a touch operation different from the simple touch operation for drawing in an electronic blackboard, it takes some time, while an image is being drawn in a drawing mode, until a touch operation is determined to be the one for executing the specific function.
  • Drawing is therefore done immediately upon touching, so as not to cause any time lag between the touch operation and drawing; any such lag is stressful for the user.
  • Such a configuration sometimes leads to an erroneous drawing not intended by the user.
  • an operation of flicking with multi-touch is allocated to an operation of scrolling screen images.
  • This problem is not limited to the electronic blackboard; it also occurs in display devices, such as tablet-type terminals, that display images drawn by touching. It cannot be solved by the technique disclosed in the '776 reference.
  • An advantage of an electronic blackboard is that images can be displayed or written (drawn) separately on a plurality of screen images.
  • Each unit of display of such images is referred to as a “page” as an analogy to a book.
  • Japanese Patent Laying-Open No. 11-102274 proposes a device in which the screen image is switched if the screen image is touched by a plurality of fingers and the like (a so-called multi-touch) and the plurality of touched positions are slid by more than a prescribed value in the same direction.
  • pages may be turned erroneously while normal input is being done.
  • a finger other than the finger used for input happens to touch the screen surface and is detected as the multi-touch slide input and, as a result, a page is turned unintentionally. Therefore, a mechanism that can prevent unintended turning of a page even when such an erroneous input is made has been desired. Further, a mechanism that allows the user to easily understand what manner of input is necessary to turn a page is also necessary.
  • the present invention provides a touch drawing display apparatus, including: a display unit that displays an image; a detecting unit that is arranged on the display unit and detects a touched position; a drawing unit that draws, on an image displayed on the display unit, a line corresponding to a track formed by movement of the detected position; and an erasing unit that erases, when an operation other than drawing is specified by the track, a line drawn from a start point to an end point of the track specifying the operation.
  • the track for specifying the operation other than drawing is a track formed by a multi-touch operation of simultaneously touching a plurality of positions of the detecting unit.
  • the present invention provides a method of operating a touch drawing display apparatus that includes a display unit that displays an image and a detecting unit that is arranged on the display unit and detects a touched position, including the steps of: drawing, on an image displayed on the display unit, a line corresponding to a track formed by movement of the detected position; determining whether or not an operation other than drawing is specified by the track while the drawing step is being executed; and erasing, if an operation other than drawing is specified by the track, a line drawn from a start point to an end point of the track specifying the operation.
  • the erroneous image drawn during the determination of the touch operation other than that for drawing is not retained, and the drawn image as intended by the user can be stored. Therefore, when the object image is displayed again, the erroneous image is not displayed and only the intended image is displayed.
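The draw-immediately-then-erase behavior described above can be sketched roughly as follows. This is a minimal Python illustration; the class and method names (DrawingSurface, on_gesture_recognized, and so on) are invented for the sketch and do not come from the patent:

```python
class DrawingSurface:
    """Sketch: draw with no lag on every touch sample, but discard the
    stroke retroactively if its track turns out to specify a
    non-drawing operation."""

    def __init__(self):
        self.committed = []       # strokes retained in the drawn image
        self.current_stroke = []  # points of the stroke in progress

    def on_touch_move(self, point):
        # Draw immediately: every sampled point extends the visible stroke.
        self.current_stroke.append(point)

    def on_gesture_recognized(self):
        # The track from its start point to its end point specified an
        # operation other than drawing, so erase everything drawn during it.
        self.current_stroke = []

    def on_touch_up(self):
        # Ordinary drawing: commit the stroke so it is stored with the image.
        if self.current_stroke:
            self.committed.append(self.current_stroke)
            self.current_stroke = []
```

With this arrangement, only strokes that were genuinely drawing operations survive to be stored, so redisplaying the image later shows no trace of the gesture.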
  • the present invention provides an image display apparatus allowing touch-input, including: a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on the display screen; a scroll unit that scrolls, when a plurality of touch inputs are detected by the touch detecting unit and their positions on the display screen move in one same direction, an image displayed on the display screen along with movement of positions of the plurality of touch inputs; and a first page switching unit that detects, after the plurality of touch inputs are detected by the touch detecting unit, decrease in the number of touch inputs from an output of the touch detecting unit and, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of returning the image on the display screen to a state before scrolling and a process of switching the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs.
  • When the plurality of touched positions move in one direction, the scroll unit scrolls the screen image in that direction. Therefore, the user can intuitively understand that the screen image can be scrolled by a multi-touch sliding operation.
  • If the touch inputs end at that point, the scroll made up to that time is returned to the original state before scrolling. Therefore, even when the user erroneously touches the display screen surface with a plurality of fingers and slides them, the original screen image can be restored if the user notices and moves his/her fingers away immediately.
  • the screen image is switched by one page in accordance with the direction of sliding.
  • the pages of screen image can be switched by the simple operation of multi-touch sliding.
  • an image display apparatus allowing touch-input that allows the user to intuitively understand the method of page-by-page switching of screen images using multi-touch and that can prevent any trouble of display when an erroneous operation is made on such an occasion can be provided.
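The release-time decision just described, restoring the pre-scroll view after a short slide and turning one page after a longer one, might be sketched as follows. The function signature, the "left"/"right" direction encoding, and the page-clamping convention are assumptions made for illustration:

```python
def on_release(move_distance, direction, threshold, current_page, num_pages):
    """Sketch of the first page-switching scheme: when the fingers lift
    after a multi-touch slide, a movement at or below the threshold
    restores the view before scrolling; a larger movement switches the
    displayed image by one page in the slide direction."""
    if move_distance <= threshold:
        return ("restore", current_page)  # undo the scroll, same page
    if direction == "left":
        # Slide left: feed the page and show the next one (clamped).
        return ("switch", min(current_page + 1, num_pages - 1))
    else:
        # Slide right: show the previous page (clamped).
        return ("switch", max(current_page - 1, 0))
```

The threshold is what distinguishes an accidental brush of several fingers from a deliberate page-turn gesture.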
  • the present invention provides a controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen and detects and outputs, to the outside, positions and the number of touch inputs that designate positions on the display screen.
  • the controller for the display apparatus includes: a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by the display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on the display screen along with the movement of positions of the plurality of touch positions, and applies the generated image to the display apparatus; and a page switching unit that detects, after the plurality of touch inputs are detected by the display apparatus, decrease in the number of touch inputs from an output of the display apparatus, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of generating page image data for returning the image on the display screen to a state before scrolling and a process of generating page image data for switching the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs, and transmits the generated page image data to the display apparatus.
  • In a further aspect, the present invention provides an image display apparatus allowing touch input, including: a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on the display screen; a scroll unit that scrolls, when a plurality of touch inputs are detected by the touch detecting unit and their positions on the display screen move in one same direction, an image displayed on the display screen along with movement of positions of the plurality of touch inputs; a page switching unit that switches, in response to an amount of movement of the plurality of touch inputs exceeding a threshold value during scrolling of the image by the scroll unit, the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs; and a returning unit that detects, from an output of the touch detecting unit, decrease in the number of the plurality of touch inputs before page switching by the page switching unit during scrolling of the image by the scroll unit, and returns scrolling of the image by the scroll unit to a state before scrolling.
  • the plurality of touched positions are detected by the touch detecting unit, and when these positions move in the same direction, the scroll unit moves the screen image along the direction of movement of the touched positions. Therefore, the user can intuitively understand that page switching is possible by the so-called multi-touch. Further, if the plurality of touched positions move in the same direction and the threshold value is exceeded, the display of screen image is switched in accordance with the direction of movement of touched positions in the course of scrolling. The screen image can be switched by the simple operation of so-called multi-touch. Further, for this purpose, what is necessary is simply to move the touched positions continuously without moving the fingers and the like away from the screen surface. Further, if the number of touch inputs decreases before page switching takes place, the screen image returns to the state before the scrolling and, therefore, the original state can easily be resumed even when an erroneous operation is made.
  • Thus, an image display apparatus allowing touch-input can be provided that lets the user easily understand that page-by-page switching of screen images is possible by multi-touch input, and that allows such switching to be done easily.
  • the present invention provides a controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen and detects and outputs, to the outside, positions and the number of touch inputs that designate a position on the display screen.
  • the controller includes: a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by the display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on the display screen along with the movement of positions of the plurality of touch positions, and applies the generated image to the display apparatus; a page switching unit that switches, in response to an amount of movement of the plurality of touch inputs exceeding a threshold value during scrolling of the image by the scroll unit, the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs; and a returning unit that detects, from an output of the display apparatus, decrease of the number of plurality of touch inputs before page switching by the page switching unit during scrolling of the image by the scroll unit, and returns scrolling of the image by the scroll unit to a state before scrolling.
  • Thus, the user can intuitively understand that page-by-page switching of screen images can be done using multi-touch, and any trouble in display can be prevented even when an erroneous operation is made on such an occasion. Further, by the simple operation of multi-touch sliding, the image can be switched by one page, without stopping the sliding operation.
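The second scheme, where the page turns mid-slide as soon as the threshold is exceeded and lifting a finger beforehand restores the view, could be sketched as a per-update state decision. The function and the string return values are illustrative assumptions:

```python
def handle_multitouch_update(num_touches, move_distance, threshold, page_switched):
    """Sketch of the second page-switching scheme: the page switches as
    soon as the slide distance exceeds the threshold, with the fingers
    still on the screen; if the touch count drops before that point,
    the view returns to its pre-scroll state."""
    if num_touches < 2 and not page_switched:
        return "restore"      # a finger lifted before the switch: undo the scroll
    if not page_switched and move_distance > threshold:
        return "switch_page"  # threshold crossed mid-slide: turn one page now
    return "scroll"           # otherwise keep scrolling with the touches
```

Note that once the page has switched, continued sliding simply keeps scrolling (as in the FIG. 14 scenario), so the user never has to interrupt the gesture.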
  • FIG. 1 is a block diagram showing a schematic configuration of an electronic blackboard in accordance with an embodiment of the present invention.
  • FIG. 2 shows an example of a method of detecting a touch-input.
  • FIG. 3 shows an example of a displayed screen image.
  • FIG. 4 is a flowchart representing a control structure of a program realizing erasure of an erroneous drawing in the electronic blackboard apparatus in accordance with a first embodiment of the present invention.
  • FIG. 5 shows coordinate data structure stored when a multi-touch operation is carried out in accordance with the first embodiment of the present invention.
  • FIG. 6 shows a scroll operation realized by the multi-touch operation in accordance with the first embodiment of the present invention.
  • FIG. 7 shows a scroll of a screen image in accordance with the first embodiment of the present invention.
  • FIG. 8 is a flowchart representing a control structure of a program realizing page switching by a multi-touch, in the electronic blackboard apparatus in accordance with a second embodiment of the present invention.
  • FIG. 9 shows a multi-touch sliding operation in accordance with the second embodiment of the present invention.
  • FIG. 10 shows an operation of sliding by a distance D1 with multi-touch and then moving the fingers away, in accordance with the second embodiment of the present invention.
  • FIG. 11 shows an operation of sliding by a distance D2, longer than D1, with multi-touch and then moving the fingers away, in accordance with the second embodiment of the present invention.
  • FIG. 12 shows a screen image after a page switch, displayed when an operation of sliding by a distance D2 and then moving the fingers away is done, in accordance with the second embodiment of the present invention.
  • FIG. 13 shows a screen image displayed when an operation of sliding by a distance D3 with multi-touch is done, in accordance with the second embodiment of the present invention.
  • FIG. 14 shows a screen image displayed when the sliding operation is continued even after the screen image is switched by one page from the screen image shown in FIG. 13.
  • Here, "touch" means that a position is made detectable by an input position detecting device.
  • "Touch" may include touching and pressing the detecting device, just touching without pressing it, and coming very close to but not actually touching it.
  • Either a contact type or a non-contact type device may be used as the input position detecting device.
  • For a non-contact device, "touch" means coming very close to the detecting device, that is, within a distance that allows detection of the input position.
  • an electronic blackboard apparatus 100 in accordance with the first embodiment of the present invention includes: a central processing unit (hereinafter denoted as CPU) 102 ; a read only memory (hereinafter denoted as ROM) 104 ; a random access memory (hereinafter denoted as RAM) 106 ; a storage unit 108 ; an interface unit (hereinafter denoted as IF unit) 110 ; a touch detecting unit 112 ; a display unit 114 ; a display control unit 116 ; a video memory (hereinafter denoted as VRAM) 118 ; and a bus 120 .
  • CPU 102 is for overall control of electronic blackboard apparatus 100 .
  • ROM 104 is a non-volatile storage storing programs and data necessary for controlling operations of electronic blackboard apparatus 100 .
  • RAM 106 is a volatile storage.
  • Storage unit 108 is a non-volatile storage that retains data even when power is shut off, implemented, for example, by a hard disk drive or a flash memory. Storage unit 108 may be configured as a detachable unit.
  • CPU 102 reads a program from ROM 104 to RAM 106 through bus 120 and executes the program using a part of RAM 106 as a work area.
  • CPU 102 controls various units and components of electronic blackboard apparatus 100 in accordance with a program or programs stored in ROM 104 .
  • To bus 120, CPU 102, ROM 104, RAM 106, storage unit 108, IF unit 110, touch detecting unit 112, display control unit 116 and VRAM 118 are connected. Data (including control information) is exchanged among these units through bus 120.
  • IF unit 110 is for establishing connection with the outside such as a network, and transmits/receives image data to and from a computer or the like connected to the network. Image data received from the outside through IF unit 110 is recorded in storage unit 108 .
  • Display unit 114 is a display panel (such as a liquid crystal panel) for displaying images.
  • Display control unit 116 is provided with a driving unit for driving display unit 114 .
  • Display control unit 116 reads image data stored in VRAM 118 at prescribed timing, generates signals for displaying as an image on display unit 114 , and outputs the generated signals to display unit 114 .
  • the image data to be displayed is read by CPU 102 from storage unit 108 and transmitted to VRAM 118 .
  • Touch detecting unit 112 is, for example, a touch-panel, detecting a touch operation by a user.
  • If there is a plurality of positions touched by the user, touch detecting unit 112 successively outputs the coordinates of each of the positions.
  • By monitoring the outputs of touch detecting unit 112, it is possible to successively know the number of touched positions and their coordinates. Detection of a touch operation when a touch-panel is used as touch detecting unit 112 will be described with reference to FIG. 2.
  • FIG. 2 shows an infrared scanning type touch-panel (touch detecting unit 112 ).
  • the touch-panel has arrays of light emitting diodes (hereinafter denoted as LED arrays) 200 and 202 arranged in a line on adjacent two sides of a rectangular writing surface, respectively, and two arrays of photodiodes (hereinafter referred to as PD arrays) 210 and 212 arranged in a line opposite to LED arrays 200 and 202 , respectively.
  • Infrared rays are emitted from each LED of LED arrays 200 and 202 , and the infrared rays are detected by each PD of opposite PD arrays 210 and 212 .
  • infrared rays output from LEDs of LED arrays 200 and 202 are represented by arrows.
  • the touch-panel includes, for example, a micro computer (a device including a CPU, a memory and an input/output circuit), and controls emission of each LED.
  • Each PD outputs a voltage corresponding to the intensity of received light.
  • the output voltage from the PD is amplified by an amplifier. Since signals are output simultaneously from the plurality of PDs of PD arrays 210 and 212 , the output signals are once saved in a buffer and then output as serial signals in accordance with the order of arrangement of PDs, and transmitted to the micro computer.
  • the order of serial signals output from PD array 210 represents the X coordinate.
  • the order of serial signals output from PD array 212 represents the Y coordinate.
  • the micro computer detects a portion where the signal levels of received two serial signals decreased, and thereby finds the position coordinates of the touched position.
  • the micro computer transmits the determined position coordinates to CPU 102 .
  • The process for detecting the touched position is repeated periodically at a prescribed detection interval; therefore, if one point is kept touched for a period longer than the detection interval, the same coordinate data is output repeatedly.
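The readout described above, locating a touch by the dip it causes in the received infrared intensity, can be sketched as follows. The assumption here is that each PD array is read as a list of voltage levels in arrangement order; the function name and threshold are illustrative:

```python
def locate_touch(x_levels, y_levels, threshold):
    """Sketch of the infrared-grid readout: a touching pen or finger
    blocks a beam, so the PD whose received level drops below the
    threshold gives the coordinate along that axis. Returns (x, y)
    indices, or None when no beam on either axis is blocked."""
    def dip(levels):
        # Index of the first PD whose signal level has decreased.
        for i, level in enumerate(levels):
            if level < threshold:
                return i
        return None

    x, y = dip(x_levels), dip(y_levels)
    return (x, y) if x is not None and y is not None else None
```

Running this once per detection interval reproduces the behavior noted above: a finger held in place yields the same coordinates on every scan.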
  • The touched position can be detected in a similar manner when the user touches touch detecting unit 112 with his/her finger, without using touch pen 220.
  • a touch panel other than the infrared scanning type panel (such as a capacitive type, surface acoustic wave type or resistive type touch panel) may be used as touch detecting unit 112 .
  • When a capacitive touch panel is used, a position can be detected even when a finger or the like is not actually touching (non-contact), provided it is close enough to the sensor.
  • On display unit 114, a screen image such as that shown in FIG. 3 is displayed.
  • The display screen image of display unit 114 is divided into a drawing area 230 and a function button area 240.
  • Drawing area 230 is for the user to draw an image by touch operations.
  • XY coordinates of touched position and track of its movement are transmitted from touch detecting unit 112 to CPU 102 as described above.
  • CPU 102 writes a prescribed value in a corresponding memory address on VRAM 118 .
  • Although pixel values of the image data on VRAM 118 may be changed directly, here it is assumed that VRAM 118 is provided with an area (hereinafter also referred to as an overlay area) for storing drawing data, separate from the area for storing image data.
  • Display control unit 116 superimposes and displays on display unit 114 the image data with the drawing data (the data in the overlay area). Specifically, on a point where the drawing data exists (the pixel having “1” recorded in the overlay area), the drawing data (preset color) is displayed, and on a point where the drawing data does not exist (the pixel having “0” recorded in the overlay area), the image data is displayed.
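The per-pixel superimposing rule stated above can be sketched for one row of pixels; the function name and the flat-list representation are assumptions for illustration:

```python
def composite(image_row, overlay_row, pen_color):
    """Sketch of the overlay rule: where the overlay area holds 1, the
    preset drawing color is displayed; where it holds 0, the underlying
    image data is displayed."""
    return [pen_color if overlay == 1 else pixel
            for pixel, overlay in zip(image_row, overlay_row)]
```

Keeping the drawing data in a separate overlay rather than mutating the image pixels means the original image survives intact under the strokes.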
  • function buttons 242 each having specific function allocated thereto are displayed.
  • a page operation area 250 is displayed.
  • On this area, a NEXT button 252, a PREVIOUS button 254 and a page number indication box 256 are displayed.
  • NEXT button 252 feeds the displayed page (image data) to the right and shows the next page.
  • PREVIOUS button 254 feeds the displayed page to the left and shows the previous page.
  • Page number indication box 256 indicates the page number of the currently displayed page among the plurality of pages to be displayed.
  • the position of page operation area 250 is fixed and it does not move even during scrolling.
  • the data for displaying page operation area 250 on display unit 114 may be stored in an overlay area separate from the overlay area for drawing.
  • Functions allocated to function buttons 242 may include: a function of drawing by a touch operation (allowing selection of types of pens for drawing); a function of opening a file (image data) saved in storage unit 108 ; an erasure function of deleting a drawing in a prescribed area; a function of saving displayed image data in storage unit 108 ; and a function of printing displayed image data.
  • Each function button 242 is displayed as an icon.
  • CPU 102 determines whether or not touch detecting unit 112 is touched. As described above, CPU 102 determines whether or not coordinate data is received from touch detecting unit 112 . Position coordinates (X coordinate, Y coordinate) of the touched point are output in the order of touching, from touch detecting unit 112 . If it is determined that there is a touch, the control proceeds to step 302 . Otherwise, the control proceeds to step 306 .
  • CPU 102 stores coordinate data that have been received for a prescribed time period (coordinate data of touched points) in RAM 106 in a manner that represents the order of reception.
  • CPU 102 determines whether or not the program is to be ended. When an end button as one of the function buttons 242 is pressed, CPU 102 ends the program. Otherwise, the control returns to step 300 , and CPU 102 waits for a touch.
  • CPU 102 determines whether or not the touch detected at step 302 is a multi-touch (whether a plurality of points are touched simultaneously). Specifically, CPU 102 calculates, among the plurality of coordinate data stored at step 302 , the distance between the first received coordinate data and the coordinate data received next. If the distance is a prescribed value or larger, CPU 102 determines that it is a multi-touch, and if the distance is smaller than the prescribed value, determines that it is a single touch.
  • The touch detection interval (period) is short, and the touching of each point involves a slight time difference; therefore, even in the case of a multi-touch, coordinate data are output in the order of touching from touch detecting unit 112 . Thus, by calculating the distance between the coordinate data received first and each of the coordinate data received second and thereafter, it is possible to determine whether the touch was a single touch or a multi-touch.
  • the number of points touched simultaneously in the multi-touch operation is not limited to two and may be three or more. Therefore, distances are calculated for a plurality of continuous points.
  • the distance used as a reference for determining a multi-touch may be set appropriately. What is necessary is that the reference distance is larger than the distance the user normally moves a touched point within one detection interval. If it is determined to be a multi-touch, the control proceeds to step 320 . If it is determined not to be a multi-touch, the control proceeds to step 310 .
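The single/multi-touch determination described above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the function name and the reference distance are assumptions, and the embodiment only requires the reference distance to exceed the distance a single touched point normally moves within one detection interval.

```python
# Sketch of the single/multi-touch determination described above.
MULTI_TOUCH_DISTANCE = 50.0  # assumed reference distance, in touch-panel units

def is_multi_touch(points):
    """points: (x, y) tuples in the order received from the touch
    detecting unit. A large jump from the first point to a point
    received thereafter indicates a second finger rather than
    movement of a single finger."""
    if len(points) < 2:
        return False
    x0, y0 = points[0]
    for x, y in points[1:]:
        # Distance between the first received coordinate data and a
        # later one; at or above the reference distance => multi-touch.
        if ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 >= MULTI_TOUCH_DISTANCE:
            return True
    return False
```

Because distances are checked against several points received after the first, this also covers a multi-touch with three or more points, as the text notes.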
  • CPU 102 draws an image in accordance with the track of movement of the touched point, on the image displayed on display unit 114 .
  • CPU 102 reads the plurality of coordinate data stored at step 302 from RAM 106 , and writes prescribed data on corresponding points in the overlay area of VRAM 118 and on points on lines connecting these points in order. Consequently, the image data and the drawing data (data in the overlay area) are combined and displayed on display unit 114 , as if drawing is done on the image. Since the process at step 312 is repeated as described later, the overlay area is overwritten with existing drawing data left as it is.
  • CPU 102 temporarily stores the contents of drawing in RAM 106 . What is required is that the drawing can be reproduced and, hence, the method of storage may appropriately be selected.
  • the coordinate data stored in RAM 106 at step 302 with the order maintained may be retained as they are.
  • the data in the overlay area of VRAM 118 may be stored as two-dimensional image data in RAM 106 .
  • CPU 102 determines whether or not the touch is maintained. Specifically, CPU 102 determines whether or not coordinate data are continuously received from touch detecting unit 112 . For instance, if touch pen 220 or the user's finger moves away, reception of coordinate data from touch detecting unit 112 stops. If it is determined that the touch is maintained, control returns to step 302 , and the process following step 302 is repeated. If it is not determined that the touch is maintained, the control returns to step 300 and CPU 102 again waits for a next touch.
  • CPU 102 re-stores the coordinate data that have been stored in RAM 106 at step 302 as coordinate data corresponding to each multi-touch position, that is, for each track.
  • The coordinate data of a first touched point is hereinafter also referred to as a “start point”. It is possible to determine which of the plurality of coordinate data other than the start points stored in RAM 106 corresponds to the track of which start point, based on the distance from each start point.
  • the coordinate data of the start point and the coordinate data of points on the track of movement of the start point are stored as sequence data, in the order of detection.
  • sequence data 400 shown on the left side of FIG. 5 are stored in RAM 106 .
  • Here, (x 0 , y 0 ) and (x 1 , y 1 ) are assumed to be the coordinates of the two points that were touched first, the point (x 0 , y 0 ) having been touched a little earlier than the point (x 1 , y 1 ).
  • two data sequences having coordinate data 410 of the first start point and coordinate data 420 of the second start point as heads, respectively, such as shown on the right side of FIG. 5 are stored in RAM 106 .
  • Coordinate data 410 of the start point and the following data sequence 412 represent one track.
  • Coordinate data 420 of the start point and the following data sequence 422 represent another track.
  • the manner of storage is similar if the number of multi-touch points is three or more.
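The re-storing step can be sketched as below. The helper name is hypothetical; assigning each point to the track whose most recent point is nearest is a practical refinement of the start-point distance rule described above, and works for two or more touch points.

```python
def split_tracks(points, num_touches):
    """Re-store an ordered multi-touch coordinate stream as one data
    sequence per track, each headed by its start point."""
    # The first num_touches coordinate data received are the start points.
    tracks = [[p] for p in points[:num_touches]]
    # Append each later point to the track whose latest point is nearest.
    for x, y in points[num_touches:]:
        nearest = min(tracks,
                      key=lambda t: (t[-1][0] - x) ** 2 + (t[-1][1] - y) ** 2)
        nearest.append((x, y))
    return tracks
```

For the two-finger example of FIG. 5, the interleaved stream separates into two sequences headed by coordinate data 410 and 420 respectively.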
  • CPU 102 draws an image on the image displayed on display unit 114 , in accordance with the track of touched points as at step 310 .
  • CPU 102 draws using the coordinate data re-stored in RAM 106 as shown, for example, in FIG. 5 . All tracks of simultaneously touched points may be drawn, or a track or tracks of only one or some of the points (for example, a track of the earliest-touched point) may be drawn.
  • FIG. 6 shows an example in which drawings 500 are made and, thereafter, a user 510 flicked to the right while multi-touching with two fingers. By this operation, erroneous drawing 502 is made on the image. In FIG. 6 , only a drawing corresponding to one of the two tracks is displayed. If step 312 has already been executed and drawing with single touch has been made, the contents of drawing by the multi-touch are added to the drawn contents.
  • CPU 102 temporarily stores the contents drawn at step 322 in RAM 106 .
  • The specific method is the same as that of step 312 . If the coordinate data re-stored at step 320 are to be retained as they are, then, unlike step 312 , only the coordinate data representing the track drawn at step 322 may be retained.
  • CPU 102 determines whether the multi-touch is maintained. Specifically, CPU 102 determines whether or not a plurality of coordinate data are continuously received from touch detecting unit 112 and whether these coordinate data correspond to the tracks of a plurality of start points determined at step 320 .
  • Assume that the user first touched with two fingers and then moved one finger away. Then, the track that had been made by the removed finger is lost, and the coordinate data received from touch detecting unit 112 come to be only the coordinate data representing one track.
  • If it is determined that the multi-touch is maintained, the control proceeds to step 328 . If it is not determined that the multi-touch is maintained, the control proceeds to step 314 , and CPU 102 determines whether or not a single touch is maintained, as described above.
  • CPU 102 determines whether or not the detected multi-touch operation is an operation allocated to scrolling (an operation designating a scroll). Specifically, for each sequence of coordinate data corresponding to the tracks of the plurality of touch points stored at step 320 , CPU 102 determines a vector from the coordinate data of the start point to the coordinate data of the last point, and determines whether the vector is of a prescribed length or longer and whether the vector is in the positive direction along the X axis (whether X component is positive). If the vector is in the positive direction along the X axis, the detected multi-touch operation is determined to be an operation allocated to a scroll to the right.
  • If the vector is in the negative direction along the X axis, the detected multi-touch operation is determined to be an operation allocated to a scroll to the left. If it is determined to be an operation allocated to a scroll, the control proceeds to step 330 . Otherwise, the control returns to step 320 , and the process following step 320 is repeated until the vector reaches the prescribed length.
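The vector test at step 328 might look like the following sketch; the prescribed length is an assumed constant, and the names are illustrative.

```python
SCROLL_LENGTH = 80.0  # assumed "prescribed length" for a scroll gesture

def detect_scroll(track):
    """Return 'right', 'left', or None for one track, using the vector
    from the start point to the last point as described above."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    dx, dy = x1 - x0, y1 - y0
    if (dx * dx + dy * dy) ** 0.5 < SCROLL_LENGTH:
        return None  # vector still too short: keep monitoring the track
    # Positive X component => scroll right; otherwise scroll left.
    return 'right' if dx > 0 else 'left'
```

Checking each track of the multi-touch in this way corresponds to repeating the process following step 320 until the vector reaches the prescribed length.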
  • CPU 102 deletes the coordinate data representing the track of the touched point stored in RAM 106 at step 324 . Specifically, CPU 102 deletes the drawing data written after the determination of multi-touch (for example, by writing data “0”), while maintaining in the overlay area the drawing data written before the determination of multi-touch (for example, drawing data drawn with a single touch). Therefore, on the screen image of display unit 114 , the erroneous line drawn by the multi-touch operation is erased.
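As a sketch of this erasure, assume the overlay is modeled as a set of drawn pixels and the points written after the multi-touch determination were recorded separately (as at step 324); both the representation and the names are assumptions for illustration.

```python
def erase_erroneous_drawing(overlay, multi_touch_points):
    """Clear only the pixels written after the multi-touch
    determination; pixels drawn earlier (e.g. by single touch) stay."""
    for p in multi_touch_points:
        overlay.discard(p)  # analogous to writing data "0" at point p
    return overlay
```

Only the erroneous line disappears from the combined display, while the intended drawing in the overlay area survives and can be saved.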
  • FIG. 7 shows a displayed screen image on display unit 114 during a scroll. An arrow 520 represents a scroll to the right. In FIG. 7 , drawing 500 formed by the user and displayed in FIG. 6 is maintained, while erroneous drawing 502 is erased. Since FIG. 7 shows a state in which scroll to the right has already been done to some extent, part of the drawing on the upper right portion is not shown.
  • CPU 102 executes the right or left scroll, in accordance with the result of determination at step 328 .
  • CPU 102 saves the contents of drawing (with the erroneous drawing erased) that have been temporarily stored in RAM 106 at steps 312 and 324 in storage unit 108 , and the control returns to step 300 . Since the contents drawn in the overlay area are saved in storage unit 108 , the image with drawing 500 intended by the user can again be displayed if a left scroll is designated later.
  • the background may be a uniform color image (for example, white, black or gray), provided that a function of drawing in response to a touch operation on touch detecting unit 112 is realized.
  • The function to be allocated is not limited to a left/right scroll: it may be an upward/downward scroll, a scroll in a diagonal direction, or a page switch without scroll. Further, the function to be allocated may be any function (operation) other than drawing, such as one allocated to a function button. If an operation allocated to a scroll in a direction other than the left/right direction is to be detected, the scrolling direction may be determined considering not only the X component but also the Y component of the vector.
  • It is possible for a plurality of users to touch electronic blackboard apparatus 100 , each with a touch pen or a finger, and to draw a plurality of images simultaneously.
  • In that case, what is necessary is to determine whether the distance between points touched simultaneously is about the distance between the fingers of one hand (for example, at most a few centimeters).
  • The operation to which a function other than drawing is allocated is not limited to a multi-touch operation; it may be a single-touch operation, provided that the operation can be distinguished from the touch operation normally conducted for drawing.
  • the allocated operation other than drawing may be executed.
  • While in the embodiment above CPU 102 determines whether or not an operation is a multi-touch operation, a microcomputer in touch detecting unit 112 may make the determination and transmit the result to CPU 102 .
  • An electronic blackboard apparatus in accordance with the second embodiment of the present invention has the same configuration as that shown in FIG. 1 representing the electronic blackboard apparatus in accordance with the first embodiment.
  • the method of detecting a touch input and the displayed screen images of the electronic blackboard apparatus in accordance with the second embodiment are also the same as those described with reference to FIGS. 2 and 3 regarding the electronic blackboard apparatus in accordance with the first embodiment. Therefore, in the following, the electronic blackboard apparatus in accordance with the second embodiment will be described as “electronic blackboard apparatus 100 ”, referring to FIGS. 1 to 3 where appropriate.
  • A program which, when executed by CPU 102 shown in FIG. 1 , realizes the page switch function of electronic blackboard apparatus 100 in accordance with the second embodiment using the hardware shown in FIG. 1 has the following control structure.
  • the program includes steps 600 , 602 and 604 .
  • CPU 102 monitors an output of touch detecting unit 112 , and in response to an output detecting a touch by the user from touch detecting unit 112 , the control flow proceeds to the next step.
  • CPU 102 again detects a new position of touching by a finger based on an output from touch detecting unit 112 , and the control proceeds to the next step.
  • CPU 102 determines whether or not the touch by the user is a multi-touch made with two or more fingers.
  • Touch detecting unit 112 has a function of outputting coordinate data whose number corresponds to the number of touches, in response to the touching by the user. Therefore, the determination described above can be made based on the outputs from touch detecting unit 112 .
  • the program further includes a step 606 .
  • CPU 102 stores, if the determination at step 604 is positive, the touch positions detected at steps 600 and 602 in RAM 106 (see FIG. 1 ). Since a multi-touch is detected, here, coordinate values corresponding in number to the number of detected touches are stored. In the subsequent process, the coordinate data of each finger are detected for every detected touch position, and the series of data are stored as a sequence in RAM 106 .
  • the program further includes steps 608 , 610 and 630 .
  • CPU 102 detects the present positions of fingers from the outputs of touch detecting unit 112 .
  • CPU 102 determines whether any one of the multi-touch detection outputs is lost (whether or not any finger has been moved away from touch detecting unit 112 ). If the determination at step 610 is positive, at step 630 , CPU 102 compares the X coordinate of the last detected finger position with the X coordinate of the first finger position stored at step 606 , and determines whether or not an absolute value of difference between the two coordinates is larger than a threshold value D TH1 .
  • The direction of page movement is along the lateral direction of drawing area 230 ( FIG. 3 ) and, therefore, the X coordinates of finger positions are compared as described above.
  • If the determination at step 630 is positive, at step 632 , the page displayed in drawing area 230 is changed in accordance with the direction of finger movement. Specifically, if the direction of finger movement is to the left in FIG. 3 , the next page of the current page is displayed on drawing area 230 , and if it is to the right, the previous page of the current page is displayed on drawing area 230 . If the determination at step 630 is negative, at step 634 , CPU 102 cancels scrolling of the image displayed in drawing area 230 . In the present embodiment, if fingers are slid to the left/right in the state of multi-touch, the image scrolls to the left/right correspondingly.
  • At step 634 , the display on drawing area 230 is returned from the scrolled image to the display of the original page.
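The decision taken when a finger leaves the panel (steps 630 to 634) can be sketched as below. The threshold value and the names are assumptions; which page (next or previous) is actually shown for each direction follows the mapping stated in the text.

```python
D_TH1 = 40.0  # assumed value for threshold D TH1

def on_finger_released(start_x, last_x):
    """Compare the X travel of the released finger with D TH1, then
    either feed the page in the direction of movement (step 632) or
    cancel the scroll and restore the original page (step 634)."""
    dx = last_x - start_x
    if abs(dx) <= D_TH1:
        return 'cancel_scroll'  # step 634: too short, restore the page
    return 'feed_page_left' if dx < 0 else 'feed_page_right'  # step 632
```

A small accidental slide thus cancels cleanly, while a deliberate slide past the threshold switches the page.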
  • At step 612 , CPU 102 determines whether or not the finger positions detected at step 608 are all to the left when viewed from the finger positions recorded at step 606 . If the determination is positive, at step 614 , CPU 102 calculates the amount of movement (the absolute value of the difference in X coordinate values), and determines whether or not the value is larger than a threshold value D TH2 .
  • the threshold value D TH2 here is larger than the threshold value D TH1 at step 630 .
  • Here, the amount of movement is the average value of the amounts of movement of all multi-touch finger positions.
  • If the determination at step 614 is positive, at step 616 , CPU 102 sets the screen image to be displayed on drawing area 230 to the image of the previous page. Specifically, if the current image is of the second page, the image of the first page is displayed on drawing area 230 by the process of step 616 . Thereafter, the control flow returns to step 600 .
  • If the determination at step 614 is negative, at step 618 , CPU 102 scrolls the screen image displayed on drawing area 230 to the left by the same length as the amount of movement of finger positions calculated at step 614 . At this time, if there is a previous page, the left end of the page is displayed at the right side of drawing area 230 . Then, the control flow proceeds to step 608 .
  • If the determination at step 612 is negative, at step 620 , CPU 102 determines whether or not the finger positions detected at step 608 are to the right when viewed from the finger positions recorded at step 606 . If the determination is positive, at step 622 , CPU 102 calculates the amount of movement (the absolute value of the difference in X coordinate values), and determines whether or not the value is larger than a threshold value D TH3 .
  • The threshold value D TH3 here is the same as the threshold value D TH2 at step 614 .
  • If the determination at step 622 is positive, at step 624 , CPU 102 sets the screen image to be displayed on drawing area 230 to the image of the next page. Specifically, if the current image is of the second page, the image of the third page is displayed on drawing area 230 . Thereafter, the control flow returns to step 600 . On the other hand, if the determination at step 622 is negative, at step 626 , CPU 102 scrolls the screen image displayed on drawing area 230 to the right by the same length as the amount of movement of finger positions calculated at step 622 . At this time, if there is a next page, the right end of the page is displayed at the left side of drawing area 230 . Then, the control flow proceeds to step 608 .
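Steps 612 to 626, executed while the multi-touch is held, amount to one dispatch on the signed X travel; the following sketch uses assumed threshold values (with D TH3 equal to D TH2, as the text states) and hypothetical names, and follows the page directions of steps 616 and 624.

```python
D_TH2 = 120.0  # assumed value; threshold for a leftward page change
D_TH3 = D_TH2  # the text states D TH3 equals D TH2

def while_sliding(start_x, current_x):
    """While fingers stay down: a large leftward travel shows the
    previous page (step 616), a large rightward travel shows the next
    page (step 624), and anything smaller scrolls the image by the
    travel amount (steps 618 and 626)."""
    dx = current_x - start_x
    if dx < -D_TH2:
        return ('show_page', 'previous')  # step 616
    if dx > D_TH3:
        return ('show_page', 'next')      # step 624
    return ('scroll', dx)                 # steps 618 / 626
```

In practice `start_x` and `current_x` would be the averaged multi-touch finger positions recorded at steps 606 and 608.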
  • If the determination at step 620 is negative, at step 628 , CPU 102 executes a predetermined process in accordance with the values of the X and Y coordinates of finger positions or the history thereof.
  • The process that takes place at step 628 may include a process of updating the display when a specific button is pressed with multi-touch, when a specific plurality of buttons are pressed with multi-touch, when a so-called pinch-out is done with the space between fingers made wider, or when a so-called pinch-in is done with the space between fingers made narrower.
  • At step 628 , such a process is done. Since the contents of processing at step 628 are not related to the present invention, a detailed description thereof will not be given here, for simplicity of description.
  • Thereafter, the control flow returns to step 608 .
  • The control structure of the routine realizing the process for a multi-touch operation is as described above.
  • If it is determined at step 604 that only one finger position is detected and the operation is not a multi-touch operation, the process following step 636 is executed.
  • At step 636 , CPU 102 determines whether or not the finger has moved away from drawing area 230 of touch detecting unit 112 .
  • If the determination at step 636 is positive, at step 638 , based on the finger position at the start of touching detected at step 600 and the finger position immediately before the finger was detected to have moved away, CPU 102 executes a process in accordance with these positions.
  • the process when function button 242 , NEXT button 252 or PREVIOUS button 254 shown in FIG. 3 is pressed corresponds to the process that is executed here. Thereafter, the control flow returns to step 600 , and the output of touch detecting unit 112 is monitored until the next touch is detected.
  • If the determination at step 636 is negative, at step 640 , CPU 102 determines whether or not the detected finger position is in drawing area 230 . If the determination at step 640 is positive, at step 642 , CPU 102 draws an image in accordance with the drawing settings at that time at the finger position. Thereafter, the control flow returns to step 602 . If the determination at step 640 is negative, drawing is unnecessary and, therefore, the control flow directly returns to step 602 .
  • Electronic blackboard apparatus 100 operates in the following manner. In the following, the operation of electronic blackboard apparatus 100 related mainly to the page feed will be described. Operations of portions not specifically related to the present invention will not be described here.
  • Assume that the third page of a document is displayed on the screen of display unit 114 .
  • a user 680 touches the screen surface of display unit 114 with two fingers, and slides the two fingers to the right as represented by an arrow 682 , with the fingers kept in touch with the surface.
  • the amount of sliding here is a distance D 1 , which is smaller than the threshold value D TH1 .
  • the control flow of the program at this time will be described.
  • At step 600 , the first finger position is detected and, thereafter, the next finger position is detected at step 602 .
  • The operation is a multi-touch operation and, therefore, the determination at step 604 is YES.
  • the first touched positions of two fingers are stored in RAM 106 at step 606 .
  • At step 608 , current finger positions are again detected. Assuming that the user continues sliding as shown in FIG. 9 , the determination at step 610 is negative. Therefore, at step 612 , whether or not the two finger positions are moving to the left is determined. Here, the two finger positions are both moving to the right. Therefore, the result of determination is negative. As a result, the control proceeds to step 620 .
  • At step 620 , whether or not the finger positions are moving to the right is determined. In the state shown in FIG. 9 , the result of determination is positive. Therefore, next, at step 622 , whether or not the amount of movement of the fingers is larger than the threshold value D TH3 is determined. At the beginning of the repetition, the amount of movement is smaller than the threshold value. Therefore, the result of determination is negative. As a result, at step 626 , the screen image is displayed scrolled to the right by the amount equal to the amount of movement of the fingers (the average of the amounts of movement of the two finger positions). Thereafter, the control returns to step 608 and the next repetition starts.
  • When the distance D 1 of finger movement is smaller than the threshold value D TH1 , the display on display unit 114 is as shown in FIG. 10 : the screen image is scrolled to the right by the same distance as D 1 , and a right end portion 696 of the image of the next page is displayed.
  • If the determination at step 630 is positive, at step 632 , a process for feeding the screen image by one page to the right is executed. Specifically, as shown by an arrow 720 in FIG. 11 , the image of the next page is scrolled to the right by one page, and the next page is displayed on display unit 114 as shown in FIG. 12 .
  • Here, the screen image before movement is the third page, and the image after movement is the fourth page.
  • Thereafter, the control proceeds to step 600 , and CPU 102 executes the process of monitoring the output of touch detecting unit 112 until the user next touches the screen image.
  • Assume that user 680 further continues sliding to the right and the distance D 3 becomes larger than the threshold value D TH2 .
  • the result of determination at step 622 becomes positive, and at step 624 , the screen image is fed to the right by one page. Specifically, as shown in FIG. 14 , the screen image of the fourth page is displayed on display unit 114 . Thereafter, the control returns from step 624 to step 600 , and the next position of touching by the user is detected. If the user 680 continues sliding of his/her fingers to the right as shown by an arrow 760 , this sliding of the user is detected as a new touch at step 600 . Therefore, through the process of steps 600 , 602 and 604 , the operation described above is repeated, using the position touched by the user 680 at the time point of page switching as a head position.
  • When an upward/downward sliding, a pinch-out or a pinch-in takes place with multi-touch, the control flows through steps 610 , 612 , 620 and 628 of FIG. 8 while the sliding is being done, and the process in accordance with the user operation is executed. If any operation is done with a single touch, the control proceeds from step 604 to step 636 . Until the finger is moved away, the control proceeds from step 640 to step 642 if the operation is in drawing area 230 , and by the process at step 642 , an image is drawn at the finger position. If the operation is out of drawing area 230 , nothing happens in the present embodiment.
  • When the finger is moved away, at step 638 , based on the first detected finger position and the finger position immediately before the finger was moved away, a process in accordance with these positions is executed. If the user touched any of function buttons 242 and then moved the finger away, a predetermined process corresponding to the touched function button 242 is executed at step 638 .
  • In summary, the operation is as follows. During sliding, the screen image is scrolled in accordance with the amount of sliding. When the fingers are removed after sliding by the distance D 1 and the distance D 1 is equal to or smaller than the threshold value D TH1 , the scroll is cancelled and the screen image before sliding is restored. If the distance D 1 is larger than the threshold value D TH1 , the screen image is moved by one page. In the second embodiment described above, the next page is displayed when slid to the right, and the previous page is displayed when slid to the left.
  • When fingers are slid in the state of multi-touch, the screen image is scrolled. Therefore, it is possible for the user to intuitively understand that the screen image can be scrolled by a multi-touch operation. If the touching fingers are removed in the course of multi-touch sliding and the amount of sliding is small, the scroll is canceled and the original screen image is restored. Even if a multi-touch sliding operation is done erroneously during writing on the screen image, the screen image returns to the normal state immediately when the fingers are moved away. Therefore, even when an erroneous operation is done, the influence on the writing operation is small. On the other hand, if the fingers are moved away after sliding over a certain distance, switching to the next page (page feed) occurs.
  • Thus, the page feed can be realized by an intuitive multi-touch sliding operation. If sliding is further continued with the fingers kept in touch with the screen, page feed is automatically executed while sliding continues, and the sliding operation can be continued further. Therefore, advantageously, the operation of continuous page feed is made simple.
  • In the embodiment above, it is determined at step 610 of FIG. 8 that sliding has ended if even one of the multi-touching fingers is moved away from the input screen.
  • the present invention is not limited to such an embodiment. It may be determined that sliding ended only when all fingers are moved away. In other words, if it is a multi-touch operation when sliding starts, sliding may be continued thereafter even when the multi-touching is lost, and similar effects as in the embodiment above can be attained.
  • In the embodiment above, the determination as to whether an operation is a multi-touch operation or not is made at step 604 of FIG. 8 , and if it is determined not to be a multi-touch operation, a process different from the page switching or scroll process, such as drawing, is executed.
  • the present invention is not limited to such an embodiment.
  • For example, the process of page switching or scroll may be executed only when the operation is a multi-touch with three or more fingers, and if touched by one or two fingers, a process other than page switching or scroll, such as drawing, may be executed even if it is a multi-touch with two fingers.
  • More generally, the process of page switching or scroll may be executed only when the operation is a multi-touch with N fingers, and a process other than page switching or scroll, such as drawing, may be executed for a touch with N−1 or fewer fingers, with N being an integer larger than 1.
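The N-finger variant above amounts to a small dispatch; the function name and the default value of N are illustrative assumptions.

```python
def choose_process(touch_count, n=3):
    """Route the touch to page switching/scroll only when it is an
    N-finger multi-touch; fewer fingers are treated as drawing."""
    return 'page_or_scroll' if touch_count >= n else 'draw'
```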
  • use of the average value of amounts of movement of a plurality of fingers in the second embodiment is not limiting.
  • the amount of movement may be calculated based on only the first detected one finger position.
  • threshold values D TH1 and D TH2 mentioned above may be changed in accordance with the speed of movement of finger positions during sliding.
  • the direction of sliding and page turning may be the upward/downward direction (Y axis direction). Sliding may be done in consideration of both X and Y axes directions. In other words, the present invention is applicable to an embodiment allowing sliding in a diagonal direction.
  • the first and second threshold values may be determined separately for the X axis direction and Y axis direction, or determined to be the same regardless of the direction.
  • The length of sliding need not be divided into components in the X axis direction and Y axis direction; whether a page is to be turned or not may be determined based on the length of the track of sliding (the distance between the start point and the end point).
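Such a direction-independent test would compare the straight-line length of the slide with a single threshold; this small helper (name assumed) sketches the length calculation.

```python
def slide_length(start, end):
    """Straight-line distance between a slide's start and end points,
    for a page-turn test that does not split the motion into X and Y
    components."""
    return ((end[0] - start[0]) ** 2 + (end[1] - start[1]) ** 2) ** 0.5
```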
  • the present invention is not limited to such an embodiment.
  • a dedicated pen may be used.
  • any method may be used provided that touching at a plurality of positions can be detected.

Abstract

An electronic blackboard includes a display panel and a touch-panel, draws a line corresponding to a track formed by movement of the touched position on the displayed image, and erases an erroneously drawn line corresponding to a track specifying an operation other than drawing. In another aspect, an electronic blackboard includes a touch screen, a scroll unit for scrolling an image on the display screen when positions of multi-touch move in the same direction, and a page switching unit for detecting, after detection of multi-touch input, that a finger is moved away, and executing a process of returning the image on the display screen to a state before scrolling or a process of switching the image by one page in accordance with the moving direction of multi-touch positions, depending on whether the amount of movement of touch inputs is equal to or smaller than a threshold value.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This nonprovisional application claims priority under 35 U.S.C. §119(a) on Patent Applications No. 2011-027236 and No. 2011-027237 filed in Japan on Feb. 10, 2011, the entire contents of which are hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an image display apparatus having a display device and an input device integrated together, allowing touch-input. More specifically, the present invention relates to a touch drawing display apparatus that has functions of drawing operation and other operation regarding touch operations, and operation method thereof, as well as to an image display apparatus switching images page by page by touching and sliding images with a plurality of fingers and the like and a controller for the display apparatus.
  • 2. Description of the Background Art
  • As an image display apparatus allowing touch-input, a so-called electronic blackboard having a display device with a large screen, allowing drawing of an image or other processes upon detection of touching of the display device by the user has been known. The electronic blackboard is useful when summarizing opinions of participants or finding a preferable solution to a problem, for example, at a meeting. Electronic blackboards having various configurations have come to be practically used, and one configured as a computer system having a combination of a display device with a large display screen and an input device for detecting two-dimensional position coordinates such as a touch-panel has been used.
  • Generally, an electronic blackboard apparatus successively reads pieces of information related to position coordinates designated by a pen or a finger and pieces of information related to amount of movement, and displays a track of inputs on the display device based on the read pieces of information. Consequently, the apparatus can realize operations as an electronic blackboard such as handwriting input.
  • Regarding the touch operation, Japanese Patent Laying-Open No. 6-175776 (hereinafter denoted as '776 Reference) discloses a projection type presentation device allowing drawing on a displayed screen image using touch operation. There is a problem that recognition of an original image becomes difficult when writing is repeated on the same screen image. In order to solve this problem, in the device disclosed in this reference, drawing is done while touch operation is continued in an image forming area, and when the touch operation in the image forming area once ends and then a touch operation in the image forming area is detected next, the former image is automatically deleted.
  • If a specific function other than drawing is allocated to a touch operation different from the simple touch operation for drawing in an electronic blackboard, then, while an image is being drawn in a drawing mode, it takes some time until a touch operation is determined to be the one for executing the specific function. Typically in an electronic blackboard, drawing is done immediately after the touching, so as not to cause any time lag between the touch operation and drawing, since any time lag is stressful for the user. Such a configuration, however, sometimes leads to an erroneous drawing not intended by the user. By way of example, it is often the case that an operation of flicking with multi-touch (touching a plurality of points simultaneously) is allocated to an operation of scrolling screen images. In such a situation, if the user is drawing images with single-touch operation (touching one point at a time) and then makes a multi-touch flick operation for scrolling screen images, it is necessary to monitor the tracks of touched points for a prescribed time period to determine that it is a flick operation. As the image is drawn immediately in response to the touching to avoid any stress to the user, it follows that an erroneous image not intended by the user has been drawn by the time the operation is determined to be the multi-touch flick operation. While the image with the erroneous drawing disappears from the screen when the screen images are scrolled, drawing information containing the erroneous drawing is stored as it is. Therefore, when the display is scrolled back to the original screen image, the erroneous drawing again appears. Similarly, when printing is done, the erroneous drawing is also printed.
  • This problem is not limited to the electronic blackboard, and it occurs in display devices allowing display of images drawn by touching, such as tablet-type terminals. This problem cannot be solved by the technique disclosed in '776 Reference.
  • Advantages of an electronic blackboard include that it is possible to display or write (draw) images separately on a plurality of screen images. Each unit of display of such images is referred to as a “page” as an analogy to a book.
  • In an electronic blackboard allowing display of a plurality of pages of images, how to switch pages easily is an issue. The easiest method may be to provide a button for switching pages on the screen image. This method, however, is problematic particularly when the screen image is large, as it becomes difficult for the user to press the button depending on the position of the user.
  • In view of the foregoing, other than the method using the button, a method of switching the screen images by touching a screen image with a finger and sliding it has been proposed. If the switching operation is too simple, however, it cannot be distinguished from the usual drawing operation. Therefore, in order to distinguish the two different operations, Japanese Patent Laying-Open No. 11-102274 (hereinafter referred to as '274 Reference) proposes a device in which the screen image is switched if the screen image is touched by a plurality of fingers and the like (a so-called multi-touch) and the plurality of touched positions are slid by more than a prescribed value in the same direction.
  • Japanese Patent Laying-Open Nos. 8-76926 (hereinafter referred to as '926 Reference) and 9-231004 (hereinafter referred to as '004 Reference) disclose methods as further development of the idea described above.
  • According to the method described in '926 Reference, when an operation of touching and sliding is done, a page is turned, and the number of pages turned at one time is changed depending on the number of touching fingers (the number of touched positions). According to the method disclosed in '004 Reference, scrolling of images, turning of pages and moving of cursors are executed in accordance with the movement and number of fingers touching the display screen surface.
  • According to the techniques disclosed in '274, '926 and '004 References described above, it is possible to turn pages by the simple operation of touching and sliding. There is still a problem to be solved. Specifically, pages may be turned erroneously while normal input is being done. Particularly when input is being done using a finger, it is possible that a finger other than the finger used for input happens to touch the screen surface and is detected as the multi-touch slide input and, as a result, a page is turned unintentionally. Therefore, a mechanism that can prevent unintended turning of a page even when such an erroneous input is made has been desired. Further, a mechanism that allows the user to easily understand what manner of input is necessary to turn a page is also necessary.
  • SUMMARY OF THE INVENTION
  • In view of the problems described above, it is desired to provide a touch drawing display apparatus and operation method thereof that can maintain correct drawing of images even when a touch operation other than the image drawing is made in the drawing mode.
  • Further, it is desired to provide an image display apparatus allowing touch-input that allows the user to intuitively understand the method of page-by-page switching of screen images using multi-touch and that can prevent any trouble of display when an erroneous operation is made on such an occasion.
  • Further, it is desired to provide an image display apparatus allowing touch-input that allows the user to intuitively understand the method of page-by-page switching of screen images using multi-touch, and of which switching operation is easy.
  • According to a first aspect, the present invention provides a touch drawing display apparatus, including: a display unit that displays an image; a detecting unit that is arranged on the display unit and detects a touched position; a drawing unit that draws, on an image displayed on the display unit, a line corresponding to a track formed by movement of the detected position; and an erasing unit that erases, when an operation other than drawing is specified by the track, a line drawn from a start point to an end point of the track specifying the operation.
  • Preferably, the track for specifying the operation other than drawing is a track formed by a multi-touch operation of simultaneously touching a plurality of positions of the detecting unit.
  • According to a second aspect, the present invention provides a method of operating a touch drawing display apparatus that includes a display unit that displays an image and a detecting unit that is arranged on the display unit and detects a touched position, including the steps of: drawing, on an image displayed on the display unit, a line corresponding to a track formed by movement of the detected position; determining whether or not an operation other than drawing is specified by the track while the drawing step is being executed; and erasing, if an operation other than drawing is specified by the track, a line drawn from a start point to an end point of the track specifying the operation.
  • As described above, according to the present invention, when a touch operation other than that for drawing images is made in the drawing mode, the erroneous image drawn during the determination of the touch operation other than that for drawing is not retained, and the drawn image as intended by the user can be stored. Therefore, when the object image is displayed again, the erroneous image is not displayed and only the intended image is displayed.
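The behavior described above, of drawing immediately on touch and erasing the drawn line afterwards if the track turns out to specify an operation other than drawing, can be sketched as follows. This is an illustrative Python sketch, not part of the specification; the class and method names are assumptions.

```python
class DrawingCanvas:
    """Minimal model of the draw-then-erase behavior (names are illustrative)."""

    def __init__(self):
        self.committed = []   # strokes confirmed as intentional drawing
        self.pending = []     # points drawn since the current touch began

    def touch_move(self, point):
        # Draw immediately so the user perceives no time lag.
        self.pending.append(point)

    def touch_end(self, was_gesture):
        if was_gesture:
            # The track specified an operation other than drawing (e.g. a
            # multi-touch flick): erase the whole line drawn from the start
            # point to the end point of that track.
            self.pending.clear()
        else:
            # Ordinary drawing: retain the stroke in the stored drawing data.
            self.committed.append(list(self.pending))
            self.pending.clear()
```

Because only `committed` is retained as drawing information, redisplaying or printing the page shows just the intended image, matching the effect described above.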
  • According to a third aspect, the present invention provides an image display apparatus allowing touch-input, including: a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on the display screen; a scroll unit that scrolls, when a plurality of touch inputs are detected by the touch detecting unit and their positions on the display screen move in one same direction, an image displayed on the display screen along with movement of positions of the plurality of touch inputs; and a first page switching unit that detects, after the plurality of touch inputs are detected by the touch detecting unit, decrease in the number of touch inputs from an output of the touch detecting unit and, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of returning the image on the display screen to a state before scrolling and a process of switching the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs.
  • When touching of the display screen surface with a plurality of fingers, that is, so-called multi-touch, is detected by the touch detecting unit and the touched positions move in the same direction, the scroll unit scrolls the screen image in that direction. Therefore, the user can intuitively understand that the screen image can be scrolled by a multi-touch sliding operation. On the other hand, if the amount of movement is not larger than the first threshold value when the number of multi-touch inputs decreases after the multi-touch sliding, the scrolling done up to that time point is undone and the image returns to the original state before scrolling. Therefore, even when the user erroneously touches the display screen surface with a plurality of fingers and slides them, the original screen image can be restored if the user becomes aware of the error and moves his/her fingers away immediately. If multi-touch sliding exceeding the first threshold value is done and the user then moves his/her fingers away from the screen, the screen image is switched by one page in accordance with the direction of sliding. Thus, the pages of the screen image can be switched by the simple operation of multi-touch sliding.
  • As a result, an image display apparatus allowing touch-input that allows the user to intuitively understand the method of page-by-page switching of screen images using multi-touch and that can prevent any trouble of display when an erroneous operation is made on such an occasion can be provided.
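The release-time decision described above, reverting short slides and switching by exactly one page for long ones, can be sketched as follows. The threshold value, the direction-to-page mapping, and the function name are illustrative assumptions, not from the specification.

```python
PAGE_SWITCH_THRESHOLD = 200  # pixels; stands in for the "first threshold value"

def on_multi_touch_release(scroll_distance, direction, current_page, num_pages):
    """Decide the display state when fingers leave the screen after a
    multi-touch slide.  Returns (page_to_display, scroll_offset)."""
    if abs(scroll_distance) <= PAGE_SWITCH_THRESHOLD:
        # Slide too short: treat it as an erroneous operation and return
        # the image on the display screen to the state before scrolling.
        return current_page, 0
    # Slide long enough: switch by one page in the direction of movement
    # (here, sliding left is assumed to reveal the next page).
    if direction == "left":
        new_page = min(current_page + 1, num_pages - 1)
    else:
        new_page = max(current_page - 1, 0)
    return new_page, 0
```

In either branch the scroll offset ends at zero: the display always settles on a whole page, either the original one or its neighbor.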
  • According to a fourth aspect, the present invention provides a controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen, and detects and outputs, to the outside, positions and the number of touch inputs that designate positions on the display screen. The controller for the display apparatus includes: a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by the display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on the display screen along with the movement of positions of the plurality of touch inputs, and applies the generated image to the display apparatus; and a page switching unit that detects, after the plurality of touch inputs are detected by the display apparatus, decrease in the number of touch inputs from an output of the display apparatus and, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of generating page image data for returning the image on the display screen to a state before scrolling and a process of generating page image data for switching the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs, and transmits the generated page image data to the display apparatus.
  • According to a fifth aspect, the present invention provides an image display apparatus allowing touch-input, including: a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on the display screen; a scroll unit that scrolls, when a plurality of touch inputs are detected by the touch detecting unit and their positions on the display screen move in one same direction, an image displayed on the display screen along with movement of positions of the plurality of touch inputs; a page switching unit that switches, in response to an amount of movement of the plurality of touch inputs exceeding a threshold value during scrolling of the image by the scroll unit, the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs; and a returning unit that detects, from an output of the touch detecting unit, decrease in the number of the plurality of touch inputs before page switching by the page switching unit during scrolling of the image by the scroll unit, and returns the scrolling of the image by the scroll unit to a state before scrolling.
  • The plurality of touched positions are detected by the touch detecting unit, and when these positions move in the same direction, the scroll unit moves the screen image along the direction of movement of the touched positions. Therefore, the user can intuitively understand that page switching is possible by the so-called multi-touch. Further, if the plurality of touched positions move in the same direction and the threshold value is exceeded, the display of screen image is switched in accordance with the direction of movement of touched positions in the course of scrolling. The screen image can be switched by the simple operation of so-called multi-touch. Further, for this purpose, what is necessary is simply to move the touched positions continuously without moving the fingers and the like away from the screen surface. Further, if the number of touch inputs decreases before page switching takes place, the screen image returns to the state before the scrolling and, therefore, the original state can easily be resumed even when an erroneous operation is made.
  • As a result, an image display apparatus allowing touch-input that allows the user to easily understand that page-by-page switching of screen images is possible by multi-touch input, and that allows such page-by-page switching to be done easily, can be provided.
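The mechanism of the fifth aspect, switching the page mid-scroll as soon as the threshold is exceeded and reverting if a touch is lost before that, can be sketched as follows. The class name, threshold value, and direction convention are illustrative assumptions.

```python
class MultiTouchPager:
    """Illustrative model of mid-scroll page switching (names are assumptions)."""

    THRESHOLD = 200  # pixels of multi-touch slide needed to switch a page

    def __init__(self, page):
        self.page = page
        self.offset = 0        # current scroll offset of the displayed image
        self.switched = False  # whether this slide has already switched a page

    def on_move(self, delta):
        # All touched positions move together; accumulate the slide distance.
        self.offset += delta
        if not self.switched and abs(self.offset) > self.THRESHOLD:
            # Switch by one page in the direction of movement, without
            # requiring the user to lift the fingers.
            self.page += 1 if self.offset < 0 else -1
            self.switched = True

    def on_touch_count_decrease(self):
        if not self.switched:
            # A finger was lifted before the page switched: treat the slide
            # as erroneous and return to the state before scrolling.
            self.offset = 0
```

Note the contrast with the release-time variant: here the switch fires during the slide, so the multi-touch sliding operation never has to stop.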
  • According to a sixth aspect, the present invention provides a controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen, and detects and outputs, to the outside, positions and the number of touch inputs that designate positions on the display screen. The controller includes: a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by the display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on the display screen along with the movement of positions of the plurality of touch inputs, and applies the generated image to the display apparatus; a page switching unit that switches, in response to an amount of movement of the plurality of touch inputs exceeding a threshold value during scrolling of the image by the scroll unit, the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs; and a returning unit that detects, from an output of the display apparatus, decrease in the number of the plurality of touch inputs before page switching by the page switching unit during scrolling of the image by the scroll unit, and returns the scrolling of the image by the scroll unit to a state before scrolling.
  • As described above, according to the present invention, the user can intuitively understand that page-by-page switching of screen images can be done using multi-touch, and any trouble in display can be prevented even when an erroneous operation is made on such an occasion. Further, by the simple operation of multi-touch sliding, the displayed image can be switched by one page, and it is unnecessary to stop the multi-touch sliding operation.
  • The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing a schematic configuration of an electronic blackboard in accordance with an embodiment of the present invention.
  • FIG. 2 shows an example of a method of detecting a touch-input.
  • FIG. 3 shows an example of a displayed screen image.
  • FIG. 4 is a flowchart representing a control structure of a program realizing erasure of an erroneous drawing in the electronic blackboard apparatus in accordance with a first embodiment of the present invention.
  • FIG. 5 shows coordinate data structure stored when a multi-touch operation is carried out in accordance with the first embodiment of the present invention.
  • FIG. 6 shows a scroll operation realized by the multi-touch operation in accordance with the first embodiment of the present invention.
  • FIG. 7 shows a scroll of a screen image in accordance with the first embodiment of the present invention.
  • FIG. 8 is a flowchart representing a control structure of a program realizing page switching by a multi-touch, in the electronic blackboard apparatus in accordance with a second embodiment of the present invention.
  • FIG. 9 shows a multi-touch sliding operation in accordance with the second embodiment of the present invention.
  • FIG. 10 shows an operation of sliding by a distance D1 with multi-touch and then moving the fingers away, in accordance with the second embodiment of the present invention.
  • FIG. 11 shows an operation of sliding by a distance D2 longer than D1 with multi-touch and then moving the fingers away, in accordance with the second embodiment of the present invention.
  • FIG. 12 shows a screen image after a page switch, displayed when an operation of sliding by a distance D2 and then moving the fingers away is done, in accordance with the second embodiment of the present invention.
  • FIG. 13 shows a screen image displayed when an operation of sliding by a distance D3 with multi-touch is done, in accordance with the second embodiment of the present invention.
  • FIG. 14 shows a screen image displayed when the sliding operation is continued even after the screen image is switched by one page from the screen image shown in FIG. 13.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In the following description and in the drawings, the same components are denoted by the same reference characters. Their names and functions are also the same. Therefore, detailed description thereof will not be repeated.
  • In the following, the term “touch” means that a position is made detectable by an input position detecting device, and “touch” may include touching and pressing the detecting device, just touching and not pressing the detecting device, and coming very close to but not touching the detecting device. As will be described later, a contact type as well as a non-contact type device may be used as the input position detecting device. Where a non-contact type detection device is used, “touch” means coming very close to the detecting device, that is, to a distance that allows detection of the input position.
  • First Embodiment
  • [Configuration]
  • Referring to FIG. 1, an electronic blackboard apparatus 100 in accordance with the first embodiment of the present invention includes: a central processing unit (hereinafter denoted as CPU) 102; a read only memory (hereinafter denoted as ROM) 104; a random access memory (hereinafter denoted as RAM) 106; a storage unit 108; an interface unit (hereinafter denoted as IF unit) 110; a touch detecting unit 112; a display unit 114; a display control unit 116; a video memory (hereinafter denoted as VRAM) 118; and a bus 120. CPU 102 is for overall control of electronic blackboard apparatus 100.
  • ROM 104 is a non-volatile storage storing programs and data necessary for controlling operations of electronic blackboard apparatus 100. RAM 106 is a volatile storage. Storage unit 108 is a non-volatile storage retaining data even when power supply is shut off, and is implemented, for example, by a hard disk drive, a flash memory or the like. Storage unit 108 may be configured as a detachable unit. CPU 102 reads a program from ROM 104 to RAM 106 through bus 120 and executes the program using a part of RAM 106 as a work area. CPU 102 controls various units and components of electronic blackboard apparatus 100 in accordance with a program or programs stored in ROM 104.
  • To bus 120, CPU 102, ROM 104, RAM 106, storage unit 108, IF unit 110, touch detecting unit 112, display control unit 116 and VRAM 118 are connected. Data (including control information) is exchanged among these units through bus 120.
  • IF unit 110 is for establishing connection with the outside such as a network, and transmits/receives image data to and from a computer or the like connected to the network. Image data received from the outside through IF unit 110 is recorded in storage unit 108.
  • Display unit 114 is a display panel (such as a liquid crystal panel) for displaying images. Display control unit 116 is provided with a driving unit for driving display unit 114. Display control unit 116 reads image data stored in VRAM 118 at prescribed timing, generates signals for displaying as an image on display unit 114, and outputs the generated signals to display unit 114. The image data to be displayed is read by CPU 102 from storage unit 108 and transmitted to VRAM 118.
  • Touch detecting unit 112 is, for example, a touch-panel, and detects a touch operation by a user. If a plurality of positions are touched by the user, touch detecting unit 112 successively outputs the coordinates of each of the positions. As a result, by monitoring the outputs of touch detecting unit 112, it is possible to successively know the number of touched positions and their coordinates. Detection of a touch operation when a touch-panel is used as touch detecting unit 112 will be described with reference to FIG. 2.
  • FIG. 2 shows an infrared scanning type touch-panel (touch detecting unit 112). The touch-panel has arrays of light emitting diodes (hereinafter denoted as LED arrays) 200 and 202 arranged in a line on adjacent two sides of a rectangular writing surface, respectively, and two arrays of photodiodes (hereinafter referred to as PD arrays) 210 and 212 arranged in a line opposite to LED arrays 200 and 202, respectively. Infrared rays are emitted from each LED of LED arrays 200 and 202, and the infrared rays are detected by each PD of opposite PD arrays 210 and 212. In FIG. 2, infrared rays output from LEDs of LED arrays 200 and 202 are represented by arrows.
  • The touch-panel includes, for example, a micro computer (a device including a CPU, a memory and an input/output circuit), and controls emission of each LED. Each PD outputs a voltage corresponding to the intensity of received light. The output voltage from the PD is amplified by an amplifier. Since signals are output simultaneously from the plurality of PDs of PD arrays 210 and 212, the output signals are once saved in a buffer and then output as serial signals in accordance with the order of arrangement of PDs, and transmitted to the micro computer. The order of serial signals output from PD array 210 represents the X coordinate. The order of serial signals output from PD array 212 represents the Y coordinate.
  • When a user touches a point on the touch-panel with a touch pen 220, the infrared rays are intercepted by the tip of touch pen 220. Therefore, the output voltage of each PD that had been receiving an infrared ray before the interception drops. Since the signal portions from the PDs that correspond to the touched position (XY coordinates) decrease, the micro computer detects the portions where the signal levels of the two received serial signals decreased, and thereby finds the position coordinates of the touched position. The micro computer transmits the determined position coordinates to CPU 102. The process for detecting the touched position is repeated periodically at a prescribed detection interval; therefore, if one point is kept touched for a time period longer than the detection interval, the same coordinate data is output repeatedly. The touched position can be detected in a similar manner when the user touches touch detecting unit 112 with his/her finger without using touch pen 220.
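The position detection described above can be sketched as follows: each serial signal is scanned for the PD whose output level dropped, and the indices of the intercepted beams in the two arrays give the X and Y coordinates. The drop threshold and the list representation of the serial signals are assumptions for illustration.

```python
def detect_touch(x_signal, y_signal, threshold=0.5):
    """Return (x, y) indices of the intercepted beams, or None if no touch.

    x_signal / y_signal: PD output levels in array order (the array facing
    LED array 200 gives the X coordinate, the one facing 202 the Y coordinate).
    """
    def blocked_index(levels):
        # Index of the first PD whose received-light level dropped below
        # the threshold, i.e. whose infrared beam was intercepted.
        drops = [i for i, v in enumerate(levels) if v < threshold]
        return drops[0] if drops else None

    x = blocked_index(x_signal)
    y = blocked_index(y_signal)
    if x is None or y is None:
        return None  # no beam intercepted on one of the axes
    return (x, y)
```

Calling this once per detection interval reproduces the behavior noted above: a point held longer than one interval yields the same coordinate data repeatedly.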
  • The technique for detecting the touched position described above is well known and, therefore, further description will not be given here. A touch panel other than the infrared scanning type panel (such as a capacitive type, surface acoustic wave type or resistive type touch panel) may be used as touch detecting unit 112. When a capacitive touch panel is used, a position can be detected even when a finger or the like is not actually touching (non-contact), if it is close enough to the sensor.
  • On display unit 114, a screen image such as shown in FIG. 3 is displayed. The display screen image of display unit 114 is divided into a drawing area 230 and a function button area 240. Drawing area 230 is for the user to draw an image by touching operations. Specifically, XY coordinates of the touched position and the track of its movement are transmitted from touch detecting unit 112 to CPU 102 as described above. In accordance with the received coordinate data, CPU 102 writes a prescribed value in a corresponding memory address on VRAM 118. While pixel values of image data on VRAM 118 may be changed, here, it is assumed that VRAM 118 is provided with an area (hereinafter also referred to as an overlay area) for storing drawing data, separate from the area for storing image data. Assuming that data of a memory address of the overlay area where drawing is not done is “0”, CPU 102 writes “1” to a memory address that corresponds to the drawn position. Display control unit 116 superimposes the drawing data (the data in the overlay area) on the image data and displays the result on display unit 114. Specifically, on a point where the drawing data exists (a pixel having “1” recorded in the overlay area), the drawing data (preset color) is displayed, and on a point where the drawing data does not exist (a pixel having “0” recorded in the overlay area), the image data is displayed.
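The overlay compositing described above can be sketched as follows: where the overlay holds “1”, the preset drawing color is shown; where it holds “0”, the underlying image pixel is shown. The drawing color and function name are illustrative assumptions.

```python
DRAW_COLOR = (255, 0, 0)  # preset pen color (an assumption for illustration)

def composite(image, overlay):
    """Superimpose drawing data on image data, pixel by pixel.

    image:   2-D list of RGB tuples (the image data in VRAM)
    overlay: 2-D list of 0/1 flags of the same shape (the overlay area)
    """
    return [
        [DRAW_COLOR if flag else pixel
         for pixel, flag in zip(img_row, ov_row)]
        for img_row, ov_row in zip(image, overlay)
    ]
```

Keeping the drawing in a separate overlay, rather than altering the image data itself, is what later allows a whole stroke to be erased without disturbing the underlying page image.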
  • On function button area 240, function buttons 242 each having a specific function allocated thereto are displayed. At a lower portion of drawing area 230, a page operation area 250 is displayed. On this area, a NEXT button 252, a PREVIOUS button 254 and a page number indication box 256 are displayed. When touched, NEXT button 252 feeds the displayed page (image data) to the right and shows the next page. When touched, PREVIOUS button 254 feeds the displayed page to the left and shows the previous page. Page number indication box 256 indicates the page number of the currently displayed page among the plurality of pages to be displayed. The position of page operation area 250 is fixed and does not move even during scrolling. By way of example, the data for displaying page operation area 250 on display unit 114 may be stored in an overlay area separate from the overlay area for drawing.
  • When the user touches NEXT button 252, coordinate data of the touched position is transmitted from touch detecting unit 112 to CPU 102. CPU 102 determines that the received coordinate data represents a position in the area where NEXT button 252 is displayed. As a result, CPU 102 transmits image data of the next page to VRAM 118, and transmits a page feed command to display control unit 116. In response, display control unit 116 generates a signal for displaying an image in the course of page feed from the image data corresponding to the currently displayed page and the image data corresponding to the next page on VRAM 118, and outputs it to display unit 114. Consequently, an image in the course of page feed appears on display unit 114. It is noted that only a frame of the currently displayed image may be moved on the display screen image of display unit 114 in the course of page feed.
  • Functions allocated to function buttons 242 may include: a function of drawing by a touch operation (allowing selection of types of pens for drawing); a function of opening a file (image data) saved in storage unit 108; an erasure function of deleting a drawing in a prescribed area; a function of saving displayed image data in storage unit 108; and a function of printing displayed image data. Each function button 242 is displayed as an icon.
  • In the following, a method of operating electronic blackboard apparatus 100 will be described, particularly a control structure of a program realizing the process for not leaving any erroneous drawing when a function other than drawing is allocated to a special touch operation. In the following description, it is assumed that the function button for drawing has been pressed in the screen image shown in FIG. 3 and electronic blackboard apparatus 100 is in a mode in which drawing starts when drawing area 230 of touch detecting unit 112 is touched. Further, it is assumed that the special touch operation is a flick operation to the right/left with multi-touch at two or more points. Scroll to the right is allocated to a right-flick operation with multi-touch, and scroll to the left is allocated to a left-flick operation with multi-touch. It is further assumed that no process other than drawing is allocated to any other multi-touch operation.
  • At step 300, CPU 102 determines whether or not touch detecting unit 112 is touched. As described above, CPU 102 determines whether or not coordinate data is received from touch detecting unit 112. Position coordinates (X coordinate, Y coordinate) of the touched point are output in the order of touching, from touch detecting unit 112. If it is determined that there is a touch, the control proceeds to step 302. Otherwise, the control proceeds to step 306.
  • At step 302, CPU 102 stores coordinate data that have been received for a prescribed time period (coordinate data of touched points) in RAM 106 in a manner that represents the order of reception. At step 306, CPU 102 determines whether or not the program is to be ended. When an end button as one of the function buttons 242 is pressed, CPU 102 ends the program. Otherwise, the control returns to step 300, and CPU 102 waits for a touch.
  • At step 304, CPU 102 determines whether or not the touch detected at step 302 is a multi-touch (whether a plurality of points are touched simultaneously). Specifically, CPU 102 calculates, among the plurality of coordinate data stored at step 302, the distance between the first received coordinate data and the coordinate data received next. If the distance is a prescribed value or larger, CPU 102 determines that it is a multi-touch, and if the distance is smaller than the prescribed value, determines that it is a single touch.
  • The touch detection interval (period) is short and the touching of each point involves a slight time difference; therefore, even in the case of multi-touch, coordinate data are output in order of touching from touch detecting unit 112. Therefore, by calculating the distance between the first received coordinate data and some of the coordinate data received thereafter, it is possible to determine whether the touch was a single touch or a multi-touch. The number of points touched simultaneously in the multi-touch operation is not limited to two and may be three or more. Therefore, distances are calculated for a plurality of continuous points. The distance used as a reference for determining a multi-touch may be set appropriately. What is necessary is that the reference distance is larger than the normal distance the user moves a touched point within the detection interval. If it is determined to be a multi-touch, the control proceeds to step 320. If it is determined not to be a multi-touch, the control proceeds to step 310.
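The single/multi-touch determination of step 304 can be sketched as follows. The distance threshold is an illustrative assumption; per the text, it need only exceed the distance a single touched point normally moves within one detection interval.

```python
import math

MULTI_TOUCH_DISTANCE = 40  # pixels; illustrative reference distance

def is_multi_touch(samples):
    """samples: coordinate data received in order of touching,
    e.g. [(x0, y0), (x1, y1), ...] as output by the touch panel."""
    if len(samples) < 2:
        return False
    first = samples[0]
    # Compare the first sample against a few subsequent ones, since three
    # or more points may be touched simultaneously: a large jump between
    # successive samples indicates a second simultaneous touch, whereas a
    # single moving touch only advances a small distance per interval.
    return any(math.dist(first, s) >= MULTI_TOUCH_DISTANCE
               for s in samples[1:3])
```

This exploits the fact noted above that even simultaneous touches are serialized by the panel, so no explicit multi-point report from the hardware is needed.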
  • At step 310, CPU 102 draws an image in accordance with the track of movement of the touched point, on the image displayed on display unit 114. Specifically, CPU 102 reads the plurality of coordinate data stored at step 302 from RAM 106, and writes prescribed data at the corresponding points in the overlay area of VRAM 118 and at the points on the lines connecting these points in order. Consequently, the image data and the drawing data (the data in the overlay area) are combined and displayed on display unit 114, as if the drawing were done on the image. Since the process up to step 312 is repeated as described later, the overlay area is overwritten while the existing drawing data are left as they are.
  • At step 312, CPU 102 temporarily stores the contents of drawing in RAM 106. What is required is that the drawing can be reproduced and, hence, the method of storage may appropriately be selected. By way of example, the coordinate data stored in RAM 106 at step 302 with the order maintained may be retained as they are. Alternatively, the data in the overlay area of VRAM 118 may be stored as two-dimensional image data in RAM 106.
  • At step 314, CPU 102 determines whether or not the touch is maintained. Specifically, CPU 102 determines whether or not coordinate data are continuously received from touch detecting unit 112. For instance, if touch pen 220 or the user's finger moves away, reception of coordinate data from touch detecting unit 112 stops. If it is determined that the touch is maintained, control returns to step 302, and the process following step 302 is repeated. If it is not determined that the touch is maintained, the control returns to step 300 and CPU 102 again waits for a next touch.
  • At step 320, CPU 102 re-stores the coordinate data that have been stored in RAM 106 at step 302 as coordinate data corresponding to each multi-touch position, that is, for each track. By the process of step 304, for each of the plurality of points touched simultaneously, the coordinate data of the first touched point (hereinafter also referred to as the “start point”) have been determined. Therefore, it is possible to determine, based on the distance from each start point, which of the plurality of coordinate data other than the start points stored in RAM 106 belongs to the track of which start point. In accordance with the result of determination, the coordinate data of each start point and the coordinate data of points on the track of movement of that start point are stored as sequence data, in the order of detection. Assume, for example, that two points were touched simultaneously. At step 302, sequence data 400 shown on the left side of FIG. 5 are stored in RAM 106. Of these, (x0, y0) and (x1, y1) are assumed to be the coordinates of the two points that were touched first. The point (x0, y0) was touched slightly earlier than the point (x1, y1). Here, two data sequences headed by coordinate data 410 of the first start point and coordinate data 420 of the second start point, respectively, such as shown on the right side of FIG. 5, are stored in RAM 106. Coordinate data 410 of the start point and the following data sequence 412 represent one track. Coordinate data 420 of the start point and the following data sequence 422 represent another track. The manner of storage is similar if the number of multi-touch points is three or more.
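The re-storing of the interleaved stream into per-track sequences (FIG. 5) might be sketched as below. The names are hypothetical; the patent assigns each coordinate to a track by its distance from the start point, while this sketch appends each sample to the track whose most recent point is nearest, a common refinement that behaves the same for the short tracks considered here.

```python
import math

def split_tracks(start_points, later_samples):
    """Turn the interleaved sequence data of step 302 into one data
    sequence per track, each headed by its start point (cf. sequences
    410/412 and 420/422 in FIG. 5).
    """
    tracks = [[start] for start in start_points]
    for point in later_samples:
        # Append the sample to the track whose latest point is closest.
        nearest = min(tracks, key=lambda track: math.dist(track[-1], point))
        nearest.append(point)
    return tracks
```

With start points (x0, y0) = (0, 0) and (x1, y1) = (10, 9), the later samples are separated into two ordered sequences, and the same scheme extends unchanged to three or more simultaneous touches.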
  • At step 322, CPU 102 draws an image on the image displayed on display unit 114, in accordance with the tracks of the touched points, as at step 310. CPU 102 draws using the coordinate data re-stored in RAM 106 as shown, for example, in FIG. 5. All tracks of the simultaneously touched points may be drawn, or the track or tracks of only one or some of the points (for example, the track of the earliest-touched point) may be drawn. FIG. 6 shows an example in which drawings 500 were made and thereafter a user 510 flicked to the right while multi-touching with two fingers. By this operation, erroneous drawing 502 is made on the image. In FIG. 6, only the drawing corresponding to one of the two tracks is displayed. If step 312 has already been executed and drawing with a single touch has been made, the contents drawn by the multi-touch are added to the drawn contents.
  • At step 324, CPU 102 temporarily stores the contents drawn at step 322 in RAM 106. The specific method is the same as that of step 312. If the coordinate data re-stored at step 320 are to be retained as they are, then unlike at step 312, only the coordinate data representing the track drawn at step 322 may be retained.
  • At step 326, CPU 102 determines whether the multi-touch is maintained. Specifically, CPU 102 determines whether or not a plurality of coordinate data are continuously received from touch detecting unit 112 and whether these coordinate data correspond to the tracks of the plurality of start points determined at step 320. By way of example, assume that the user first touched with two fingers and then moved one finger away. Then, the track that was being made by the removed finger is lost, and the coordinate data received from touch detecting unit 112 come to represent only one track. Considering a situation in which three or more points are touched at first, it is determined that the multi-touch is maintained as long as at least two points continue to be touched simultaneously. If it is determined that the multi-touch is maintained, the control proceeds to step 328. If not, the control proceeds to step 314, and CPU 102 determines whether or not a single touch is maintained, as described above.
  • At step 328, CPU 102 determines whether or not the detected multi-touch operation is an operation allocated to scrolling (an operation designating a scroll). Specifically, for each sequence of coordinate data corresponding to the tracks of the plurality of touch points stored at step 320, CPU 102 determines a vector from the coordinate data of the start point to the coordinate data of the last point, and determines whether the vector is of a prescribed length or longer and whether it is in the positive direction along the X axis (whether the X component is positive). If the vector is of the prescribed length or longer and in the positive direction along the X axis, the detected multi-touch operation is determined to be an operation allocated to a scroll to the right. If the vector is of the prescribed length or longer and in the negative direction along the X axis (the X component is negative), the detected multi-touch operation is determined to be an operation allocated to a scroll to the left. If it is determined to be an operation allocated to a scroll, the control proceeds to step 330. Otherwise, the control returns to step 320, and the process following step 320 is repeated until the vector reaches the prescribed length.
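The test performed at step 328, namely a start-to-last-point vector of at least a prescribed length with its X component deciding left or right, could look like the sketch below. The function name, the return values, and the requirement that all tracks agree on one direction are illustrative assumptions.

```python
import math

def detect_scroll(tracks, prescribed_length):
    """Return 'right', 'left', or None for a set of per-track coordinate
    sequences (each a list of (x, y) points headed by its start point).
    None means the vectors are still too short: keep repeating from step 320.
    """
    directions = set()
    for track in tracks:
        (x0, y0), (xn, yn) = track[0], track[-1]
        if math.hypot(xn - x0, yn - y0) < prescribed_length:
            return None  # vector not yet of the prescribed length
        directions.add('right' if xn - x0 > 0 else 'left')
    # All touched points must move the same way for a scroll gesture.
    return directions.pop() if len(directions) == 1 else None
```

A variant that also handles upward/downward or diagonal scrolls, as suggested later in the text, would examine the Y component of each vector in the same way.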
  • At step 330, CPU 102 deletes the coordinate data representing the tracks of touched points stored in RAM 106 at step 324. Specifically, CPU 102 deletes the drawing data written after the determination of multi-touch (for example, by writing data “0”), while maintaining in the overlay area the drawing data written before the determination of multi-touch (for example, drawing data drawn with a single touch). Therefore, on the screen image of display unit 114, the erroneous line drawn by the multi-touch operation is erased. FIG. 7 shows a displayed screen image on display unit 114 during a scroll. An arrow 520 represents a scroll to the right. In FIG. 7, drawing 500 formed by the user and displayed in FIG. 6 is maintained, while erroneous drawing 502 is erased. Since FIG. 7 shows a state in which the scroll to the right has already progressed to some extent, part of the drawing at the upper right is not shown.
  • At step 332, CPU 102 executes the right or left scroll, in accordance with the result of determination at step 328. Before executing the scroll, CPU 102 saves the contents of drawing (with the erroneous drawing erased) that have been temporarily stored in RAM 106 at steps 312 and 324 in storage unit 108, and the control returns to step 300. Since the contents drawn in the overlay area are saved in storage unit 108, the image with drawing 500 intended by the user can again be displayed if a left scroll is designated later.
  • By the above-described process, in electronic blackboard apparatus 100 in accordance with the first embodiment, when a left or right flick operation is done with multi-touch by a user, the erroneous line drawn by the multi-touch before the start of scrolling is not left on the screen; only the drawing intended by the user is saved, and then the scroll is executed.
  • Though an example in which the image read from storage unit 108 is displayed as a background has been described above, it is not limiting. The background may be a uniform color image (for example, white, black or gray), provided that a function of drawing in response to a touch operation on touch detecting unit 112 is realized.
  • Though an example in which the special touch operation represents a left/right scroll operation, that is, an example in which the scroll function is allocated to the special touch operation, has been described above, it is not limiting. The function to be allocated may be any operation other than drawing and it may be an upward/downward scroll, a scroll to a diagonal direction, or a page switch without scroll. Further, the function to be allocated may be a function (operation) other than drawing, allocated to a function button. If an operation allocated to a scroll in a direction other than the left/right direction is to be detected, the scrolling direction may be determined considering not only the X component but also the Y component of the vector.
  • Further, it is possible for a plurality of users to touch electronic blackboard apparatus 100, each with a touch pen or a finger, and to draw a plurality of images simultaneously. In order to distinguish such drawing operations from the operation allocated to the scroll described above (the left/right flick with multi-touch), what is necessary is to determine whether the distance between simultaneously touched points is about the distance between the fingers of one hand (for example, at most a few centimeters).
  • As to the operation to which a function other than drawing is allocated, it is not limited to a multi-touch operation, and it may be a single-touch operation provided that the operation can be distinguished from the normally conducted touch operation for drawing. By way of example, if a track from the start point to the end point of a single-touch operation (a figure drawn continuously without lifting one's finger or pen) is a prescribed shape, the allocated operation other than drawing may be executed.
  • Though an example in which CPU 102 determines whether or not an operation is a multi-touch operation has been described above, it is not limiting. By way of example, a micro computer in touch detecting unit 112 may determine whether or not an operation is a multi-touch operation, and may transmit the result to CPU 102.
  • Though an electronic blackboard apparatus has been described above, it is not limiting, and the present invention is generally applicable to display apparatuses that allow drawing and operation of screen images by touching, including tablet type terminal devices.
  • Second Embodiment
  • [Configuration]
  • An electronic blackboard apparatus in accordance with the second embodiment of the present invention has the same configuration as that shown in FIG. 1 representing the electronic blackboard apparatus in accordance with the first embodiment. The method of detecting a touch input and the displayed screen images of the electronic blackboard apparatus in accordance with the second embodiment are also the same as those described with reference to FIGS. 2 and 3 regarding the electronic blackboard apparatus in accordance with the first embodiment. Therefore, in the following, the electronic blackboard apparatus in accordance with the second embodiment will be described as “electronic blackboard apparatus 100”, referring to FIGS. 1 to 3 where appropriate.
  • A program that, when executed by CPU 102 shown in FIG. 1, realizes the page switch function of electronic blackboard apparatus 100 in accordance with the second embodiment using the hardware shown in FIG. 1 has the following control structure.
  • Referring to FIG. 8, the program includes steps 600, 602 and 604. At step 600, CPU 102 monitors an output of touch detecting unit 112, and in response to an output from touch detecting unit 112 indicating a touch by the user, the control flow proceeds to the next step. At step 602, CPU 102 again detects a new position touched by a finger based on an output from touch detecting unit 112, and the control proceeds to the next step. At step 604, based on the results from steps 600 and 602, CPU 102 determines whether or not the touch by the user is a multi-touch made with two or more fingers. Touch detecting unit 112 has a function of outputting as many coordinate data as there are touches. Therefore, the determination described above can be made based on the outputs from touch detecting unit 112.
  • The program further includes a step 606. At step 606, CPU 102 stores, if the determination at step 604 is positive, the touch positions detected at steps 600 and 602 in RAM 106 (see FIG. 1). Since a multi-touch is detected, here, coordinate values corresponding in number to the number of detected touches are stored. In the subsequent process, the coordinate data of each finger are detected for every detected touch position, and the series of data are stored as a sequence in RAM 106.
  • The program further includes steps 608, 610 and 630. At step 608, following step 606, CPU 102 detects the present positions of fingers from the outputs of touch detecting unit 112. At step 610, based on the result of detection at step 608, CPU 102 determines whether any one of the multi-touch detection outputs is lost (whether or not any finger has been moved away from touch detecting unit 112). If the determination at step 610 is positive, at step 630, CPU 102 compares the X coordinate of the last detected finger position with the X coordinate of the first finger position stored at step 606, and determines whether or not an absolute value of difference between the two coordinates is larger than a threshold value DTH1. In the present embodiment, direction of page movement is along the lateral direction of drawing area 230 (FIG. 3) and, therefore, X coordinates of finger positions are compared as described above.
  • If the determination at step 630 is positive, at step 632, the page displayed in drawing area 230 is changed in accordance with the direction of finger movement. Specifically, if the direction of finger movement is to the right in FIG. 3, the page following the current page is displayed in drawing area 230, and if it is to the left, the page preceding the current page is displayed in drawing area 230 (consistent with steps 616 and 624 and the operation described later with reference to FIGS. 11 and 12). If the determination at step 630 is negative, at step 634, CPU 102 cancels scrolling of the image displayed in drawing area 230. In the present embodiment, if the fingers are slid to the left/right in the state of multi-touch, the image scrolls to the left/right correspondingly. Therefore, if the length of sliding is equal to or smaller than the threshold value DTH1, the screen image display must be returned to the original state. At step 634, the display in drawing area 230 is returned from the scrolled image to the display of the original page. After the process of step 632 or 634, the control flow returns to step 600, to wait for detection of the next touched position.
  • If the determination at step 610 is negative, that is, if it is determined that all positions of multi-touch are maintained, at step 612, CPU 102 determines whether or not finger positions detected at step 608 are all in the left direction when viewed from the finger positions recorded at step 606. If the determination is positive, at step 614, CPU 102 calculates the amount of movement (absolute value of difference in X coordinate values), and determines whether or not the value is larger than a threshold value DTH2. In the present embodiment, the threshold value DTH2 here is larger than the threshold value DTH1 at step 630. Further, in the present embodiment, the amount of movement is an average value of all amounts of movement of multi-touch finger positions.
  • If the determination at step 614 is positive, at step 616, CPU 102 sets the screen image to be displayed in drawing area 230 to the image of the preceding page. Specifically, if the current image is of the second page, the image of the first page is displayed in drawing area 230 by the process of step 616. Thereafter, the control flow returns to step 600. On the other hand, if the determination at step 614 is negative, at step 618, CPU 102 scrolls the screen image displayed in drawing area 230 to the left by the same length as the amount of movement of finger positions calculated at step 614. At this time, if there is a previous page, the left end of that page is displayed at the right side of drawing area 230. Then, the control flow proceeds to step 608.
  • If the determination at step 612 is negative, at step 620, CPU 102 determines whether or not the finger positions detected at step 608 are in the right direction when viewed from the finger positions recorded at step 606. If the determination is positive, at step 622, CPU 102 calculates the amount of movement (absolute value of difference in X coordinate values), and determines whether or not the value is larger than a threshold value DTH3. The threshold value here is the same as the threshold value DTH2 at step 614.
  • If the determination at step 622 is positive, at step 624, CPU 102 sets the screen image to be displayed in drawing area 230 to the image of the next page. Specifically, if the current image is of the second page, the image of the third page is displayed in drawing area 230. Thereafter, the control flow returns to step 600. On the other hand, if the determination at step 622 is negative, at step 626, CPU 102 scrolls the screen image displayed in drawing area 230 to the right by the same length as the amount of movement of finger positions calculated at step 622. At this time, if there is a next page, the right end of that page is displayed at the left side of drawing area 230. Then, the control flow proceeds to step 608.
  • If the determination at step 620 is negative, that is, if the X coordinates of the finger positions are unchanged, the control proceeds to step 628. At step 628, CPU 102 executes a predetermined process in accordance with the values of the X and Y coordinates of the finger positions or their history. The process at step 628 may include, for example, updating the display when a specific button or a specific plurality of buttons is pressed with multi-touch, when a so-called pinch-out is done with the space between the fingers made wider, or when a so-called pinch-in is done with the space between the fingers made narrower. Since the contents of processing at step 628 are not related to the present invention, a detailed description thereof will not be given here. After step 628, the control flow returns to step 608.
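The left/right branch of FIG. 8 — cancel on a short release (step 634), page change on release beyond DTH1 (step 632), automatic page feed beyond DTH2 (steps 616/624), and scrolling otherwise (steps 618/626) — can be condensed into a sketch like the following. The function and its return values are hypothetical, and the page direction (next vs. previous) is omitted for brevity.

```python
def lateral_slide_action(start_x, current_x, released, dth1, dth2):
    """Decide the response to a lateral multi-touch slide.

    start_x / current_x -- averaged X coordinates recorded at steps 606 and 608
    released            -- True once a finger has left the screen (step 610)
    dth1, dth2          -- thresholds of steps 630 and 614/622, with dth1 < dth2
    """
    amount = abs(current_x - start_x)
    if released:
        # Steps 630/632/634: on release, a short slide cancels the scroll,
        # a longer one changes the page.
        return 'page' if amount > dth1 else 'cancel'
    if amount > dth2:
        # Steps 616/624: the page is fed automatically while still sliding.
        return 'page'
    # Steps 618/626: otherwise scroll by the amount of movement.
    return ('scroll', current_x - start_x)
```

Called once per detection cycle, this reproduces the behavior walked through in FIGS. 9 to 14: a release after a slide shorter than DTH1 restores the original page, a release after a longer slide turns the page, and a continuous slide past DTH2 turns the page without releasing.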
  • The control structure of the routine realizing the process for multi-touch operation is as described above.
  • If it is determined at step 604 that only one finger position is detected, that is, the operation is not a multi-touch operation, the process following step 636 is executed. At step 636, CPU 102 determines whether or not the finger has moved away from drawing area 230 of touch detecting unit 112.
  • If the determination at step 636 is positive, at step 638, based on the finger position at the start of touching detected at step 600 and the finger position immediately before detection of the finger moved away at step 604, CPU 102 executes a process in accordance with these positions. By way of example, the process when function button 242, NEXT button 252 or PREVIOUS button 254 shown in FIG. 3 is pressed corresponds to the process that is executed here. Thereafter, the control flow returns to step 600, and the output of touch detecting unit 112 is monitored until the next touch is detected.
  • If the determination at step 636 is negative, it means that only one finger position is continuously being detected. Here, at step 640, CPU 102 determines whether or not the detected finger position is in drawing area 230. If the determination at step 640 is positive, at step 642, CPU 102 draws an image in accordance with drawing settings at that time on the finger position. Thereafter, the control flow returns to step 602. If the determination at step 640 is negative, drawing is unnecessary and, therefore, the control flow directly returns to step 602.
  • [Operation]
  • Electronic blackboard apparatus 100 operates in the following manner. In the following, the operation of electronic blackboard apparatus 100 related mainly to page feed will be described. Operations of portions not specifically related to the present invention will not be described here.
  • Referring to FIG. 9, assume that the third page of a document is displayed on the screen of display unit 114. Here, assume that a user 680 touches the screen surface of display unit 114 with two fingers, and slides the two fingers to the right as represented by an arrow 682, with the fingers kept in touch with the surface. The amount of sliding here is a distance D1, which is smaller than the threshold value DTH1.
  • Referring to FIG. 8, the control flow of the program at this time will be described. At step 600, the first finger position is detected and, thereafter, the next finger position is detected at step 602. Here, the operation is a multi-touch operation and, therefore, the determination at step 604 is YES. The first touched positions of the two fingers are stored in RAM 106 at step 606. At step 608, the current finger positions are again detected. Assuming that the user continues sliding as shown in FIG. 9, the determination at step 610 is negative. Therefore, at step 612, whether or not the two finger positions are moving to the left is determined. Here, the two finger positions are both moving to the right. Therefore, the result of determination is negative. As a result, the control proceeds to step 620. At step 620, whether or not the finger positions are moving to the right is determined. In the state shown in FIG. 9, the result of determination is positive. Therefore, next, at step 622, whether or not the amount of movement of the fingers is larger than the threshold value DTH2 is determined. At the beginning of the repetition, the amount of movement is smaller than the threshold value DTH2. Therefore, the result of determination is negative. As a result, the screen image is displayed scrolled to the right by the amount equal to the amount of movement of the fingers (the average of the amounts of movement of the two finger positions) at step 626. Thereafter, the control returns to step 608 and the next repetition starts.
  • When the user continues sliding, the process described above is repeated time and again. As a result, if the distance D1 of finger movement is equal to or smaller than the threshold value DTH1, the display on display unit 114 is as shown in FIG. 10. Referring to FIG. 10, assume that user 680 touches the display surface with his/her fingers and slides the two fingers by the distance D1 (≦DTH1) with the fingers kept in touch with the display surface. The screen image is scrolled to the right by the same distance as D1. As a result, on the left end of the screen image, a right end portion 696 of the image of the next page is displayed.
  • Assume that after sliding his/her fingers by the distance D1 (≦DTH1) to position 692, the user moves his/her fingers away from the display surface as shown by an arrow 690 and stops sliding. Here, the result of determination at step 610 of FIG. 8 becomes positive, and the control proceeds to step 630. Since D1≦DTH1, the determination at step 630 is negative, and scroll is canceled at step 634. As a result, the screen image returns to the state shown in FIG. 9.
  • On the other hand, referring to FIG. 11, assume that after sliding his/her fingers by the distance D2 (DTH1<D2≦DTH2) to position 716, user 680 moves his/her fingers away from the display surface as shown by an arrow 712 and stops sliding. Immediately before the fingers are moved away, a right end portion 718 of the image of next page is displayed at the left end of the screen image, as in the example of FIG. 10. The width of right end portion 718 is substantially the same as distance D2. When the fingers are moved away, here, the result of determination at step 610 becomes positive and the control proceeds to step 630. As the result of determination at step 630 is positive, at step 632, a process for feeding the screen image by one page to the right is executed. Specifically, as shown by an arrow 720 in FIG. 11, the image of the next page is scrolled to the right by one page, and the next page is displayed on display unit 114 as shown in FIG. 12. In the example shown in FIGS. 11 and 12, the screen image before movement is the third page, and the image after movement is the fourth page.
  • Referring to FIG. 8, thereafter, the control proceeds to step 600 and CPU 102 executes the process of monitoring the output of touch detecting unit 112 until the user touches the screen image next time.
  • Here, assume that, different from the example of FIG. 10, even after sliding the fingers by the distance D2, the user does not stop sliding but further continues sliding to the right. Here, the result of determination at step 610 is negative, and the control proceeds to steps 612, 620 and 622. Referring to FIG. 13, while the slid distance D3 satisfies the relation D3≦DTH2, the determination at step 622 is negative. Therefore, at step 626, the screen image is displayed scrolled to the right by the distance D3. In the example shown in FIG. 13, a right end portion 744 of the next screen image is displayed at the left end of the screen image. The width of right end portion 744 is substantially the same as distance D3. Thereafter, the control returns to step 608, and the next repetition starts.
  • Assume that, in this state, the user 680 further continues sliding to the right and the distance D3 becomes larger than the threshold value DTH2. Here, the result of determination at step 622 becomes positive, and at step 624, the screen image is fed to the right by one page. Specifically, as shown in FIG. 14, the screen image of the fourth page is displayed on display unit 114. Thereafter, the control returns from step 624 to step 600, and the next position of touching by the user is detected. If the user 680 continues sliding of his/her fingers to the right as shown by an arrow 760, this sliding of the user is detected as a new touch at step 600. Therefore, through the process of steps 600, 602 and 604, the operation described above is repeated, using the position touched by the user 680 at the time point of page switching as a head position.
  • The description above relates to sliding by the user to the right direction. It is apparent from the description that similar operation takes place when the sliding is to the left. Therefore, detailed description thereof will not be repeated.
  • When an upward/downward sliding, a pinch-out or a pinch-in takes place with multi-touch, the control flows through steps 610, 612, 620 and 628 of FIG. 8 while the sliding is being done, and the process in accordance with the user operation is executed. If any operation is done with a single touch, the control proceeds from step 604 to step 636. Until the finger is moved away, the control proceeds from step 640 to step 642 if the operation is in drawing area 230, and by the process at step 642, an image is drawn at the finger position. If the operation is outside drawing area 230, nothing happens in the present embodiment.
  • If the finger is moved away in the single-touch operation, the control proceeds to steps 604, 636 and 638. At step 638, based on the first detected finger position and the finger position immediately before the finger is moved away, a process in accordance with these positions is executed. If it is the case that the user touched any of function buttons 242 and moved away the finger, a predetermined process corresponding to the touched function button 242 is executed at step 638.
  • As described above, in electronic blackboard apparatus 100 in accordance with the second embodiment, if the user slides his/her fingers in the lateral direction on the screen image with multi-touch, the operation is as follows. During sliding, the screen image is scrolled in accordance with the amount of sliding. When the fingers are removed after sliding by the distance D1 and the distance D1 is equal to or smaller than the threshold value DTH1, the scroll is cancelled and the screen image before sliding is restored. If the distance D1 is larger than the threshold value DTH1, the screen image is moved by one page. In the second embodiment described above, the next page is displayed when the fingers are slid to the right, and the previous page is displayed when they are slid to the left.
  • If the user slides his/her fingers by a distance longer than the threshold value DTH1 without removing them, and the slid distance D1 is equal to or smaller than DTH2, page switch does not take place, and the screen image is scrolled in accordance with the amount of sliding. If the distance D1 exceeds the threshold value DTH2, page switch takes place automatically. If sliding is further continued, the same operation as when sliding is newly started is executed.
  • According to the second embodiment, if sliding is done with multi-touch, the screen image is scrolled. Therefore, it is possible for the user to intuitively understand that the screen image can be scrolled by a multi-touch operation. If the touching fingers are removed in the course of multi-touch sliding and the amount of sliding is small, the scroll is canceled and the original screen image is restored. Even if a multi-touch sliding operation is done erroneously during writing on the screen image, the screen image returns to the normal state immediately when the fingers are moved away. Therefore, even when an erroneous operation is done, the influence on the writing operation is small. On the other hand, if the fingers are moved away after sliding over a certain distance, switching to the next page (page feed) occurs. Therefore, page feed can be realized by the intuitive operation of multi-touch sliding. If sliding is further continued with the fingers kept in touch with the screen, page feed is executed automatically while the sliding continues, and the sliding operation can be continued further. Therefore, advantageously, the operation of continuous page feed is made simple.
  • In the second embodiment described above, it is determined at step 610 of FIG. 8 that sliding ended if only one of multi-touching fingers is moved away from the input screen image. The present invention, however, is not limited to such an embodiment. It may be determined that sliding ended only when all fingers are moved away. In other words, if it is a multi-touch operation when sliding starts, sliding may be continued thereafter even when the multi-touching is lost, and similar effects as in the embodiment above can be attained.
  • Further, in the second embodiment described above, the determination as to whether an operation is a multi-touch operation or not is made at step 604 of FIG. 8, and if it is determined not to be a multi-touch operation, a process such as drawing, different from the page switching or scroll process, is executed. The present invention, however, is not limited to such an embodiment. By way of example, the process of page switching or scroll may be executed only when the operation is a multi-touch with three or more fingers; if the screen is touched by one or two fingers, a process such as drawing, other than page switching or scrolling, may be executed even though a two-finger touch is a multi-touch. Generally speaking, the process of page switching or scroll may be executed only when the operation is a multi-touch with N fingers, and a process such as drawing, other than page switching or scrolling, may be executed when the touch is with N−1 or fewer fingers, N being an integer larger than 1.
  • Further, the method of calculating the distance D1 is not limited to the use, as in the second embodiment, of the average of the amounts of movement of the plurality of fingers. By way of example, the amount of movement may be calculated from the position of only the first-detected finger. Further, the threshold values DTH1 and DTH2 mentioned above may be changed in accordance with the speed at which the finger positions move during sliding.
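The two ways of computing D1 mentioned here, averaging all fingers versus tracking only the first-detected finger, might look as follows for one-dimensional (lateral) positions; the function names are hypothetical:

```python
def d1_average(starts, currents):
    """D1 as the average displacement of all touching fingers
    (the method used in the second embodiment). `starts` and
    `currents` are the per-finger start and current coordinates."""
    moves = [abs(c - s) for s, c in zip(starts, currents)]
    return sum(moves) / len(moves)

def d1_first_finger(starts, currents):
    """D1 from the first-detected finger only (the alternative
    mentioned in the description)."""
    return abs(currents[0] - starts[0])
```

Two fingers that started at 0 and 10 and moved to 30 and 50 give an averaged D1 of 35, but a first-finger D1 of 30, so the chosen method shifts when the thresholds are crossed.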
  • In the second embodiment described above, sliding and page turning are assumed to be along the lateral direction (X-axis direction) of the screen image. The present invention is clearly not limited to such an embodiment. For instance, the direction of sliding and page turning may be the vertical direction (Y-axis direction), or sliding may take both the X- and Y-axis directions into consideration; in other words, the present invention is applicable to an embodiment that allows sliding in a diagonal direction. Here, the first and second threshold values may be determined separately for the X-axis and Y-axis directions, or set to the same value regardless of direction. Further, the length of the slide need not be decomposed into X-axis and Y-axis components; whether a page is to be turned may instead be determined from the length of the slide track (the distance between its start point and end point).
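The per-axis decision and the track-length decision contrasted above can be sketched for 2-D touch points; the thresholds and function names are illustrative assumptions:

```python
import math

def page_turn_by_axis(start, end, thr_x, thr_y):
    """Per-axis decision: the slide is decomposed into X and Y
    components, each compared against its own threshold (the
    thresholds may also simply be set equal)."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    return abs(dx) >= thr_x or abs(dy) >= thr_y

def page_turn_by_track(start, end, thr):
    """Track-length decision: the straight-line distance between the
    start point and end point of the slide, without decomposing it
    into axis components, so a diagonal slide counts at full length."""
    return math.hypot(end[0] - start[0], end[1] - start[1]) >= thr
```

A diagonal slide of 30 pixels in X and 40 in Y has a track length of 50, so it can turn a page under the track-length rule even when neither axis component alone reaches a 50-pixel per-axis threshold.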
  • Though an example of drawing and sliding with a finger or fingers has been described in the second embodiment above, the present invention is not limited to such an embodiment. For example, a dedicated pen may be used. As to the method of detecting a touched position on the touch-panel, any method may be used provided that touches at a plurality of positions can be detected.
  • The embodiments described here are mere examples and should not be interpreted as restrictive. The scope of the present invention is determined by each of the claims, with appropriate consideration of the written description of the embodiments, and embraces modifications within the meaning of, and equivalent to, the language of the claims.

Claims (7)

1. A touch drawing display apparatus, comprising:
a display unit that displays an image;
a detecting unit that is arranged on said display unit and detects a touched position;
a drawing unit that draws, on an image displayed on said display unit, a line corresponding to a track formed by movement of said detected position; and
an erasing unit that erases, when an operation other than drawing is specified by said track, a line drawn from a start point to an end point of said track specifying said operation.
2. The touch drawing display apparatus according to claim 1, wherein said track for specifying said operation other than drawing is a track formed by a multi-touch operation of simultaneously touching a plurality of positions of said detecting unit.
3. A method of operating a touch drawing display apparatus that includes a display unit that displays an image and a detecting unit that is arranged on said display unit and detects a touched position, comprising the steps of:
drawing, on an image displayed on said display unit, a line corresponding to a track formed by movement of said detected position;
determining whether or not an operation other than drawing is specified by said track while said drawing step is being executed; and
erasing, if an operation other than drawing is specified by said track, a line drawn from a start point to an end point of said track specifying said operation.
4. An image display apparatus allowing touch-input, comprising:
a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on said display screen;
a scroll unit that scrolls, when a plurality of touch inputs are detected by said touch detecting unit and their positions on the display screen move in one same direction, an image displayed on said display screen along with movement of positions of said plurality of touch inputs; and
a first page switching unit that detects, after the plurality of touch inputs are detected by said touch detecting unit, decrease in the number of touch inputs from an output of said touch detecting unit and, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of returning the image on said display screen to a state before scrolling and a process of switching the image on said display screen by one page in accordance with the direction of movement of said plurality of touch inputs.
5. A controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen and detects and outputs, to the outside, positions and the number of touch inputs that designate positions on said display screen, comprising:
a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by said display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on said display screen along with the movement of positions of said plurality of touch inputs, and applies the generated image to said display apparatus; and
a page switching unit that detects, after the plurality of touch inputs are detected by said display apparatus, decrease in the number of touch inputs from an output of said display apparatus, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of generating page image data for returning the image on said display screen to a state before scrolling and a process of generating page image data for switching the image on said display screen by one page in accordance with the direction of movement of said plurality of touch inputs, and transmits the generated page image data to said display apparatus.
6. An image display apparatus allowing touch input, comprising:
a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and number of touch inputs that designate positions on said display screen;
a scroll unit that scrolls, when a plurality of touch inputs are detected by said touch detecting unit and their positions on the display screen move in one same direction, an image displayed on said display screen along with movement of positions of said plurality of touch inputs;
a page switching unit that switches, in response to an amount of movement of said plurality of touch inputs exceeding a threshold value during scrolling of the image by said scroll unit, the image on said display screen by one page in accordance with the direction of movement of said plurality of touch inputs; and
a returning unit that detects, from an output of said touch detecting unit, decrease of the number of said plurality of touch inputs before page switching by said page switching unit during scrolling of the image by said scroll unit, and returns scrolling of the image by said scroll unit to a state before scrolling.
7. A controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen and detects and outputs, to the outside, positions and number of touch inputs that designate a position on said display screen, comprising:
a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by said display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on said display screen along with the movement of positions of said plurality of touch inputs, and applies the generated image to said display apparatus;
a page switching unit that switches, in response to an amount of movement of said plurality of touch inputs exceeding a threshold value during scrolling of the image by said scroll unit, the image on said display screen by one page in accordance with the direction of movement of said plurality of touch inputs; and
a returning unit that detects, from an output of said display apparatus, decrease of the number of said plurality of touch inputs before page switching by said page switching unit during scrolling of the image by said scroll unit, and returns scrolling of the image by said scroll unit to a state before scrolling.
US13/370,049 2011-02-10 2012-02-09 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus Abandoned US20120218203A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/823,681 US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2011027236A JP5536690B2 (en) 2011-02-10 2011-02-10 Touch drawing display device and operation method thereof
JP2011027237A JP5537458B2 (en) 2011-02-10 2011-02-10 Image display device capable of touch input, control device for display device, and computer program
JP2011-027237(P) 2011-02-10
JP2011-027236(P) 2011-02-10

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/823,681 Division US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Publications (1)

Publication Number Publication Date
US20120218203A1 true US20120218203A1 (en) 2012-08-30

Family

ID=46718650

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/370,049 Abandoned US20120218203A1 (en) 2011-02-10 2012-02-09 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus
US14/823,681 Active 2032-03-26 US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/823,681 Active 2032-03-26 US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Country Status (2)

Country Link
US (2) US20120218203A1 (en)
CN (2) CN104636049B (en)

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130127758A1 (en) * 2011-11-23 2013-05-23 Samsung Electronics Co., Ltd. Touch input apparatus and method in user terminal
US20130342486A1 (en) * 2012-06-22 2013-12-26 Smart Technologies Ulc Automatic annotation de-emphasis
US20140059626A1 (en) * 2012-08-17 2014-02-27 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US20140062914A1 (en) * 2012-09-03 2014-03-06 Acer Incorporated Electronic apparatus and control method using the same
US20140096071A1 (en) * 2012-10-03 2014-04-03 Konica Minolta, Inc. Display system, display device, and image forming device
WO2014082522A1 (en) * 2012-11-30 2014-06-05 小米科技有限责任公司 Interface identifier selection method, device and mobile terminal
US20140160076A1 (en) * 2012-12-10 2014-06-12 Seiko Epson Corporation Display device, and method of controlling display device
US20140181730A1 (en) * 2012-12-21 2014-06-26 Orange Fragmented scrolling of a page
WO2014150725A1 (en) * 2013-03-15 2014-09-25 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US20140304625A1 (en) * 2013-04-03 2014-10-09 Alibaba Group Holding Limited Page returning
US20150026619A1 (en) * 2013-07-17 2015-01-22 Korea Advanced Institute Of Science And Technology User Interface Method and Apparatus Using Successive Touches
US9076085B2 (en) * 2012-02-15 2015-07-07 Canon Kabushiki Kaisha Image processing apparatus, image processing apparatus control method, and storage medium
US9124739B2 (en) 2013-03-25 2015-09-01 Konica Minolta, Inc. Image forming apparatus, page image displaying device, and display processing method
US20150268827A1 (en) * 2014-03-24 2015-09-24 Hideep Inc. Method for controlling moving direction of display object and a terminal thereof
US20150363026A1 (en) * 2014-06-16 2015-12-17 Touchplus Information Corp. Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof
CN105487687A (en) * 2015-11-23 2016-04-13 广州视睿电子科技有限公司 Handwriting display method and apparatus
US20160260410A1 (en) * 2015-03-03 2016-09-08 Seiko Epson Corporation Display apparatus and display control method
US20160313883A1 (en) * 2013-09-09 2016-10-27 Huawei Technologies Co., Ltd. Screen Capture Method, Apparatus, and Terminal Device
US20170046024A1 (en) * 2015-08-10 2017-02-16 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US20180136812A1 (en) * 2012-07-16 2018-05-17 Samsung Electronics Co., Ltd. Touch and non-contact gesture based screen switching method and terminal
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
CN108958594A (en) * 2018-05-23 2018-12-07 郑州云海信息技术有限公司 A kind of method for page jump and equipment
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
CN110471640A (en) * 2018-10-26 2019-11-19 珠海中电数码科技有限公司 A kind of multi-screen interaction method and its system
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10891044B1 (en) * 2016-10-25 2021-01-12 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US11119564B2 (en) * 2012-05-23 2021-09-14 Kabushiki Kaisha Square Enix Information processing apparatus, method for information processing, and game apparatus for performing different operations based on a movement of inputs
US20210311622A1 (en) * 2020-04-02 2021-10-07 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for obtaining content
US11169684B2 (en) * 2017-02-15 2021-11-09 Canon Kabushiki Kaisha Display control apparatuses, control methods therefor, and computer readable storage medium
US11209921B2 (en) * 2015-09-30 2021-12-28 Ricoh Company, Ltd. Electronic blackboard, storage medium, and information display method
US11368760B2 (en) 2012-08-17 2022-06-21 Flextronics Ap, Llc Applications generating statistics for user behavior

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014079043A1 (en) * 2012-11-23 2014-05-30 华为技术有限公司 Method and device for realizing remote browsing
JP5761216B2 (en) * 2013-01-22 2015-08-12 カシオ計算機株式会社 Information processing apparatus, information processing method, and program
JP5984722B2 (en) * 2013-03-22 2016-09-06 シャープ株式会社 Information processing device
CN104423854A (en) * 2013-08-23 2015-03-18 鸿合科技有限公司 Method and device for processing information on touch screen
CN104571890B (en) * 2013-10-10 2017-12-19 京微雅格(北京)科技有限公司 A kind of touch-control slides display system, electronic equipment and display methods
JP5977768B2 (en) * 2014-01-14 2016-08-24 シャープ株式会社 Image display apparatus and operation method thereof
EP3245954A4 (en) * 2015-01-16 2018-10-03 Olympus Corporation Ultrasonic observation system
CN106168864A (en) 2015-05-18 2016-11-30 佳能株式会社 Display control unit and display control method
JP6919174B2 (en) * 2016-10-26 2021-08-18 セイコーエプソン株式会社 Touch panel device and touch panel control program
CN108205407B (en) * 2016-12-20 2021-07-06 夏普株式会社 Display device, display method, and storage medium
JP6995605B2 (en) * 2017-12-18 2022-01-14 キヤノン株式会社 Electronic devices, control methods for electronic devices, programs and storage media
JP7064173B2 (en) * 2018-05-11 2022-05-10 富士フイルムビジネスイノベーション株式会社 Information processing equipment and programs
JP2020042625A (en) * 2018-09-12 2020-03-19 株式会社東海理化電機製作所 Tactile sense presentation device and tactile sense presentation method
CN112346581A (en) * 2019-08-07 2021-02-09 南京中兴新软件有限责任公司 Method and device for drawing movement track and computer readable storage medium
JP7174926B2 (en) * 2019-12-17 2022-11-18 パナソニックIpマネジメント株式会社 Display control system, moving object, display control method, display device, display method and program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5111316A (en) * 1990-08-09 1992-05-05 Western Publishing Company Liquid crystal writing state

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8908612D0 (en) * 1989-04-17 1989-06-01 Quantel Ltd Video graphics system
JPH06175776A (en) 1992-11-27 1994-06-24 Wacom Co Ltd Presentation device
US5761340A (en) 1993-04-28 1998-06-02 Casio Computer Co., Ltd. Data editing method and system for a pen type input device
JP3388451B2 (en) 1993-05-21 2003-03-24 カシオ計算機株式会社 Handwriting input device
JPH0876926A (en) 1994-09-02 1996-03-22 Brother Ind Ltd Picture display device
JPH09231004A (en) 1996-02-23 1997-09-05 Yazaki Corp Information processor
JPH10124239A (en) 1996-10-22 1998-05-15 Sharp Corp Tablet input device
JPH11102274A (en) 1997-09-25 1999-04-13 Nec Corp Scroll device
JP2000222130A (en) 1999-02-02 2000-08-11 Toshiba Corp Input device and method and storage medium
JP2001117686A (en) 1999-10-20 2001-04-27 Toshiba Corp Pen-inputting device and pointing processing method for the device
JP4803883B2 (en) 2000-01-31 2011-10-26 キヤノン株式会社 Position information processing apparatus and method and program thereof.
US7138983B2 (en) 2000-01-31 2006-11-21 Canon Kabushiki Kaisha Method and apparatus for detecting and interpreting path of designated position
GB0117543D0 (en) * 2001-07-18 2001-09-12 Hewlett Packard Co Document viewing device
US6690365B2 (en) * 2001-08-29 2004-02-10 Microsoft Corporation Automatic scrolling
KR100486711B1 (en) 2002-08-12 2005-05-03 삼성전기주식회사 Apparatus and method for turning pages personal information terminal
JP4157337B2 (en) 2002-08-12 2008-10-01 株式会社リコー Display device with touch panel and control method of display device with touch panel
JP4215549B2 (en) * 2003-04-02 2009-01-28 富士通株式会社 Information processing device that operates in touch panel mode and pointing device mode
TWI248576B (en) * 2004-07-05 2006-02-01 Elan Microelectronics Corp Method for controlling rolling of scroll bar on a touch panel
WO2006066456A1 (en) * 2004-12-23 2006-06-29 Dong Li A inductive-switch user interface device and portable terminal thereof
JP2007316732A (en) 2006-05-23 2007-12-06 Sharp Corp Item selection device, information processor and computer program for item selection
CN101226440A (en) * 2007-01-17 2008-07-23 汉王科技股份有限公司 Touch control induction key-press hand-written painting plate and implementing method
US8334847B2 (en) * 2007-10-19 2012-12-18 Qnx Software Systems Limited System having user interface using object selection and gestures
JP5239328B2 (en) 2007-12-21 2013-07-17 ソニー株式会社 Information processing apparatus and touch motion recognition method
JP5098961B2 (en) 2008-11-05 2012-12-12 日本電気株式会社 Image display apparatus, method, and program
CN101739190B (en) * 2008-11-10 2012-09-05 汉王科技股份有限公司 Hand writing display device with capacitance type touch control keys
JP5232034B2 (en) 2009-02-06 2013-07-10 アルプス電気株式会社 Input processing device
US8681106B2 (en) * 2009-06-07 2014-03-25 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
FR2950169B1 (en) * 2009-09-11 2012-03-23 Milibris MOBILE TERMINAL WITH TOUCH SCREEN
US8749499B2 (en) * 2010-06-08 2014-06-10 Sap Ag Touch screen for bridging multi and/or single touch points to applications
CN104102422B (en) * 2013-04-03 2018-05-01 阿里巴巴集团控股有限公司 The page returns to the method and device of operation
KR102210045B1 (en) * 2013-12-12 2021-02-01 삼성전자 주식회사 Apparatus and method for contrlling an input of electronic device having a touch device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5111316A (en) * 1990-08-09 1992-05-05 Western Publishing Company Liquid crystal writing state

Cited By (190)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10338736B1 (en) 2011-08-05 2019-07-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10345961B1 (en) 2011-08-05 2019-07-09 P4tents1, LLC Devices and methods for navigating between user interfaces
US10365758B1 (en) 2011-08-05 2019-07-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10386960B1 (en) 2011-08-05 2019-08-20 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10540039B1 (en) 2011-08-05 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface
US10649571B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10656752B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10664097B1 (en) 2011-08-05 2020-05-26 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US20130127758A1 (en) * 2011-11-23 2013-05-23 Samsung Electronics Co., Ltd. Touch input apparatus and method in user terminal
US9158397B2 (en) * 2011-11-23 2015-10-13 Samsung Electronics Co., Ltd Touch input apparatus and method in user terminal
US9076085B2 (en) * 2012-02-15 2015-07-07 Canon Kabushiki Kaisha Image processing apparatus, image processing apparatus control method, and storage medium
US10942570B2 (en) 2012-05-09 2021-03-09 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US11947724B2 (en) 2012-05-09 2024-04-02 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10884591B2 (en) 2012-05-09 2021-01-05 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects
US10782871B2 (en) 2012-05-09 2020-09-22 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10775999B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10775994B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10969945B2 (en) 2012-05-09 2021-04-06 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10996788B2 (en) 2012-05-09 2021-05-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US11010027B2 (en) 2012-05-09 2021-05-18 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10592041B2 (en) 2012-05-09 2020-03-17 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US11023116B2 (en) 2012-05-09 2021-06-01 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US11068153B2 (en) 2012-05-09 2021-07-20 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US11221675B2 (en) 2012-05-09 2022-01-11 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US11314407B2 (en) 2012-05-09 2022-04-26 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US11354033B2 (en) 2012-05-09 2022-06-07 Apple Inc. Device, method, and graphical user interface for managing icons in a user interface region
US10908808B2 (en) 2012-05-09 2021-02-02 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10191627B2 (en) 2012-05-09 2019-01-29 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10168826B2 (en) 2012-05-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10114546B2 (en) 2012-05-09 2018-10-30 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9823839B2 (en) 2012-05-09 2017-11-21 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9971499B2 (en) 2012-05-09 2018-05-15 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US11119564B2 (en) * 2012-05-23 2021-09-14 Kabushiki Kaisha Square Enix Information processing apparatus, method for information processing, and game apparatus for performing different operations based on a movement of inputs
US20130342486A1 (en) * 2012-06-22 2013-12-26 Smart Technologies Ulc Automatic annotation de-emphasis
US9323367B2 (en) * 2012-06-22 2016-04-26 Smart Technologies Ulc Automatic annotation de-emphasis
US20180136812A1 (en) * 2012-07-16 2018-05-17 Samsung Electronics Co., Ltd. Touch and non-contact gesture based screen switching method and terminal
US9414108B2 (en) 2012-08-17 2016-08-09 Flextronics Ap, Llc Electronic program guide and preview window
US9215393B2 (en) 2012-08-17 2015-12-15 Flextronics Ap, Llc On-demand creation of reports
US20140059626A1 (en) * 2012-08-17 2014-02-27 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US11782512B2 (en) 2012-08-17 2023-10-10 Multimedia Technologies Pte, Ltd Systems and methods for providing video on demand in an intelligent television
US9185323B2 (en) 2012-08-17 2015-11-10 Flextronics Ap, Llc Systems and methods for providing social media with an intelligent television
US9172896B2 (en) 2012-08-17 2015-10-27 Flextronics Ap, Llc Content-sensitive and context-sensitive user interface for an intelligent television
US9432742B2 (en) 2012-08-17 2016-08-30 Flextronics Ap, Llc Intelligent channel changing
US11474615B2 (en) 2012-08-17 2022-10-18 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US11368760B2 (en) 2012-08-17 2022-06-21 Flextronics Ap, Llc Applications generating statistics for user behavior
US9237291B2 (en) 2012-08-17 2016-01-12 Flextronics Ap, Llc Method and system for locating programming on a television
US9264775B2 (en) 2012-08-17 2016-02-16 Flextronics Ap, Llc Systems and methods for managing data in an intelligent television
US9686582B2 (en) 2012-08-17 2017-06-20 Flextronics Ap, Llc Systems and methods for managing data in an intelligent television
US9055254B2 (en) 2012-08-17 2015-06-09 Flextronics Ap, Llc On screen method and system for changing television channels
US9426515B2 (en) 2012-08-17 2016-08-23 Flextronics Ap, Llc Systems and methods for providing social media with an intelligent television
US9185325B2 (en) 2012-08-17 2015-11-10 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US11150736B2 (en) 2012-08-17 2021-10-19 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9426527B2 (en) 2012-08-17 2016-08-23 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US9271039B2 (en) 2012-08-17 2016-02-23 Flextronics Ap, Llc Live television application setup behavior
US9191708B2 (en) 2012-08-17 2015-11-17 Jamdeo Technologies Ltd. Content-sensitive user interface for an intelligent television
US11119579B2 (en) 2012-08-17 2021-09-14 Flextronics Ap, Llc On screen header bar for providing program information
US9055255B2 (en) 2012-08-17 2015-06-09 Flextronics Ap, Llc Live television application on top of live feed
US9066040B2 (en) 2012-08-17 2015-06-23 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US9167186B2 (en) 2012-08-17 2015-10-20 Flextronics Ap, Llc Systems and methods for managing data in an intelligent television
US9380334B2 (en) 2012-08-17 2016-06-28 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9167187B2 (en) 2012-08-17 2015-10-20 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US10506294B2 (en) 2012-08-17 2019-12-10 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9021517B2 (en) * 2012-08-17 2015-04-28 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US9118967B2 (en) 2012-08-17 2015-08-25 Jamdeo Technologies Ltd. Channel changer for intelligent television
US9118864B2 (en) 2012-08-17 2015-08-25 Flextronics Ap, Llc Interactive channel navigation and switching
US9191604B2 (en) 2012-08-17 2015-11-17 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9374546B2 (en) 2012-08-17 2016-06-21 Flextronics Ap, Llc Location-based context for UI components
US9369654B2 (en) 2012-08-17 2016-06-14 Flextronics Ap, Llc EPG data interface
US9106866B2 (en) 2012-08-17 2015-08-11 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9363457B2 (en) 2012-08-17 2016-06-07 Flextronics Ap, Llc Systems and methods for providing social media with an intelligent television
US9247174B2 (en) 2012-08-17 2016-01-26 Flextronics Ap, Llc Panel user interface for an intelligent television
US9232168B2 (en) 2012-08-17 2016-01-05 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9077928B2 (en) 2012-08-17 2015-07-07 Flextronics Ap, Llc Data reporting of usage statistics
US9185324B2 (en) 2012-08-17 2015-11-10 Flextronics Ap, Llc Sourcing EPG data
US9301003B2 (en) 2012-08-17 2016-03-29 Jamdeo Technologies Ltd. Content-sensitive user interface for an intelligent television
US10051314B2 (en) 2012-08-17 2018-08-14 Jamdeo Technologies Ltd. Method and system for changing programming on a television
US9052773B2 (en) * 2012-09-03 2015-06-09 Acer Incorporated Electronic apparatus and control method using the same
US20140062914A1 (en) * 2012-09-03 2014-03-06 Acer Incorporated Electronic apparatus and control method using the same
US20140096071A1 (en) * 2012-10-03 2014-04-03 Konica Minolta, Inc. Display system, display device, and image forming device
WO2014082522A1 (en) * 2012-11-30 2014-06-05 Xiaomi Inc. Interface identifier selection method, device and mobile terminal
US9904414B2 (en) * 2012-12-10 2018-02-27 Seiko Epson Corporation Display device, and method of controlling display device
US20140160076A1 (en) * 2012-12-10 2014-06-12 Seiko Epson Corporation Display device, and method of controlling display device
US9880726B2 (en) * 2012-12-21 2018-01-30 Orange Fragmented scrolling of a page
US20140181730A1 (en) * 2012-12-21 2014-06-26 Orange Fragmented scrolling of a page
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US9857897B2 (en) 2012-12-29 2018-01-02 Apple Inc. Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10915243B2 (en) 2012-12-29 2021-02-09 Apple Inc. Device, method, and graphical user interface for adjusting content selection
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9996233B2 (en) 2012-12-29 2018-06-12 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10175879B2 (en) 2012-12-29 2019-01-08 Apple Inc. Device, method, and graphical user interface for zooming a user interface while performing a drag operation
US10101887B2 (en) 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10185491B2 (en) 2012-12-29 2019-01-22 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or enlarge content
US9965074B2 (en) 2012-12-29 2018-05-08 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
WO2014150725A1 (en) * 2013-03-15 2014-09-25 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US9124739B2 (en) 2013-03-25 2015-09-01 Konica Minolta, Inc. Image forming apparatus, page image displaying device, and display processing method
WO2014165534A1 (en) * 2013-04-03 2014-10-09 Alibaba Group Holding Limited Page returning
US20140304625A1 (en) * 2013-04-03 2014-10-09 Alibaba Group Holding Limited Page returning
US20150026619A1 (en) * 2013-07-17 2015-01-22 Korea Advanced Institute Of Science And Technology User Interface Method and Apparatus Using Successive Touches
US9612736B2 (en) * 2013-07-17 2017-04-04 Korea Advanced Institute Of Science And Technology User interface method and apparatus using successive touches
US9983770B2 (en) * 2013-09-09 2018-05-29 Huawei Technologies Co., Ltd. Screen capture method, apparatus, and terminal device
US20160313883A1 (en) * 2013-09-09 2016-10-27 Huawei Technologies Co., Ltd. Screen Capture Method, Apparatus, and Terminal Device
US20150268827A1 (en) * 2014-03-24 2015-09-24 Hideep Inc. Method for controlling moving direction of display object and a terminal thereof
US20150363026A1 (en) * 2014-06-16 2015-12-17 Touchplus Information Corp. Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof
US9898996B2 (en) * 2015-03-03 2018-02-20 Seiko Epson Corporation Display apparatus and display control method
US20160260410A1 (en) * 2015-03-03 2016-09-08 Seiko Epson Corporation Display apparatus and display control method
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10180772B2 (en) 2015-03-08 2019-01-15 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10338772B2 (en) 2015-03-08 2019-07-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10387029B2 (en) 2015-03-08 2019-08-20 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10402073B2 (en) 2015-03-08 2019-09-03 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US11112957B2 (en) 2015-03-08 2021-09-07 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067645B2 (en) 2015-03-08 2018-09-04 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10268342B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10268341B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10860177B2 (en) 2015-03-08 2020-12-08 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9645709B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10613634B2 (en) 2015-03-08 2020-04-07 Apple Inc. Devices and methods for controlling media presentation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US11054990B2 (en) 2015-03-19 2021-07-06 Apple Inc. Touch input cursor manipulation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US10599331B2 (en) 2015-03-19 2020-03-24 Apple Inc. Touch input cursor manipulation
US10222980B2 (en) 2015-03-19 2019-03-05 Apple Inc. Touch input cursor manipulation
US11550471B2 (en) 2015-03-19 2023-01-10 Apple Inc. Touch input cursor manipulation
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US9916080B2 (en) 2015-06-07 2018-03-13 Apple Inc. Devices and methods for navigating between user interfaces
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10705718B2 (en) 2015-06-07 2020-07-07 Apple Inc. Devices and methods for navigating between user interfaces
US9706127B2 (en) 2015-06-07 2017-07-11 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11231831B2 (en) 2015-06-07 2022-01-25 Apple Inc. Devices and methods for content preview based on touch input intensity
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11240424B2 (en) 2015-06-07 2022-02-01 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10455146B2 (en) 2015-06-07 2019-10-22 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10303354B2 (en) 2015-06-07 2019-05-28 Apple Inc. Devices and methods for navigating between user interfaces
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US11835985B2 (en) 2015-06-07 2023-12-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11681429B2 (en) 2015-06-07 2023-06-20 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US10841484B2 (en) 2015-06-07 2020-11-17 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10209884B2 (en) 2015-08-10 2019-02-19 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US9880735B2 (en) * 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN107533368A (en) * 2015-08-10 2018-01-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US20170046024A1 (en) * 2015-08-10 2017-02-16 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10963158B2 (en) 2015-08-10 2021-03-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11740785B2 (en) 2015-08-10 2023-08-29 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11182017B2 (en) 2015-08-10 2021-11-23 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10698598B2 (en) 2015-08-10 2020-06-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10884608B2 (en) 2015-08-10 2021-01-05 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10203868B2 (en) 2015-08-10 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11327648B2 (en) 2015-08-10 2022-05-10 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10754542B2 (en) 2015-08-10 2020-08-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11209921B2 (en) * 2015-09-30 2021-12-28 Ricoh Company, Ltd. Electronic blackboard, storage medium, and information display method
CN105487687A (en) * 2015-11-23 2016-04-13 Guangzhou Shirui Electronics Co., Ltd. Handwriting display method and apparatus
US11531460B2 (en) * 2016-10-25 2022-12-20 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US10891044B1 (en) * 2016-10-25 2021-01-12 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US20210096714A1 (en) * 2016-10-25 2021-04-01 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US11169684B2 (en) * 2017-02-15 2021-11-09 Canon Kabushiki Kaisha Display control apparatuses, control methods therefor, and computer readable storage medium
CN108958594A (en) * 2018-05-23 2018-12-07 Zhengzhou Yunhai Information Technology Co., Ltd. Page jump method and device
CN110471640A (en) * 2018-10-26 2019-11-19 Zhuhai Zhongdian Digital Technology Co., Ltd. Multi-screen interaction method and system
US11474689B2 (en) * 2020-04-02 2022-10-18 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for obtaining content
US20210311622A1 (en) * 2020-04-02 2021-10-07 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for obtaining content

Also Published As

Publication number Publication date
CN102681721A (en) 2012-09-19
US10191648B2 (en) 2019-01-29
CN104636049B (en) 2018-04-27
CN102681721B (en) 2015-04-01
US20150346945A1 (en) 2015-12-03
CN104636049A (en) 2015-05-20

Similar Documents

Publication Publication Date Title
US10191648B2 (en) Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus
JP5537458B2 (en) Image display device capable of touch input, control device for display device, and computer program
US8633906B2 (en) Operation control apparatus, operation control method, and computer program
JP5536690B2 (en) Touch drawing display device and operation method thereof
CN110647248B (en) Image display device and method of operating the same
US20190018585A1 (en) Touch operation method based on interactive electronic white board and system thereof
US20170336932A1 (en) Image display apparatus allowing operation of image screen and operation method thereof
US10719228B2 (en) Image processing apparatus, image processing system, and image processing method
US11150749B2 (en) Control module for stylus with whiteboard-style erasure
US10747425B2 (en) Touch operation input device, touch operation input method and program
JP2009198734A (en) Multi-display control method and control program and multi-display apparatus
JP5905783B2 (en) Image display system
JP2012168621A (en) Touch drawing display device and operation method therefor
JP2009116727A (en) Image input display
JP2013178701A (en) Touch drawing display device employing multiple windows
US20190317617A1 (en) Terminal Device And Recording Medium
JP5782157B2 (en) Image display device capable of touch input, control device for display device, and computer program
JP5801920B2 (en) Touch drawing display device and operation method thereof
KR20150114332A (en) Smart board and the control method thereof
JP2014109922A (en) Electronic blackboard
JP6584876B2 (en) Information processing apparatus, information processing program, and information processing method
JP6068428B2 (en) Image display system control method and control apparatus
JP2016186524A (en) Display system, display device, information processing device, and control method
KR20150114329A (en) Smart board and the control method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHARP KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANKI, NORIYOSHI;REEL/FRAME:027687/0985

Effective date: 20120106

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION