US20130241854A1 - Image sharing system and user terminal for the system
- Publication number
- US20130241854A1 (application US13/785,302)
- Authority
- US
- United States
- Prior art keywords
- image
- touch
- information
- touch means
- display unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/1423—Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/16—Sound input; Sound output
- G06F9/452—Remote windowing, e.g. X-Window System, desktop virtualisation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42209—Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
- H04N21/4222—Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
- H04N21/42224—Touch pad or touch panel provided on the remote control
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
Definitions
- the user terminal may further include a position information generation unit configured to generate the position information of the touch means near the display unit, where the image information transmitting unit may transmit the position information of the touch means together with the image information shown on the display unit.
- the image indicating the position of the touch means may be changed in form when it is shown at a preset position.
- the user terminal may further include an information provider unit configured to provide information associated with the change in the position image.
- a user terminal that includes: a display unit configured to display an image; a sensing unit configured to sense a position of a touch means touching the display unit; and an image information transmitting unit configured to transmit image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means touching the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the touch means are displayed on the receiving terminal.
- the image indicating the position of the touch means may not be shown if at least one of the touch pressure and the touch area is within a particular level range.
- the image indicating the position of the touch means may be changed in form according to at least one of the touch pressure and the touch area of the touch means.
- the image indicating the position of the touch means may be changed in size according to at least one of the touch pressure and the touch area of the touch means.
- the user terminal may further include a setting unit configured to provide an interface, for setting sensing level classes of the sensing unit and setting changes in a position image according to the sensing levels, and configured to store settings information.
- Yet another aspect of the invention provides a method for sharing an image that includes: (a) sensing a position of a touch means touching a display unit; and (b) transmitting image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means touching the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the touch means are displayed on the receiving terminal.
- Certain embodiments of the invention enable the sharing of images between a user terminal and a different type of terminal.
- FIG. 6 illustrates an example of a change in the position image according to touch level, according to an embodiment of the invention.
- FIG. 7 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to an embodiment of the invention.
- FIG. 8 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to another embodiment of the invention.
- FIG. 9 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to still another embodiment of the invention.
- FIG. 10 is a block diagram illustrating the modular composition of a user terminal according to an embodiment of the invention.
- FIG. 11 is a block diagram illustrating the modular composition of a user terminal according to another embodiment of the invention.
- FIG. 12 is a block diagram illustrating the modular composition of a user terminal according to still another embodiment of the invention.
- FIG. 1 schematically illustrates the composition of a system for sharing an image between terminals according to an embodiment of the invention.
- the user terminal 100 may be the terminal which provides an image, and may be a terminal that is capable of running application programs, is capable of communicating with other terminals, and is equipped with a display unit, such as a smart phone, a laptop, or a netbook.
- the user terminal 100 can be a terminal that provides a touch interface.
- the user terminal 100 and the receiving terminal 102 may be equipped with communication modules that can communicate with each other to share images.
- the user terminal 100 and the receiving terminal 102 can communicate by various methods; for example, the user terminal 100 and the receiving terminal 102 can communicate using Wi-Fi.
- the user terminal 100 and the receiving terminal 102 could also communicate using a near-field communication method such as Bluetooth or NFC.
- the user terminal 100 and the receiving terminal 102 could also communicate by a wireless HDMI method.
- the communication can also be performed using an external device that is capable of interworking with the user terminal 100 or the receiving terminal 102 .
- the user terminal 100 and receiving terminal 102 could also communicate via a communication device such as a server.
- the image shown on the display unit of the user terminal 100 may be shared on the display unit of the receiving terminal 102 .
- the user terminal 100 may use a communication module to transmit to the receiving terminal 102 the image information shown on the display unit of the user terminal 100 .
- the communication module of the receiving terminal 102 may use the image information transmitted from the user terminal 100 to display the same image on the display unit of the receiving terminal as the image on the user terminal 100 .
- the image provided by the user terminal 100 can be viewed by the user through the display unit of another device, rather than the display unit of the user terminal 100 .
- the information on the corresponding video clip can be transmitted to the receiving terminal 102 , to allow viewing of the corresponding video clip through the display unit of the receiving terminal 102 .
- when the user wishes to view the video clip on a screen that is larger than the display unit of the user terminal 100, it is possible to utilize a device such as a TV as the receiving terminal 102 and view the video clip through the TV.
- the position information of a touch means that touches or is near the display unit of the user terminal 100 may be shared between the user terminal 100 and the receiving terminal 102 .
- the information regarding which area the touch means is positioned in may be shared between the user terminal 100 and the receiving terminal 102 .
- the receiving terminal 102 may show the image (hereinafter referred to as the “position image”) at a point corresponding to the position of the touch means.
- the touch means can include any type of means that can perform a touch operation to control the terminal, such as a touch pen and a finger.
- the position image can take any of a variety of forms, such as a mouse pointer shaped as an arrow, a circular shaded image, etc.
- the position image will be described below in further detail.
- FIG. 4 illustrates an example of a position image of touch means according to an embodiment of the invention.
- a position image 400 is shown in the form of a circular shading, at a point corresponding to the position of a touch means that is near or is touching the display unit of the user terminal 100 .
- the function of sharing the position information of the touch means enables the user to manipulate the user terminal 100 while looking at the receiving terminal 102 and not the user terminal 100 .
- the user can identify the position of the touch means while looking only at the receiving terminal 102 , and can use the position image to select a desired item from a web document while looking at the receiving terminal 102 .
- FIG. 4 shows the screen of a racing game, where the user can select a particular menu in the game or manipulate a character by using the position image.
- the position image which may indicate the position information of a touch means that is near or in contact with the display unit of the user terminal 100 , can also be shown on the user terminal 100 as well as the receiving terminal 102 .
- the position image that indicates the position information of a touch means near the user terminal's display unit can be useful in minimizing erroneous touch operations.
- the position image When the position image is not positioned over an event-executing object, it may be shown as a circular image 500 a as illustrated in drawing (A) of FIG. 5 , but when the position image is positioned over the “PLAY” button, which is an event-executing object for executing a game, the position image can be changed from the previous circular form to a finger-shaped image 500 b as illustrated in drawing (B) of FIG. 5 .
- the user can recognize that the point that currently is about to be touched or is touched is a point at which a particular control command can be executed by a touch.
- the form of the position image can be changed under various conditions other than when the position image is over an event-executing object, and by way of the changed position image, the user can be provided with additional information.
- the form of the position image can be changed not only according to the position of the position image but also according to the movement of the touch means.
- while the position of the touch means is shown as the position image on the display unit of the user terminal 100 or the receiving terminal 102, a change can be implemented, such as having the position image changed from a shaded image to a finger image.
- the image processing for the change in form of the position image may preferably be implemented at the user terminal 100 .
- the image processing for changing the form of the position image can also be implemented at the receiving terminal 102 as necessary.
- the option of whether or not a position image is to be shown on the user terminal 100 can be selected by the user through a separate setting unit.
- the user terminal 100 can provide an interface regarding whether or not to show a position image on the user terminal, and this interface may allow an on/off setting with respect to showing the position image at the user terminal.
- FIG. 2 is a flowchart illustrating the operations for sharing an image between a user terminal and a receiving terminal according to an embodiment of the invention.
- the user terminal 100 may first send a request for image-sharing to the receiving terminal 102 (step 200 ).
- the receiving terminal 102 on receiving the request for image-sharing from the user terminal 100 , may change to a mode that enables communication with the user terminal 100 (step 202 ) and may transmit information to the user terminal 100 indicating that the mode change for image-sharing is complete (step 204 ).
- the user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 206 ), and the receiving terminal 102 may display an image corresponding to the received image information on its own display unit (step 208 ).
- the user terminal 100 may sense whether or not a touch means is near (step 210 ). If a touch means is near, the position information of the nearby touch means may be transmitted to the receiving terminal 102 (step 212 ). Here, the position information of the touch means that is nearby can be transmitted to the receiving terminal in various ways.
- the image shown on the display unit of the user terminal 100 with the position image incorporated can itself serve as the position information of the touch means. Since the position image is shown on the user terminal 100 , it is possible to provide the position information of the touch means by transmitting the image itself that is shown on the display unit.
- the position image shown on the user terminal 100 can be useful in preventing touch errors beforehand when a blunt touch means, such as a finger, is used.
- the position information of the touch means can include coordinate information and form information of the position image.
- the user terminal 100 can output the coordinate information and form information of the position image and provide them to the receiving terminal 102 .
- the coordinate information of the position image can include the pixel coordinates at which the position image is to be shown in the image displayed on the display unit of the user terminal 100 .
- the coordinate information of the position image can be provided in various ways other than by using pixel coordinates.
- a position image can be synthesized into the image currently shown on the display unit to generate a separate image incorporating the position image, and the synthesized image thus generated can correspond to the position information of the touch means.
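The synthesis approach described above can be sketched as follows. The character-grid "frame" and the `synthesize` helper are illustrative assumptions standing in for real pixel compositing; the point is that the synthesized frame itself carries the position information of the touch means.

```python
# Synthesizing a position image into the frame currently shown, so the
# synthesized frame itself serves as the position information of the touch
# means. The frame is modeled as a small character grid for illustration;
# a real implementation would blend pixel data instead.

def synthesize(frame, position, marker="o"):
    """Return a copy of the frame with a position image drawn at `position`."""
    x, y = position
    rows = [list(row) for row in frame]
    rows[y][x] = marker          # draw the position image over the content
    return ["".join(row) for row in rows]

frame = ["....", "....", "...."]
shared = synthesize(frame, position=(2, 1))
print(shared)        # the receiving terminal displays this frame as-is
print(frame)         # the original frame is left untouched
```

Because the position image is baked into the transmitted frame, the receiving terminal needs no separate coordinate channel in this variant.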
- the receiving terminal may show a position image corresponding to the position of the touch means (step 214 ).
- the receiving terminal 102 can display the corresponding image and thereby show the position image. If the coordinate and form information of the position image is provided separately from the user terminal 100 , the receiving terminal 102 may generate and show the position image at the corresponding coordinate position.
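The FIG. 2 exchange (steps 200 through 214) can be sketched as a simple request/acknowledge protocol. The message names and the in-process "link" between the two terminal objects below are assumptions for illustration, not part of the patent.

```python
# Sketch of the FIG. 2 image-sharing flow. Message names are hypothetical.

class ReceivingTerminal:
    def __init__(self):
        self.mode = "idle"
        self.screen = None          # last image displayed
        self.position_image = None  # last position image shown

    def handle(self, message, payload=None):
        if message == "SHARE_REQUEST":        # step 200
            self.mode = "sharing"             # step 202: mode change
            return "SHARE_READY"              # step 204: change complete
        if message == "IMAGE_INFO":           # step 206
            self.screen = payload             # step 208: display the image
            return "OK"
        if message == "TOUCH_POSITION":       # step 212
            self.position_image = payload     # step 214: show position image
            return "OK"
        return "UNSUPPORTED"

class UserTerminal:
    def __init__(self, receiver):
        self.receiver = receiver

    def share(self, frame, touch_position=None):
        # Request sharing and wait for the mode-change acknowledgement.
        if self.receiver.handle("SHARE_REQUEST") != "SHARE_READY":
            raise RuntimeError("receiving terminal refused image-sharing")
        self.receiver.handle("IMAGE_INFO", frame)
        # Step 210: send position info only when a touch means is sensed nearby.
        if touch_position is not None:
            self.receiver.handle("TOUCH_POSITION", touch_position)

rx = ReceivingTerminal()
UserTerminal(rx).share(frame="racing-game-frame", touch_position=(120, 48))
print(rx.mode, rx.screen, rx.position_image)
```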
- FIG. 3 is a flowchart illustrating the operations for sharing an image between a user terminal and a receiving terminal according to another embodiment of the invention.
- the embodiment in FIG. 3 relates to an example of showing a position image if the touch means directly touches the display unit of the user terminal 100 .
- the position image that is to be displayed may be determined in correspondence to the sensed touch level (step 314 ).
- if at least one of the sensed touch pressure and touch area belongs to a preset first level class, a position image may not be shown even if a touch is sensed, and if at least one of the sensed touch pressure and touch area belongs to a preset second level class, then a position image may be shown that corresponds to the touch point.
- similarly, if the sensed touch level is above a preset threshold, a position image may not be shown even though a touch is sensed, and if it is below the preset threshold, a position image may be shown that corresponds to the touch point.
- the touch level classes for changing the position image can be divided further, and the position image can be changed for each of the touch level classes.
- the size of the position image can be adjusted in proportion to or in inverse proportion to the sensitivity of the touch level.
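The level-class behavior described above can be sketched as a small mapping from a sensed touch level to a position image: one level class shows the image with a size proportional to the touch level, and the higher class suppresses it. The numeric thresholds and sizes below are illustrative assumptions.

```python
# Mapping a normalized touch level to a position image: a light touch in the
# first level class shows the image (with size scaled to the level), while a
# firm touch in the second level class suppresses it. Thresholds are assumed.

LEVEL_1_MAX = 0.4   # upper bound of the first (light-touch) level class
LEVEL_2_MAX = 1.0   # upper bound of the second (firm-touch) level class

def position_image_for(touch_level):
    """Return (show, radius_px) for a normalized touch level in [0, 1]."""
    if touch_level <= LEVEL_1_MAX:
        # Size in proportion to the touch level within the first class.
        radius = 8 + int(20 * (touch_level / LEVEL_1_MAX))
        return True, radius
    return False, 0   # second level class: do not show the position image

print(position_image_for(0.2))   # light touch: position image shown
print(position_image_for(0.8))   # firm touch: position image suppressed
```

An inverse-proportional mapping, also contemplated above, would simply invert the scaling term.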
- FIG. 6 illustrates an example of a change in the position image according to touch level, according to an embodiment of the invention.
- drawing (A) illustrates a screen shown on the receiving terminal when the touch level (touch pressure or touch area) belongs to a first level class
- drawing (B) illustrates a screen shown on the receiving terminal when the touch level belongs to a second level class that is higher than the first level class.
- the same screen can be shown on the user terminal as well according to the user's selection.
- the position image can be shown as in drawing (A).
- the position image may not be shown, as is the case in drawing (B).
- the nearness of a touch means can be sensed using capacitance.
- a capacitive type touch panel may be used, and if the user brings a touch means, such as a finger or a touch pen, near the touch panel, it is possible to sense the change in capacitance and thus sense the position of the nearby touch means.
- an ultrasonic method can be used.
- a touch unit on a touch pen may emit infrared rays and ultrasonic waves, which may be received by a receiver equipped with an infrared sensor and two ultrasonic sensors to sense the movement and position information of the touch pen.
- the three sensors may respectively measure the transmission time of the infrared rays and the transmission time of the ultrasonic waves, convert the transmission times into distances, and then detect the position of the touch pen using the converted distances by a method such as triangulation, etc.
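The distance-then-triangulation step can be sketched in two dimensions. The infrared pulse is treated as marking t=0 (light travel time is negligible), each ultrasonic transit time is converted to a range, and the two range circles are intersected. The sensor positions and planar geometry are assumptions for illustration.

```python
import math

# Locating a touch pen from ultrasonic transit times: convert each transit
# time to a distance, then intersect the two range circles (triangulation).

SPEED_OF_SOUND = 343.0e3  # mm/s at room temperature (approximate)

def locate_pen(t1, t2, sensor_gap):
    """Two ultrasonic sensors at (0, 0) and (sensor_gap, 0); return (x, y)."""
    r1 = SPEED_OF_SOUND * t1          # distance to sensor 1
    r2 = SPEED_OF_SOUND * t2          # distance to sensor 2
    # Intersect the circles |p| = r1 and |p - (gap, 0)| = r2; take y >= 0.
    x = (r1**2 - r2**2 + sensor_gap**2) / (2 * sensor_gap)
    y = math.sqrt(max(r1**2 - x**2, 0.0))
    return x, y

# A pen 30 mm right of and 40 mm in front of sensor 1, sensors 100 mm apart:
t1 = 50.0 / SPEED_OF_SOUND                  # r1 = 50 mm
t2 = math.hypot(70.0, 40.0) / SPEED_OF_SOUND
print(locate_pen(t1, t2, sensor_gap=100.0))
```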
- electromagnetic induction can be used to sense a nearness of a touch pen or a touch operation of a touch pen.
- electromagnetic induction may occur between the touch pen and the touch panel, and it is possible to sense whether or not the touch pen is near by way of the alteration in the electromagnetic field caused by this electromagnetic induction.
- FIG. 7 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to an embodiment of the invention.
- the user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 700 ).
- the receiving terminal 102 may display an image on its display unit (step 702 ).
- the user terminal 100 can transmit or receive image information by way of an HDMI (High-Definition Multimedia Interface) method, for instance.
- Various methods for exchanging multimedia data other than HDMI can also be used.
- the user terminal 100 may show a position image in correspondence to the position of the touch means (step 704 ).
- the user terminal 100 may transmit information to the receiving terminal 102 regarding the current display image in which the position image is shown (step 706 ).
- the position image can be shown on the display unit of the user terminal 100 as an overlay or can be shown on the display unit of the user terminal 100 by image synthesis.
- the user terminal 100 may transmit the display image currently shown after encoding it into a preset format.
- FIG. 8 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to another embodiment of the invention.
- the receiving terminal 102 may display an image on its display unit (step 802 ).
- the user terminal may calculate the position information of the touch means whose touch or nearness is sensed (step 806).
- the position information of the touch means can be calculated by various methods.
- the position information of the touch means can be set as the coordinates of the touch means which represent the relative position of the touch means when the display unit of the user terminal 100 is set as the entire coordinate range.
- the coordinates of the touch means can also be set by using the pixel coordinates of the image currently shown on the user terminal's display unit. For example, when the position of the touch means corresponds to a particular pixel of the currently-shown image, the coordinates of the corresponding pixel can be set as the coordinates of the touch means.
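The two coordinate conventions above can be sketched as follows: relative coordinates taken over the whole display, and pixel coordinates of the currently shown image. The display and image resolutions are illustrative assumptions.

```python
# Converting a sensed touch position into the coordinate forms described in
# the text: relative coordinates with the display unit as the coordinate
# range, and pixel coordinates of the displayed image.

def touch_to_relative(touch_xy, display_size):
    """Relative position in [0, 1] with the display as the coordinate range."""
    return (touch_xy[0] / display_size[0], touch_xy[1] / display_size[1])

def touch_to_pixel(touch_xy, display_size, image_size):
    """Pixel coordinates of the displayed image that the touch falls on."""
    rel_x, rel_y = touch_to_relative(touch_xy, display_size)
    return (int(rel_x * image_size[0]), int(rel_y * image_size[1]))

display = (480, 800)     # assumed user-terminal display resolution
image = (1920, 1080)     # assumed resolution of the currently shown image

print(touch_to_relative((240, 200), display))
print(touch_to_pixel((240, 200), display, image))
```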
- the user terminal may transmit the position information of the touch means to the receiving terminal 102 (step 808 ).
- the position information of the touch means can also be transmitted to the receiving terminal 102 through a separate data channel.
- the user terminal 100 and the receiving terminal 102 may have to establish two channels, i.e. an image channel (a channel for transmitting image data) and a control channel (a channel for transmitting position information), when establishing connection.
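One possible wire format for the control channel is a small, self-describing message carrying the coordinate information and the form information of the position image, separate from the raw frames on the image channel. The field names and the JSON encoding below are assumptions for illustration, not something the patent specifies.

```python
import json

# A hypothetical control-channel message holding the position information of
# the touch means together with the form information of the position image.

def encode_position_message(x, y, form="circle", size=18):
    return json.dumps(
        {"type": "touch_position", "x": x, "y": y, "form": form, "size": size}
    ).encode("utf-8")

def decode_position_message(data):
    msg = json.loads(data.decode("utf-8"))
    assert msg["type"] == "touch_position"
    return (msg["x"], msg["y"]), msg["form"], msg["size"]

# The receiving terminal recovers the coordinates and form information and
# generates the position image itself at the corresponding position (step 810).
wire = encode_position_message(960, 270, form="finger")
print(decode_position_message(wire))
```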
- the receiving terminal 102 may use the position information of the touch means to show a position image on the display unit of the receiving terminal 102 (step 810 ).
- the receiving terminal 102 can receive not only the position information of the position image but also the form information of the position image.
- the position image can change in size or form according to its position, and such form information of the position image can also be provided through the user terminal.
- a position image having a different form from that of a regular position image can be provided, and such changed form information can also be provided together with the position information of the touch means.
- FIG. 9 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to still another embodiment of the invention.
- the user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 900 ).
- the receiving terminal 102 may display an image on its display unit (step 902 ).
- the position of the touch means may be identified, and image information may be generated that incorporates a position image for the identified touch means (step 906 ).
- the image information thus generated may preferably be image information that is encoded in a predetermined format agreed upon with the receiving terminal.
- a user terminal can include a display unit 1000 , an image information generation unit 1002 , an image information transmitting unit 1004 , a sensing unit 1006 , a position image generation unit 1008 , a setting unit 1010 , and a control unit 1012 .
- the display unit 1000 may display an image according to the operation of the user terminal 100 .
- the display unit 1000 may display images using various apparatuses such as LCD or LED, and may provide a touch interface.
- the display unit may display various screens according to the operation of the terminal such as a user interface screen, an application execution screen, etc.
- the position image can be generated in various forms, such as a shaded image, a cursor pointer image, etc.
- if the touch pressure or the touch area is of a lower level, the position image generation unit 1008 may generate and show a position image, and if the touch pressure or the touch area is of a higher level, it may not show the position image.
- the position image generation unit 1008 may generate a position image having a small size, and if the touch pressure or the touch area is of a higher level, the position image generation unit 1008 may generate a position image having a larger size.
- the position image can be an image that shows the touch trajectory, where a thick trajectory can be shown if the touch pressure or touch area is large while a thin trajectory can be shown if the touch pressure or touch area is small.
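The mapping from sensed touch level to position image behavior described in the last few paragraphs can be sketched as follows. The threshold and the size formula are arbitrary example values, not settings taken from the disclosure.

```python
# Hypothetical mapping from a normalized touch level (pressure or area) to
# the position image's appearance: hidden above a threshold, otherwise shown
# with a size that grows with the sensed level.

def position_image_for_level(level: float, threshold: float = 0.7):
    """Return (show, size) for a normalized touch level in [0, 1]."""
    if level >= threshold:
        # higher-level touch: treated as a deliberate touch, no position image
        return (False, 0)
    # lower-level touch: show a position image sized by the sensed level
    size = 10 + int(level * 40)
    return (True, size)
```

The same shape of mapping could drive trajectory thickness instead of size, so that a stronger or wider touch draws a thicker trajectory.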
- the position image generated at the position image generation unit 1008 can have different forms depending not only on the sensing level of the sensing unit but also on the position of the position image.
- the form of the position image when the position image is shown over an event-executing object can be different from the normal form of the position image.
- the position image can be modified in form in cases other than when it is over an event-executing object if a particular position of the position image is associated with a particular event.
- the position image can be shown as an overlay, or a preset position image can be synthesized with the image currently shown.
- the setting unit 1010 may serve to provide a setting interface for the various functions for sharing the image shown on the display and the position image. For example, it can provide a setting interface for setting the sensing level classes of the sensing unit and the changes in the position image according to sensing levels, and can also serve to store the settings information. To be more specific, the setting unit 1010 can change the settings such that the sensing unit 1006 only senses whether or not a touch is made and the function for sensing the levels of touch pressure and area is deactivated.
- the control unit 1012 may serve to control the overall operations of the components described above.
- FIG. 11 is a block diagram illustrating the modular composition of a user terminal according to another embodiment of the invention.
- FIG. 11 illustrates the modular composition of a user terminal which transmits the information of the position image separately, instead of transmitting the currently shown image of the user terminal incorporating the position image as in FIG. 10 .
- the position image information generation unit 1108 may generate position information and form information of a position image if the sensing unit 1106 senses a touch or a nearness of a touch means. If the form information of the position image is set beforehand in agreement with the receiving terminal 102 , it would also be possible to generate only the position information of the position image.
- the position information of the position image can include coordinate information of a touch means that is nearby or in contact, with respect to the currently-shown screen of the user terminal 100 , where the corresponding coordinate information can be provided to the receiving terminal 102 by way of the image information transmitting unit 1104 .
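The position image information generation unit's output might be packaged as a small message like the one below. The JSON layout and the normalized-coordinate convention are illustrative assumptions; the disclosure only requires that coordinate information (and optionally form information) reach the receiving terminal.

```python
# Illustrative sketch: packaging the coordinates of a nearby or touching
# touch means, relative to the currently shown screen, for transmission to
# the receiving terminal 102.
import json

def make_position_message(touch_x, touch_y, screen_w, screen_h, form="circle"):
    # Normalizing lets the receiving terminal map the point onto its own
    # (typically larger) display resolution.
    return json.dumps({
        "nx": touch_x / screen_w,
        "ny": touch_y / screen_h,
        "form": form,
    })

msg = make_position_message(540, 960, 1080, 1920)
```

If the form of the position image is agreed in advance, the "form" field could be omitted and only the coordinates sent.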
- a separate module can be included which only transmits the position image information.
- a user terminal can include a display unit 1200 , an image information generation unit 1202 , an image information transmitting unit 1204 , a sensing unit 1206 , a position image generation unit 1208 , a setting unit 1210 , a control unit 1212 , and an information provider unit 1214 .
- the information provider unit 1214 may serve to output a preset type of information in response to a touch of the user or a bringing near of a touch means.
- the information provider unit 1214 can output information that allows the user to recognize the touch pressure and touch area.
- the information provider unit 1214 can output information regarding whether the user performs a touch operation with a strong touch pressure or a weak touch pressure, and can output information regarding whether the touch operation is performed with a wide touch area or a narrow touch area.
- the information provider unit 1214 can output information in a way that stimulates the user's auditory, tactile, or visual sensations.
- the provision of the information can be achieved by using a vibration motor. If a touch is made with a pressure or area smaller than or equal to a preset value, the information provider unit 1214 can inform the user of this fact by way of vibration. If the pressure information or area information is provided in a form that stimulates the user's auditory sensation, the information provider unit can provide the user with such information by way of a speaker. If the pressure information or area information is provided in a form that stimulates the user's visual sensation, the information provider unit can provide the user with such information by way of a light-emitting means on the user terminal 100 .
- the information provider unit 1214 can emit a continuous vibration, sound, or light. That is, when the touch of a touch means having a pressure or area smaller than or equal to a preset value is first recognized, a short vibration may be created once (for a first duration), and if the touch means is moved afterwards, a continuous vibration may be created (for a second duration).
- the second duration can be longer than the first duration.
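The first-duration/second-duration vibration behavior described above can be sketched as follows. The millisecond values and the event names are example assumptions only.

```python
# Hypothetical sketch of the information provider unit's vibration feedback:
# a short pulse when a light touch is first recognized, a longer pulse while
# the touch means moves, and no feedback above a preset pressure/area value.

FIRST_DURATION_MS = 50    # short pulse on first recognition (first duration)
SECOND_DURATION_MS = 200  # longer pulse while moving (second duration)

def vibration_for_event(event: str, level: float, threshold: float = 0.5) -> int:
    """Return a vibration duration in ms, or 0 for no vibration."""
    if level > threshold:
        return 0  # pressure/area above the preset value: no feedback
    if event == "touch_start":
        return FIRST_DURATION_MS
    if event == "touch_move":
        return SECOND_DURATION_MS
    return 0
```

The same dispatch could drive a speaker or a light-emitting means instead of a vibration motor, per the paragraph above.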
- the information provider unit 1214 can provide position image change information in various forms when the position image is changed in accordance with certain conditions.
- the embodiments of the present invention can be implemented in the form of program instructions that may be performed using various computer means and can be recorded in a computer-readable medium.
- a computer-readable medium can include program instructions, data files, data structures, etc., alone or in combination.
- the program instructions recorded on the medium can be designed and configured specifically for the present invention or can be of a kind known to and used by the skilled person in the field of computer software.
- Examples of a computer-readable medium may include magnetic media such as hard disks, floppy disks, magnetic tapes, etc., optical media such as CD-ROMs, DVDs, etc., magneto-optical media such as floptical disks, etc., and hardware devices such as ROM, RAM, flash memory, etc.
Abstract
An image sharing system and a user terminal for the system are disclosed. The user terminal can include: a display unit configured to display an image; a sensing unit configured to sense a position of a touch means near the display unit; and an image information transmitting unit configured to transmit image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means near the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the nearby touch means are displayed on the receiving terminal. Certain embodiments of the invention provide the advantages of enabling image sharing between a user terminal and a different type of terminal and enabling a user to manipulate the user terminal while viewing another terminal.
Description
- This application claims the benefit of Korean Patent Applications Nos. 10-2012-0023012 (filed on Mar. 6, 2012), 10-2012-0022986 (filed on Mar. 6, 2012), 10-2012-0022988 (filed on Mar. 6, 2012), 10-2012-0022984 (filed on Mar. 6, 2012), 10-2012-0024073 (filed on Mar. 8, 2012), 10-2012-0024092 (filed on Mar. 8, 2012), 10-2012-0032982 (filed on Mar. 30, 2012), 10-2012-0033047 (filed on Mar. 30, 2012), 10-2012-0043148 (filed on Apr. 25, 2012), 10-2012-0057996 (filed on May 31, 2012), 10-2012-0057998 (filed on May 31, 2012), and 10-2012-0058000 (filed on May 31, 2012) filed with the Korean Intellectual Property Office. The disclosures of the above applications are incorporated herein by reference in their entirety.
- 1. Technical Field
- The present invention relates to an image sharing system, more particularly to an image sharing system and a user terminal for the system.
- 2. Description of the Related Art
- The use of smart phones is steadily increasing, and for many people, the smart phone has become a personal device that is essential for everyday living. The smart phone provides numerous uses in addition to voice calls, such as gaming, information search, managing personal information, and the like. Moreover, the utility of the smart phone is continuously expanding, as various new applications are being developed.
- With advances in CPU and memory device technology, the smart phone provides the functions of a miniature computer, but due to its size constraints, there is a limit to utilizing the various application programs available.
- For example, certain games may require play on a large screen, and while a smart phone is capable of running such types of games, it may be difficult to play such games on a smart phone due to the limited display size.
- Also, as most smart phones employ a touch-based interface, the executing objects forming the interface may have to be positioned close to one another in a tight arrangement, which often results in touch input errors.
- An aspect of the invention is to propose an image sharing system and a user terminal that enable the sharing of images between the user terminal and a different type of terminal.
- Another aspect of the invention is to propose an image sharing system and a user terminal that enable a user to manipulate the user terminal while looking at the display of another terminal.
- To achieve the objectives above, an embodiment of the invention provides a user terminal that includes: a display unit configured to display an image; a sensing unit configured to sense a position of a touch means near the display unit; and an image information transmitting unit configured to transmit image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means near the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the nearby touch means are displayed on the receiving terminal.
- The user terminal may further include a position image generation unit, which may be configured to show on the display unit a position image that corresponds to a position of the touch means near the display unit.
- The image information transmitting unit may transmit the position information of the touch means by transmitting the image information of the display unit in which the position image generated by the position image generation unit is shown.
- The user terminal may further include a position image information generation unit configured to generate the position information of the touch means near the display unit, where the image information generation unit may transmit the position information of the touch means together with the image information shown on the display unit.
- The user terminal may further include a position image information generation unit configured to generate information regarding an image in which a position image indicating a position of the touch means is incorporated, where the image information transmitting unit may transmit the information generated by the position image information generation unit.
- The image indicating the position of the touch means may be changed in form when it is shown over an event-executing object.
- The image indicating the position of the touch means may be changed in form when it is shown at a preset position.
- The user terminal may further include an information provider unit configured to provide information associated with the change in the position image.
- Another aspect of the invention provides a user terminal that includes: a display unit configured to display an image; a sensing unit configured to sense a position of a touch means touching the display unit; and an image information transmitting unit configured to transmit image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means touching the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the touch means are displayed on the receiving terminal.
- The sensing unit may sense at least one of a touch pressure and a touch area of the touch means, and the image indicating the position of the touch means may be changed according to at least one of the touch pressure and the touch area of the touch means.
- The image indicating the position of the touch means may not be shown if at least one of the touch pressure and the touch area is within a particular level range.
- The image indicating the position of the touch means may be changed in form according to at least one of the touch pressure and the touch area of the touch means.
- The image indicating the position of the touch means may be changed in size according to at least one of the touch pressure and the touch area of the touch means.
- The user terminal may further include a setting unit configured to provide an interface, for setting sensing level classes of the sensing unit and setting changes in a position image according to the sensing levels, and configured to store settings information.
- The user terminal may further include a position image generation unit configured to show a position image, which corresponds to a position of the touch means touching the display unit, on the display unit. Here, the image information generation unit may transmit the position information of the touch means by transmitting the image information of the display unit in which the position image generated by the position image generation unit is shown.
- Still another aspect of the invention provides a method for sharing an image that includes: (a) sensing a position of a touch means near a display unit; and (b) transmitting image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means near the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the nearby touch means are displayed on the receiving terminal.
- Yet another aspect of the invention provides a method for sharing an image that includes: (a) sensing a position of a touch means touching a display unit; and (b) transmitting image information, which relates to an image displayed on the display unit, and position information, which relates to a position of the touch means touching the display unit, to a receiving terminal, where an image corresponding to the image information and an image indicating a position of the touch means are displayed on the receiving terminal.
- Another aspect of the invention provides a recorded medium on which a program of instructions for executing the methods described above is recorded.
- Certain embodiments of the invention enable the sharing of images between a user terminal and a different type of terminal.
- Also, certain embodiments of the invention enable a user to manipulate the user terminal while viewing another terminal.
-
FIG. 1 schematically illustrates the composition of a system for sharing an image between terminals according to an embodiment of the invention. -
FIG. 2 is a flowchart illustrating the operations for sharing an image between a user terminal and a receiving terminal according to an embodiment of the invention. -
FIG. 3 is a flowchart illustrating the operations for sharing an image between a user terminal and a receiving terminal according to another embodiment of the invention. -
FIG. 4 illustrates an example of a position image of touch means according to an embodiment of the invention. -
FIG. 5 illustrates an example of a change in the position image according to an embodiment of the invention. -
FIG. 6 illustrates an example of a change in the position image according to touch level, according to an embodiment of the invention. -
FIG. 7 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to an embodiment of the invention. -
FIG. 8 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to another embodiment of the invention. -
FIG. 9 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to still another embodiment of the invention. -
FIG. 10 is a block diagram illustrating the modular composition of a user terminal according to an embodiment of the invention. -
FIG. 11 is a block diagram illustrating the modular composition of a user terminal according to another embodiment of the invention. -
FIG. 12 is a block diagram illustrating the modular composition of a user terminal according to still another embodiment of the invention. - Certain embodiments of the invention will be described below in more detail with reference to the accompanying drawings.
-
FIG. 1 schematically illustrates the composition of a system for sharing an image between terminals according to an embodiment of the invention. - Referring to
FIG. 1, a system for sharing an image between terminals according to an embodiment of the invention may include a user terminal 100 and a receiving terminal 102. - The
user terminal 100 may be the terminal which provides an image, and may be a terminal that is capable of running application programs, is capable of communicating with other terminals, and is equipped with a display unit, such as a smart phone, a laptop, or a netbook, for example. Preferably, the user terminal 100 can be a terminal that provides a touch interface. - The
receiving terminal 102 may be the terminal which is not directly manipulated by the user but which shares the image shown on the display unit of the user terminal 100. Preferably, the receiving terminal 102 can be a terminal having a display unit that is relatively larger than that of the user terminal 100, and examples of the receiving terminal 102 can include devices such as a TV, a monitor, etc. - The
user terminal 100 and the receiving terminal 102 may be equipped with communication modules that can communicate with each other to share images. The two terminals can communicate by various methods; for example, they can communicate using Wi-Fi. In another example, they could also communicate using a near-field communication method such as Bluetooth or NFC. In still another example, they could also communicate by a wireless HDMI method. The skilled person would appreciate that, if at least one of the user terminal 100 and the receiving terminal 102 is not equipped with a built-in communication module, the communication can also be performed using an external device that is capable of interworking with the user terminal 100 or the receiving terminal 102. - Also, the
user terminal 100 and receiving terminal 102 could communicate via a communication device such as a server. - Primarily, according to an embodiment of the invention, the image shown on the display unit of the
user terminal 100 may be shared on the display unit of the receiving terminal 102. To this end, the user terminal 100 may use a communication module to transmit to the receiving terminal 102 the image information shown on its display unit. The communication module of the receiving terminal 102 may use the image information transmitted from the user terminal 100 to display on the receiving terminal's display unit the same image as that on the user terminal 100. By virtue of this primary function, the image provided by the user terminal 100 can be viewed by the user through the display unit of another device, rather than the display unit of the user terminal 100. - For example, if a video clip is being played on the
user terminal 100, the information on the corresponding video clip can be transmitted to the receivingterminal 102, to allow viewing of the corresponding video clip through the display unit of the receivingterminal 102. If the user wishes to view the video clip on a screen that is larger compared to the display unit of theuser terminal 100, it is possible to utilize a device such as a TV as the receivingterminal 102, to view the video clip through the TV. - Secondarily, according to an embodiment of the invention, not only the image shown on the display unit of the
user terminal 100 but also the position information of a touch means that touches or is near the display unit of the user terminal 100 may be shared between the user terminal 100 and the receiving terminal 102. When a touch means is brought near to or is performing a touch operation on the display unit of the user terminal 100, the information regarding which area the touch means is positioned in may be shared between the two terminals. Using the position information of the touch means thus shared, the receiving terminal 102 may show an image (hereinafter referred to as the "position image") at a point corresponding to the position of the touch means. Here, the touch means can include any type of means that can perform a touch operation to control the terminal, such as a touch pen or a finger.
-
FIG. 4 illustrates an example of a position image of touch means according to an embodiment of the invention. - Referring to
FIG. 4, a position image 400 is shown in the form of a circular shading, at a point corresponding to the position of a touch means that is near or is touching the display unit of the user terminal 100. - The function of sharing the position information of the touch means enables the user to manipulate the
user terminal 100 while looking at the receiving terminal 102 and not the user terminal 100. For example, if the user is using a web browser for web surfing, the user can identify the position of the touch means while looking only at the receiving terminal 102, and can use the position image to select a desired item from a web document while looking at the receiving terminal 102.
- As another example, it is also possible, by using the position image, to manipulate a game running on the
user terminal 100 while looking at the receivingterminal 102. The example inFIG. 4 shows the screen of a racing game, where the user can select a particular menu in the game or manipulate a character by using the position image. - The function by which the image shown on the display unit of the
user terminal 100 and the position image of the touch means are shown on the receiving terminal 102, according to an embodiment of the invention, can be useful when the user wishes to utilize the display unit of another terminal instead of the display unit of the user terminal 100. -
- The position image, which may indicate the position information of a touch means that is near or in contact with the display unit of the
user terminal 100, can also be shown on theuser terminal 100 as well as the receivingterminal 102. In particular, the position image that indicates the position information of a touch means near the user terminal's display unit can be useful in minimizing erroneous touch operations. - According to an embodiment of the invention, the form of the position image of a touch means can be changed according to its position.
- For example, the position image can be shown in a different form if it is positioned over a displayed event-executing object for a content.
FIG. 5 illustrates an example of a change in the position image according to an embodiment of the invention. - When the position image is not positioned over an event-executing object, it may be shown as a
circular image 500 a as illustrated in drawing (A) of FIG. 5, but when the position image is positioned over the "PLAY" button, which is an event-executing object for executing a game, the position image can be changed from the previous circular form to a finger-shaped image 500 b as illustrated in drawing (B) of FIG. 5.
- Of course, the skilled person would appreciate that the form of the position image can be changed under various conditions other than when the position image is over an event-executing object, and by way of the changed position image, the user can be provided with additional information.
- The form of the position image can be changed not only according to the position of the position image but also according to the movement of the touch means. After the position of the touch means is shown as the position image on the display unit of the
user terminal 100 or the receivingterminal 102, when the user changes the position of the touch means while maintaining a state of nearness or contact, a change can be implemented, such as by having the position image changed from a shaded image to a finger image. - The image processing for the change in form of the position image may preferably be implemented at the
user terminal 100. However, the image processing for changing the form of the position image can also be implemented at the receivingterminal 102 as necessary. - The option of whether or not a position image is to be shown on the
user terminal 100 can be selected by the user through a separate setting unit. The user terminal 100 can provide an interface regarding whether or not to show a position image on the user terminal, and this interface may allow an on/off setting with respect to showing the position image at the user terminal.
- A description is provided below of the basic operations performed for sharing an image between a
user terminal 100 and a receiving terminal 102 according to an embodiment of the invention. -
FIG. 2 is a flowchart illustrating the operations for sharing an image between a user terminal and a receiving terminal according to an embodiment of the invention. - Referring to
FIG. 2, the user terminal 100 may first send a request for image-sharing to the receiving terminal 102 (step 200). - The receiving
terminal 102, on receiving the request for image-sharing from theuser terminal 100, may change to a mode that enables communication with the user terminal 100 (step 202) and may transmit information to theuser terminal 100 indicating that the mode change for image-sharing is complete (step 204). - The
user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 206), and the receiving terminal 102 may display an image corresponding to the received image information on its own display unit (step 208). - The
user terminal 100 may sense whether or not a touch means is near (step 210). If a touch means is near, the position information of the nearby touch means may be transmitted to the receiving terminal 102 (step 212). Here, the position information of the touch means that is nearby can be transmitted to the receiving terminal in various ways. - According to an embodiment of the invention, if a position image is to be shown on the
user terminal 100, the image shown on the display unit of theuser terminal 100 with the position image incorporated can itself serve as the position information of the touch means. Since the position image is shown on theuser terminal 100, it is possible to provide the position information of the touch means by transmitting the image itself that is shown on the display unit. - The position image shown on the
user terminal 100 can be useful in preventing touch errors beforehand when a blunt touch means, such as a finger, is used. - According to another embodiment of the invention, the position information of the touch means can include coordinate information and form information of the position image. The
user terminal 100 can output the coordinate information and form information of the position image and provide them to the receiving terminal 102. Here, the coordinate information of the position image can include the pixel coordinates at which the position image is to be shown in the image displayed on the display unit of the user terminal 100. Of course, the coordinate information of the position image can be provided in various ways other than by using pixel coordinates. - According to still another embodiment of the invention, if no particular position image is to be shown on the
user terminal 100, a position image can be synthesized into the image currently shown on the display unit to generate a separate image incorporating the position image, and the synthesized image thus generated can correspond to the position information of the touch means. - When the position information of the touch means is received from the
user terminal 100, the receiving terminal may show a position image corresponding to the position of the touch means (step 214). Upon receiving information on the image itself shown on the display unit of the user terminal that includes a position image or the image with the position image synthesized therein from the user terminal, the receivingterminal 102 can display the corresponding image and thereby show the position image. If the coordinate and form information of the position image is provided separately from theuser terminal 100, the receivingterminal 102 may generate and show the position image at the corresponding coordinate position. -
FIG. 3 is a flowchart illustrating the operations for sharing an image between a user terminal and a receiving terminal according to another embodiment of the invention. - Unlike the embodiment illustrated in
FIG. 2, the embodiment in FIG. 3 relates to an example of showing a position image if the touch means directly touches the display unit of the user terminal 100. - In
FIG. 3, the operations other than those for sensing the touch means are substantially the same as those described with reference to FIG. 2, and as such, only the portions related to the sensing operation will be described here. - The
user terminal 100 may sense whether or not a touch means is touching the display unit of the user terminal 100 (step 310). If the touch of a touch means is sensed, the touch level of the touch means may be sensed (step 312). Here, a touch level may refer to at least one of a touch pressure and a touch area of the touch means. - When the touch level of the touch means is sensed, the position image that is to be displayed may be determined in correspondence to the sensed touch level (step 314).
- According to an embodiment of the invention, if at least one of the sensed touch pressure and touch area belongs to a preset first level class, then a position image may be shown that corresponds to the touch point, and if at least one of the sensed touch pressure and touch area belongs to a preset second level class, then a position image may not be shown even if a touch is sensed.
- For example, if at least one of the touch pressure and the touch area is greater than or equal to a preset threshold, a position image may not be shown even though a touch is sensed, and if it is below a preset threshold, a position image may be shown that corresponds to the touch point.
- Of course, the touch level classes for changing the position image can be divided further, and the position image can be changed for each of the touch level classes. For instance, the size of the position image can be adjusted in proportion to or in inverse proportion to the sensitivity of the touch level.
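The level-class logic described above can be sketched as follows in Python; the threshold value, normalized pressure scale, and base image size are illustrative assumptions for this sketch, not values from the disclosure.

```python
FIRST_LEVEL_MAX = 0.5  # assumed boundary between the first and second level classes


def classify_touch(pressure: float) -> int:
    """Return 1 for the first (lighter) level class, 2 for the second."""
    return 1 if pressure < FIRST_LEVEL_MAX else 2


def position_image_size(pressure: float, base_size: int = 24) -> int:
    """Size of the position image for a given normalized touch pressure.

    A first-class (light) touch shows a position image scaled in
    proportion to the touch level; a second-class (firm) touch hides
    the position image entirely, returning 0.
    """
    if classify_touch(pressure) == 2:
        return 0  # firm touch: treat as a selection, show no position image
    return int(base_size * (pressure / FIRST_LEVEL_MAX))
```

The same structure accommodates further level classes by adding thresholds, with the image size adjusted in proportion or inverse proportion to the sensed level.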
-
FIG. 6 illustrates an example of a change in the position image according to touch level, according to an embodiment of the invention. - Referring to
FIG. 6, drawing (A) illustrates a screen shown on the receiving terminal when the touch level (touch pressure or touch area) belongs to a first level class, while drawing (B) illustrates a screen shown on the receiving terminal when the touch level belongs to a second level class that is higher than the first level class. Obviously, the same screen can be shown on the user terminal as well according to the user's selection. - When the user touches the display unit of the user terminal with a relatively light touch pressure (corresponding to the first level class), the position image can be shown as in drawing (A).
- However, when the user increases the touch pressure and touches the display unit of the user terminal with a pressure corresponding to the second level class, the position image may not be shown, as is the case in drawing (B).
- The above descriptions relating to changes in the position image according to touch level are for illustrative purposes, and the skilled person would appreciate that numerous variations are possible other than the embodiments illustrated above.
- A description is provided below in further detail regarding a method of sensing a nearness or a touch operation of a touch means.
- According to an embodiment of the invention, the nearness of a touch means can be sensed using capacitance. In this case, a capacitive type touch panel may be used, and if the user brings a touch means, such as a finger or a touch pen, near the touch panel, it is possible to sense the change in capacitance and thus sense the position of the nearby touch means.
- According to another embodiment of the invention, an ultrasonic method can be used. For example, a touch unit on a touch pen may emit infrared rays and ultrasonic waves, which may be received by a receiver equipped with an infrared sensor and two ultrasonic sensors to sense the movement and position information of the touch pen.
- Looking at the method by which the receiver may detect the position of the touch pen, the three sensors may respectively measure the transmission time of the infrared rays and the transmission time of the ultrasonic waves, convert the transmission times into distances, and then detect the position of the touch pen using the converted distances by a method such as triangulation, etc.
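The time-to-distance-to-position chain described above can be sketched as follows; the sensor layout (two ultrasonic sensors on a shared baseline), the speed-of-sound value, and the assumption that the infrared pulse arrives effectively instantaneously are illustrative choices, not details fixed by this disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, assumed ambient value


def pen_position(t1: float, t2: float, sensor_gap: float) -> tuple:
    """Locate the touch pen from two ultrasonic transit times.

    The infrared pulse marks the emission instant, so t1 and t2 are the
    ultrasonic transit times to sensors placed at (0, 0) and
    (sensor_gap, 0). Converting times to distances and intersecting the
    two circles yields the pen position in the half-plane in front of
    the sensor bar.
    """
    d1 = SPEED_OF_SOUND * t1  # distance to the first sensor
    d2 = SPEED_OF_SOUND * t2  # distance to the second sensor
    # Circle intersection: d1^2 = x^2 + y^2, d2^2 = (x - gap)^2 + y^2
    x = (d1 ** 2 - d2 ** 2 + sensor_gap ** 2) / (2 * sensor_gap)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))
    return (x, y)
```

A pen 0.1 m from each of two sensors spaced 0.12 m apart, for example, resolves to the point (0.06, 0.08) in front of the baseline.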
- According to still another embodiment of the invention, electromagnetic induction can be used to sense a nearness of a touch pen or a touch operation of a touch pen. When a touch pen having a metal coil is brought near the touch panel, electromagnetic induction may occur between the touch pen and the touch panel, and it is possible to sense whether or not the touch pen is near by way of the alteration in the electromagnetic field caused by this electromagnetic induction.
- Of course, the skilled person would appreciate that various sensing methods other than those described above can also be employed, such as methods using a resistive film, optical methods, etc.
- A description is provided below in further detail regarding a method of exchanging image information between a
user terminal 100 and a receiving terminal 102. -
FIG. 7 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to an embodiment of the invention. - Referring to
FIG. 7, the user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 700). - Using the image information received from the
user terminal 100, the receiving terminal 102 may display an image on its display unit (step 702). The user terminal 100 can transmit or receive image information by way of an HDMI (High-Definition Multimedia Interface) method, for instance. Various methods for exchanging multimedia data other than HDMI can also be used. - When the user brings a touch means near to or in contact with the display unit of the
user terminal 100, the user terminal 100 may show a position image in correspondence to the position of the touch means (step 704). - The position image can be shown as an overlay, or the position image of a preset form can be synthesized with the image shown on the display unit.
- The
user terminal 100 may transmit information to the receiving terminal 102 regarding the current display image in which the position image is shown (step 706). As described above, the position image can be shown on the display unit of the user terminal 100 as an overlay or can be shown on the display unit of the user terminal 100 by image synthesis. The user terminal 100 may transmit the display image currently shown after encoding it into a preset format. - The receiving
terminal 102 may receive the current display image information of the user terminal 100 from the user terminal 100 and may display an image incorporating the position image (step 708). - If the image information is exchanged according to the embodiment illustrated in
FIG. 7, the user terminal 100 and the receiving terminal 102 may always display the same image on their respective displays. -
FIG. 8 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to another embodiment of the invention. - Referring to
FIG. 8, the user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 800). - Using the image information received from the
user terminal 100, the receiving terminal 102 may display an image on its display unit (step 802). - When the user brings a touch means near to or in contact with the display unit of the
user terminal 100, the user terminal 100 may sense the contact or nearness of the touch means (step 804). - When a touch or a nearness of the touch means is sensed, the user terminal may calculate the position information of the touch means of which the touch or nearness is sensed (step 806). The position information of the touch means can be calculated by various methods.
- For instance, the position information of the touch means can be set as the coordinates of the touch means which represent the relative position of the touch means when the display unit of the
user terminal 100 is set as the entire coordinate range. - In another example, the coordinates of the touch means can also be set by using the pixel coordinates of the image currently shown on the user terminal's display unit. For example, when the position of the touch means corresponds to a particular pixel of the currently-shown image, the coordinates of the corresponding pixel can be set as the coordinates of the touch means.
- Of course, the skilled person would appreciate that the coordinates of the touch means can be calculated by various methods other than those described above.
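The two coordinate conventions above can be illustrated with a short sketch; the display and image dimensions are made-up example values, and the mapping assumes the displayed image is scaled to fill the display.

```python
def relative_coords(touch_px, display_size):
    """Express the touch point as a fraction of the display area.

    touch_px is an (x, y) pixel position on the display, and
    display_size is the (width, height) of the display in pixels.
    """
    (tx, ty), (w, h) = touch_px, display_size
    return (tx / w, ty / h)


def pixel_coords(touch_px, display_size, image_size):
    """Map the touch point onto the pixel grid of the displayed image."""
    rx, ry = relative_coords(touch_px, display_size)
    iw, ih = image_size
    return (int(rx * iw), int(ry * ih))
```

The first function corresponds to treating the display unit as the entire coordinate range; the second corresponds to expressing the touch point in the pixel coordinates of the currently shown image.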
- When the position information of the touch means is calculated, the user terminal may transmit the position information of the touch means to the receiving terminal 102 (step 808).
- For instance, when the image information shown on the user terminal is transmitted, the position information of the touch means can be provided as header information for the corresponding image information.
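A minimal sketch of this header approach follows; the layout chosen here (a 4-byte marker, two 16-bit coordinates, and a one-byte form code prepended to each encoded image payload) is an assumption for illustration, since the actual header format would be whatever the two terminals agree upon.

```python
import struct

# Assumed header layout: marker, x coordinate, y coordinate, form code.
HEADER_FMT = ">4sHHB"


def pack_frame(x, y, form, image_bytes):
    """Prepend the touch-means position to the encoded image payload."""
    return struct.pack(HEADER_FMT, b"POSI", x, y, form) + image_bytes


def unpack_frame(frame):
    """Split a received frame back into position info and image payload."""
    size = struct.calcsize(HEADER_FMT)
    marker, x, y, form = struct.unpack(HEADER_FMT, frame[:size])
    if marker != b"POSI":
        raise ValueError("not a position-tagged frame")
    return (x, y, form, frame[size:])
```

Carrying the position in the header keeps the image and position information synchronized per frame, at the cost of touching every frame even when the touch means has not moved.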
- In another example, the position information of the touch means can also be transmitted to the receiving
terminal 102 through a separate data channel. In cases where the position information of the touch means is transmitted through a data channel, the user terminal 100 and the receiving terminal 102 may have to establish two channels, i.e. an image channel (a channel for transmitting image data) and a control channel (a channel for transmitting position information), when establishing a connection. - Upon receiving the position information of the touch means from the
user terminal 100, the receiving terminal 102 may use the position information of the touch means to show a position image on the display unit of the receiving terminal 102 (step 810). - Although it is not illustrated in
FIG. 8, the receiving terminal 102 can receive not only the position information of the position image but also the form information of the position image. As described above, the position image can change in size or form according to its position, and such form information of the position image can also be provided through the user terminal. As in an example described above, if the position image is positioned over a particular event-executing object, a position image having a different form from that of a regular position image can be provided, and such changed form information can also be provided together with the position information of the touch means. -
FIG. 9 is a flowchart illustrating the process for exchanging image information between a user terminal 100 and a receiving terminal 102 according to still another embodiment of the invention. - Referring to
FIG. 9, the user terminal 100 may transmit the image information shown on the display unit to the receiving terminal 102 (step 900). - Using the image information received from the
user terminal 100, the receiving terminal 102 may display an image on its display unit (step 902). - When the user brings a touch means near to or in contact with the display unit of the
user terminal 100, the user terminal 100 may sense the contact or nearness of the touch means (step 904). - When a touch or a nearness of the touch means is sensed, the position of the touch means may be identified, and image information may be generated that incorporates a position image for the identified touch means (step 906). The image information thus generated may preferably be image information that is encoded in a predetermined format agreed upon with the receiving terminal.
- The image information generated to incorporate a position image may be displayed on the user terminal or may not be shown on the display of the user terminal.
- The
user terminal 100 may transmit the image information, generated to incorporate a position image, to the receiving terminal 102 (step 908). - The receiving
terminal 102 may use the received image information to display an image incorporating a position image on its display (step 910). - Certain methods for providing information regarding the position image to the receiving terminal have been described above. The embodiments described above are for illustrative purposes, and the skilled person would appreciate that the information relating to the position image can be provided by using various communication methods other than those of the illustrative embodiments described above.
- A description is provided below of the detailed modular composition of a user terminal to which an embodiment of the invention may be applied. The user terminal to which an embodiment of the invention is applied can operate according to the following descriptions after a particular application is installed, or the firmware for executing the operations described below can be installed at the time of the terminal's manufacture.
-
FIG. 10 is a block diagram illustrating the modular composition of a user terminal according to an embodiment of the invention. - Referring to
FIG. 10, a user terminal according to an embodiment of the invention can include a display unit 1000, an image information generation unit 1002, an image information transmitting unit 1004, a sensing unit 1006, a position image generation unit 1008, a setting unit 1010, and a control unit 1012. - The
display unit 1000 may display an image according to the operation of the user terminal 100. The display unit 1000 may display images using various apparatuses such as LCD or LED, and may provide a touch interface. The display unit may display various screens according to the operation of the terminal such as a user interface screen, an application execution screen, etc. - The image
information generation unit 1002 may generate information regarding the image displayed on the display unit 1000. The image information generation unit 1002 may generate image information that is encoded in a preset format, for which various known encoding methods can be used. - The image
information transmitting unit 1004 may transmit the image information generated by the image information generation unit 1002 to the receiving terminal 102. As described above, the transmission of image information can be implemented by various communication methods such as Wi-Fi, wireless HDMI, etc. The image information transmitting unit 1004 may also serve to transmit information regarding the position image. - The
sensing unit 1006 may serve to sense a touch means such as a finger or a touch pen, etc. The sensing unit 1006 may sense whether or not a touch means is brought near the display unit 1000 and whether or not a touch is made on the display unit 1000. - According to a preferred embodiment of the invention, the
sensing unit 1006 may sense a touch state of a touch means, more specifically, at least one of a touch pressure and a touch area. As described above, levels can be set beforehand for the touch pressure and touch area, and the sensing unit 1006 may sense the level to which at least one of the touch pressure and touch area corresponds. For instance, a first level class corresponding to a low pressure and a second level class corresponding to a high pressure can be set, and the sensing unit can sense which level class, between the first level class and the second level class, a touch pressure corresponds to. - The
user terminal 100 can determine whether or not to execute an action according to a touch operation in correspondence to the touch pressure and touch area sensed by the sensing unit 1006. For example, if at least one of the touch pressure and the touch area belongs to a lower level, then the user terminal 100 may not execute an action according to the touch, and if it belongs to a higher level, an action corresponding to the touch can be performed. That is, even if a touch is made on an event-executing object, a touch operation may be performed only when at least one of the touch pressure or the touch area is greater than or equal to a preset level. - The position
image generation unit 1008 may generate a position image corresponding to the position of a touch means if the sensing unit 1006 senses that the touch means is near or making a touch. -
- According to an embodiment of the invention, the position image of a preset form can be shown as an overlay on the image currently shown, or the position image can be synthesized with the currently-shown image.
- The position image generated by the position
image generation unit 1008 can be changed according to the sensing level sensed by the sensing unit 1006. - For instance, if the touch pressure or the touch area is of a lower level, the position
image generation unit 1008 may generate and show a position image, and if the touch pressure or the touch area is of a higher level, it may not show the position image. - In another example, if the touch pressure or the touch area is of a lower level, the position
image generation unit 1008 may generate a position image having a small size, and if the touch pressure or the touch area is of a higher level, the position image generation unit 1008 may generate a position image having a larger size. - In cases where the user maintains a touch state while moving the touch means, the position image can be an image that shows the touch trajectory, where a thick trajectory can be shown if the touch pressure or touch area is large while a thin trajectory can be shown if the touch pressure or touch area is small.
- In still another example, position images having different forms can be provided when the touch pressure and touch area are of a lower level and when they are of a higher level.
- As described above, the position image generated at the position
image generation unit 1008 can have different forms depending not only on the sensing level of the sensing unit but also on the position of the position image. For example, the form of the position image when the position image is shown over an event-executing object can be different from the normal form of the position image. The position image can be modified in form in cases other than when it is over an event-executing object if a particular position of the position image is associated with a particular event. - The position image can be shown as an overlay, or a preset position image can be synthesized with the image currently shown.
- If the position image itself is shown on the user terminal as in the embodiment illustrated in
FIG. 10, the information regarding the position image can be provided as the image information generation unit 1002 generates the image information regarding the image currently shown. - The
setting unit 1010 may serve to provide a setting interface for the various functions for sharing the image shown on the display and the position image. For example, it can provide a setting interface for setting the sensing level classes of the sensing unit and the changes in the position image according to sensing levels, and can also serve to store the settings information. To be more specific, the setting unit 1010 can change the settings such that the sensing unit 1006 only senses whether or not a touch is made and the function for sensing the levels of touch pressure and area is deactivated. - The
control unit 1012 may serve to control the overall operations of the components described above. -
FIG. 11 is a block diagram illustrating the modular composition of a user terminal according to another embodiment of the invention. - Referring to
FIG. 11, a user terminal according to another embodiment of the invention can include a display unit 1100, an image information generation unit 1102, an image information transmitting unit 1104, a sensing unit 1106, a position image information generation unit 1108, a setting unit 1110, and a control unit 1112. - In describing
FIG. 11, the components that are the same as in the embodiment illustrated in FIG. 10 will not be described again. -
FIG. 11 illustrates the modular composition of a user terminal which transmits the information of the position image separately, instead of transmitting the currently shown image of the user terminal incorporating the position image as in FIG. 10. - In the user terminal of
FIG. 11, the position image information generation unit 1108 may generate position information and form information of a position image if the sensing unit 1106 senses a touch or a nearness of a touch means. If the form information of the position image is set beforehand in agreement with the receiving terminal 102, it would also be possible to generate only the position information of the position image. - As described above, the position information of the position image can include coordinate information of a touch means that is nearby or in contact, with respect to the currently-shown screen of the
user terminal 100, where the corresponding coordinate information can be provided to the receiving terminal 102 by way of the image information transmitting unit 1104. Of course, a separate module can be included which only transmits the position image information. - The position image information can be provided through a different channel from that used for transmitting the information of the image currently shown on the display of the
user terminal 100, and if the same channel is used, the position image information can be included in the header of the image information. -
FIG. 12 is a block diagram illustrating the modular composition of a user terminal according to still another embodiment of the invention. - Referring to
FIG. 12, a user terminal according to still another embodiment of the invention can include a display unit 1200, an image information generation unit 1202, an image information transmitting unit 1204, a sensing unit 1206, a position image generation unit 1208, a setting unit 1210, a control unit 1212, and an information provider unit 1214. - The user terminal illustrated in
FIG. 12 additionally includes an information provider unit, compared to the user terminal illustrated in FIG. 10. - The
information provider unit 1214 may serve to output a preset type of information in response to a touch of the user or a touch means being brought near. For example, the information provider unit 1214 can output information that allows the user to recognize the touch pressure and touch area. To be more specific, the information provider unit 1214 can output information regarding whether the user performs a touch operation with a strong touch pressure or a weak touch pressure, and can output information regarding whether the touch operation is performed with a wide touch area or a narrow touch area. - The
information provider unit 1214 can output information in a way that stimulates the user's auditory, tactile, or visual sensations. - If the pressure information or area information is provided in a form that stimulates the user's tactile sensation, the provision of the information can be achieved by using a vibration motor. If a touch is made with a pressure or area smaller than or equal to a preset value, the
information provider unit 1214 can inform the user of this fact by way of vibration. If the pressure information or area information is provided in a form that stimulates the user's auditory sensation, the information provider unit can provide the user with such information by way of a speaker. If the pressure information or area information is provided in a form that stimulates the user's visual sensation, the information provider unit can provide the user with such information by way of a light-emitting means on the user terminal 100. - To be more specific, during a movement of a touch means which maintains contact with a touch pressure or a touch area smaller than or equal to a preset value, the
information provider unit 1214 can emit a continuous vibration, sound, or light. That is, when the touch of a touch means having a pressure or area smaller than or equal to a preset value is first recognized, a short vibration may be created once (for a first duration), and if the touch means is moved afterwards, a continuous vibration may be created (for a second duration). Here, the second duration can be longer than the first duration. - Also, the
information provider unit 1214 can provide position image change information in various forms when the position image is changed in accordance with certain conditions. - The components of the embodiments described above can also be easily understood from the perspective of processes. That is, the components can each be understood as a process.
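The two vibration durations described above can be sketched as a simple event-to-duration mapping; the event names and duration values are illustrative assumptions, and an actual terminal would route the result to its vibration motor.

```python
FIRST_DURATION = 0.05   # seconds: short pulse when the light touch is first recognized
SECOND_DURATION = 0.30  # seconds: longer pulse while the touch means is moving

def feedback_duration(event: str) -> float:
    """Vibration duration for a touch made below the preset pressure/area value."""
    if event == "touch_start":
        return FIRST_DURATION   # first recognition: one short vibration
    if event == "touch_move":
        return SECOND_DURATION  # movement: continuous (longer) vibration
    return 0.0                  # no feedback for other events
```

The constraint that the second duration exceed the first keeps the initial pulse distinguishable from the movement feedback.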
The embodiments of the present invention can be implemented in the form of program instructions that may be performed using various computer means and can be recorded in a computer-readable medium. Such a computer-readable medium can include program instructions, data files, data structures, etc., alone or in combination. The program instructions recorded on the medium can be designed and configured specifically for the present invention or can be of a type known to and used by the skilled person in the field of computer software. Examples of a computer-readable medium may include magnetic media such as hard disks, floppy disks, magnetic tapes, etc., optical media such as CD-ROMs, DVDs, etc., magneto-optical media such as floptical disks, etc., and hardware devices such as ROM, RAM, flash memory, etc. Examples of program instructions may include not only machine language codes produced by a compiler but also high-level language codes that can be executed by a computer through the use of an interpreter, etc. The hardware mentioned above can be made to operate as one or more software modules that perform the actions of the embodiments of the invention, and vice versa.
Claims (29)
1. A user terminal comprising:
a display unit configured to display an image;
a sensing unit configured to sense a position of a touch means near the display unit; and
an image information transmitting unit configured to transmit image information and position information to a receiving terminal, the image information relating to an image displayed on the display unit and the position information relating to a position of the touch means near the display unit,
wherein an image corresponding to the image information and an image indicating a position of the nearby touch means are displayed on the receiving terminal.
2. The user terminal of claim 1, further comprising:
a position image generation unit configured to show a position image on the display unit, the position image corresponding to a position of the touch means near the display unit.
3. The user terminal of claim 2, wherein the image information transmitting unit transmits the position information of the touch means by transmitting the image information of the display unit having shown therein the position image generated by the position image generation unit.
4. The user terminal of claim 1, further comprising:
a position image information generation unit configured to generate the position information of the touch means near the display unit,
wherein the image information generation unit transmits the position information of the touch means together with the image information shown on the display unit.
5. The user terminal of claim 1, further comprising:
a position image information generation unit configured to generate information regarding an image incorporating a position image indicating a position of the touch means,
wherein the image information transmitting unit transmits the information generated by the position image information generation unit.
6. The user terminal of claim 1, wherein the image indicating the position of the touch means is changed in form when shown over an event-executing object.
7. The user terminal of claim 1, wherein the image indicating the position of the touch means is changed in form when shown at a preset position.
8. The user terminal of claim 6, further comprising:
an information provider unit configured to provide information associated with the change in the image indicating the position of the touch means.
9. A user terminal comprising:
a display unit configured to display an image;
a sensing unit configured to sense a position of a touch means touching the display unit; and
an image information transmitting unit configured to transmit image information and position information to a receiving terminal, the image information relating to an image displayed on the display unit and the position information relating to a position of the touch means touching the display unit,
wherein an image corresponding to the image information and an image indicating a position of the touch means are displayed on the receiving terminal.
10. The user terminal of claim 9, wherein the sensing unit senses at least one of a touch pressure and a touch area of the touch means, and the image indicating the position of the touch means is changed according to at least one of the touch pressure and the touch area of the touch means.
11. The user terminal of claim 10, wherein the image indicating the position of the touch means is not shown if at least one of the touch pressure and the touch area is within a particular level range.
12. The user terminal of claim 10, wherein the image indicating the position of the touch means is changed in form according to at least one of the touch pressure and the touch area of the touch means.
13. The user terminal of claim 10, wherein the image indicating the position of the touch means is changed in size according to at least one of the touch pressure and the touch area of the touch means.
14. The user terminal of claim 10, further comprising:
a setting unit configured to provide an interface for setting sensing level classes of the sensing unit and setting changes in a position image according to the sensing levels and configured to store settings information.
15. The user terminal of claim 10, further comprising:
a position image generation unit configured to show a position image on the display unit, the position image corresponding to a position of the touch means touching the display unit,
wherein the image information generation unit transmits the position information of the touch means by transmitting the image information of the display unit having shown therein the position image generated by the position image generation unit.
16. The user terminal of claim 10, further comprising:
a position image information generation unit configured to generate the position information of the touch means touching the display unit,
wherein the image information generation unit transmits the position information of the touch means together with the image information shown on the display unit.
17. A method for sharing an image, the method comprising:
(a) sensing a position of a touch means near a display unit; and
(b) transmitting image information and position information to a receiving terminal, the image information relating to an image displayed on the display unit and the position information relating to a position of the touch means near the display unit,
wherein an image corresponding to the image information and an image indicating a position of the nearby touch means are displayed on the receiving terminal.
18. The method of claim 17, further comprising:
generating a position image corresponding to the position of the touch means near the display unit,
wherein said step (b) comprises transmitting the position information of the touch means by transmitting the image information of the display unit having shown therein the generated position image.
19. The method of claim 17, further comprising:
generating position image information, the position image information comprising the position information of the touch means near the display unit,
wherein said step (b) comprises transmitting the position information of the touch means together with the image information shown on the display unit.
20. The method of claim 17, further comprising:
generating position image information, the position image information comprising information regarding an image having incorporated therein a position image indicating the position of the touch means,
wherein said step (b) comprises transmitting the information regarding the image having the position image incorporated therein.
21. The method of claim 17, wherein the image indicating the position of the touch means is changed in form when shown over an event-executing object.
22. The method of claim 17, wherein the image indicating the position of the touch means is changed in form when shown at a preset position.
23. The method of claim 21, further comprising:
providing information associated with the change in the position image.
24. A method for sharing an image, the method comprising:
(a) sensing a position of a touch means touching a display unit; and
(b) transmitting image information and position information to a receiving terminal, the image information relating to an image displayed on the display unit and the position information relating to a position of the touch means touching the display unit,
wherein an image corresponding to the image information and an image indicating a position of the touch means are displayed on the receiving terminal.
25. The method of claim 24, wherein said step (a) comprises sensing at least one of a touch pressure and a touch area of the touch means, and the image indicating the position of the touch means is changed according to at least one of the touch pressure and the touch area of the touch means.
26. The method of claim 25, wherein the image indicating the position of the touch means is not shown if at least one of the touch pressure and the touch area is within a particular level range.
27. The method of claim 25, wherein the image indicating the position of the touch means is changed in form according to at least one of the touch pressure and the touch area of the touch means.
28. The method of claim 25, wherein the image indicating the position of the touch means is changed in size according to at least one of the touch pressure and the touch area of the touch means.
29. A recorded medium having recorded thereon and tangibly embodying a program of instructions for executing the method for sharing an image according to claim 17.
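The claimed method can be summarized as: the sending terminal packages the displayed image information together with the touch position (and, in claims 24-28, the touch pressure and area), and the receiving terminal decides how, or whether, to draw a pointer image from those values. A minimal Python sketch of that flow follows; all names (`TouchSample`, `SharePacket`, `render_pointer`, the hide range and radius scaling) are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TouchSample:
    position: Tuple[int, int]   # (x, y) of the touch means on the sender's display unit
    pressure: float             # normalized touch pressure, 0.0-1.0
    area: float                 # normalized touch area, 0.0-1.0

@dataclass
class SharePacket:
    image_info: bytes           # encoded frame shown on the display unit (step (b))
    touch: TouchSample          # position information transmitted alongside the image

def render_pointer(touch: TouchSample,
                   hide_range: Tuple[float, float] = (0.0, 0.1),
                   base_radius: int = 8) -> Optional[dict]:
    """Receiver-side pointer rendering, sketching claims 25-28:
    - no pointer when the pressure falls within a particular level range (claim 26);
    - pointer size scales with touch pressure and area (claim 28).
    The 0.0-0.1 hide range and the linear scaling are arbitrary illustrative choices."""
    lo, hi = hide_range
    if lo <= touch.pressure <= hi:
        return None  # claim 26: pointer image not shown in this pressure range
    return {
        "position": touch.position,
        "radius": int(base_radius * (1.0 + touch.pressure + touch.area)),
    }

# Sender side: bundle image information with the sensed touch (claim 24, step (b)).
packet = SharePacket(image_info=b"<frame>", touch=TouchSample((120, 48), 0.6, 0.2))
# Receiver side: overlay the pointer image on the decoded frame.
pointer = render_pointer(packet.touch)
```

A hover-based variant (claims 17-23) would carry the same packet but with proximity data instead of pressure, and the receiver would additionally change the pointer's form when it overlaps an event-executing object.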
Applications Claiming Priority (24)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0022986 | 2012-03-06 | ||
KR1020120022984A KR101951484B1 (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and reception device used in the same |
KR1020120022986A KR20130101886A (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and reception device used in the same |
KR10-2012-0023012 | 2012-03-06 | ||
KR1020120023012A KR101151549B1 (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0022988 | 2012-03-06 | ||
KR10-2012-0022984 | 2012-03-06 | ||
KR1020120022988A KR101212364B1 (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0024092 | 2012-03-08 | ||
KR20120024073 | 2012-03-08 | ||
KR20120024092 | 2012-03-08 | ||
KR10-2012-0024073 | 2012-03-08 | ||
KR10-2012-0032982 | 2012-03-30 | ||
KR10-2012-0033047 | 2012-03-30 | ||
KR20120032982 | 2012-03-30 | ||
KR20120033047 | 2012-03-30 | ||
KR10-2012-0043148 | 2012-04-25 | ||
KR20120043148 | 2012-04-25 | ||
KR1020120058000A KR101384493B1 (en) | 2012-03-08 | 2012-05-31 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0058000 | 2012-05-31 | ||
KR10-2012-0057998 | 2012-05-31 | ||
KR1020120057998A KR101337665B1 (en) | 2012-03-08 | 2012-05-31 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0057996 | 2012-05-31 | ||
KR20120057996 | 2012-05-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130241854A1 true US20130241854A1 (en) | 2013-09-19 |
Family
ID=49113653
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/785,370 Abandoned US20130234959A1 (en) | 2012-03-06 | 2013-03-05 | System and method for linking and controlling terminals |
US13/785,498 Expired - Fee Related US8913026B2 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US13/785,302 Abandoned US20130241854A1 (en) | 2012-03-06 | 2013-03-05 | Image sharing system and user terminal for the system |
US13/785,600 Abandoned US20130234984A1 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US14/155,204 Abandoned US20140125622A1 (en) | 2012-03-06 | 2014-01-14 | System for linking and controlling terminals and user terminal used in the same |
US15/052,803 Abandoned US20160170703A1 (en) | 2012-03-06 | 2016-02-24 | System and method for linking and controlling terminals |
US16/142,998 Active US10656895B2 (en) | 2012-03-06 | 2018-09-26 | System for linking and controlling terminals and user terminal used in the same |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/785,370 Abandoned US20130234959A1 (en) | 2012-03-06 | 2013-03-05 | System and method for linking and controlling terminals |
US13/785,498 Expired - Fee Related US8913026B2 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/785,600 Abandoned US20130234984A1 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US14/155,204 Abandoned US20140125622A1 (en) | 2012-03-06 | 2014-01-14 | System for linking and controlling terminals and user terminal used in the same |
US15/052,803 Abandoned US20160170703A1 (en) | 2012-03-06 | 2016-02-24 | System and method for linking and controlling terminals |
US16/142,998 Active US10656895B2 (en) | 2012-03-06 | 2018-09-26 | System for linking and controlling terminals and user terminal used in the same |
Country Status (1)
Country | Link |
---|---|
US (7) | US20130234959A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150212621A1 (en) * | 2014-01-24 | 2015-07-30 | Industrial Technology Research Institute | Touch panel controlling method and system |
US9335862B1 (en) * | 2014-11-14 | 2016-05-10 | International Business Machines Corporation | Virtual multi-device navigation in surface computing system |
CN105898462A (en) * | 2015-12-11 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | Method and device capable of operating video display device |
US9766849B2 (en) | 2014-11-03 | 2017-09-19 | Samsung Electronics Co., Ltd. | User terminal device and method for control thereof and system for providing contents |
US9830030B2 (en) | 2015-05-07 | 2017-11-28 | Industrial Technology Research Institute | Flexible touch panel, touch control device and operating method using the same |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130234959A1 (en) | 2012-03-06 | 2013-09-12 | Industry-University Cooperation Foundation Hanyang University | System and method for linking and controlling terminals |
US20150109257A1 (en) * | 2013-10-23 | 2015-04-23 | Lumi Stream Inc. | Pre-touch pointer for control and data entry in touch-screen devices |
CN104683863A (en) * | 2013-11-28 | 2015-06-03 | 中国移动通信集团公司 | Method and equipment for multimedia data transmission |
KR20150069155A (en) * | 2013-12-13 | 2015-06-23 | 삼성전자주식회사 | Touch indicator display method of electronic apparatus and electronic appparatus thereof |
US20150199030A1 (en) * | 2014-01-10 | 2015-07-16 | Microsoft Corporation | Hover-Sensitive Control Of Secondary Display |
CN104793912A (en) * | 2014-01-22 | 2015-07-22 | 宏碁股份有限公司 | Operation method and operation system |
KR102187027B1 (en) * | 2014-06-25 | 2020-12-04 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
TWI533190B (en) * | 2014-09-23 | 2016-05-11 | 緯創資通股份有限公司 | Touch sensing apparatus, touch system and touch sensing method |
US20160092152A1 (en) * | 2014-09-25 | 2016-03-31 | Oracle International Corporation | Extended screen experience |
CN105183284B (en) | 2015-08-27 | 2018-07-20 | 广东欧珀移动通信有限公司 | A kind of method and user terminal for checking short message |
CN105426178B (en) * | 2015-11-02 | 2019-03-08 | Oppo广东移动通信有限公司 | Display system, the display methods of terminal system of terminal |
US20170160866A1 (en) * | 2015-12-08 | 2017-06-08 | Innolux Corporation | Touch display device |
KR101771837B1 (en) * | 2016-02-17 | 2017-08-25 | (주)휴맥스 | Remote controller providing force input in a media system and method of driving the same |
KR101886209B1 (en) * | 2016-04-19 | 2018-08-08 | (주)휴맥스 | Apparatus and method of providing media service |
KR101775829B1 (en) * | 2016-05-16 | 2017-09-06 | (주)휴맥스 | Computer processing device and method for determining coordinate compensation and error for remote control key using user profile information based on force input |
CN108810592A (en) * | 2017-04-28 | 2018-11-13 | 数码士有限公司 | The remote controler and its driving method of strength input are provided in media system |
CN108984140B (en) * | 2018-06-28 | 2021-01-15 | 联想(北京)有限公司 | Display control method and system |
KR20200055983A (en) * | 2018-11-14 | 2020-05-22 | 삼성전자주식회사 | Method for estimating electromagnatic signal radiated from device and an electronic device thereof |
US11625155B2 (en) * | 2020-03-23 | 2023-04-11 | Ricoh Company, Ltd. | Information processing system, user terminal, method of processing information |
CN111541922B (en) * | 2020-04-15 | 2022-12-13 | 北京小米移动软件有限公司 | Method, device and storage medium for displaying interface input information |
CN114647390B (en) * | 2020-12-21 | 2024-03-26 | 华为技术有限公司 | Enhanced screen sharing method and system and electronic equipment |
US11606456B1 (en) | 2021-10-19 | 2023-03-14 | Motorola Mobility Llc | Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement |
US11907495B2 (en) * | 2021-10-19 | 2024-02-20 | Motorola Mobility Llc | Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement |
US11503358B1 (en) | 2021-10-19 | 2022-11-15 | Motorola Mobility Llc | Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060132455A1 (en) * | 2004-12-21 | 2006-06-22 | Microsoft Corporation | Pressure based selection |
US20080122796A1 (en) * | 2006-09-06 | 2008-05-29 | Jobs Steven P | Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics |
US20110057896A1 (en) * | 2009-09-04 | 2011-03-10 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling mobile terminal |
US20110279384A1 (en) * | 2010-05-14 | 2011-11-17 | Google Inc. | Automatic Derivation of Analogous Touch Gestures From A User-Defined Gesture |
US20120038678A1 (en) * | 2010-08-13 | 2012-02-16 | Lg Electronics Inc. | Mobile terminal, display device and controlling method thereof |
US20120056825A1 (en) * | 2010-03-16 | 2012-03-08 | Immersion Corporation | Systems And Methods For Pre-Touch And True Touch |
US20120141342A1 (en) * | 2010-12-06 | 2012-06-07 | Pablo Alurralde | Recovery of li values from sodium saturate brine |
US20120194440A1 (en) * | 2011-01-31 | 2012-08-02 | Research In Motion Limited | Electronic device and method of controlling same |
US20130335333A1 (en) * | 2010-03-05 | 2013-12-19 | Adobe Systems Incorporated | Editing content using multiple touch inputs |
Family Cites Families (102)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4903012A (en) * | 1987-01-20 | 1990-02-20 | Alps Electric Co., Ltd. | Coordinate system input device providing registration calibration and a mouse function |
US4814552A (en) * | 1987-12-02 | 1989-03-21 | Xerox Corporation | Ultrasound position input device |
WO1992008206A1 (en) * | 1990-11-01 | 1992-05-14 | Gazelle Graphic Systems Inc. | Electromagnetic position transducer having active transmitting stylus |
US5120908A (en) * | 1990-11-01 | 1992-06-09 | Gazelle Graphic Systems Inc. | Electromagnetic position transducer |
JP3510318B2 (en) * | 1994-04-28 | 2004-03-29 | 株式会社ワコム | Angle information input device |
US6321158B1 (en) * | 1994-06-24 | 2001-11-20 | Delorme Publishing Company | Integrated routing/mapping information |
US5867146A (en) * | 1996-01-17 | 1999-02-02 | Lg Electronics Inc. | Three dimensional wireless pointing device |
US6008807A (en) * | 1997-07-14 | 1999-12-28 | Microsoft Corporation | Method and system for controlling the display of objects in a slide show presentation |
US6429846B2 (en) | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
DE60023575T2 (en) * | 1999-02-26 | 2006-07-13 | Canon K.K. | Image display control system and method |
US6411283B1 (en) | 1999-05-20 | 2002-06-25 | Micron Technology, Inc. | Computer touch screen adapted to facilitate selection of features at edge of screen |
US6492979B1 (en) * | 1999-09-07 | 2002-12-10 | Elo Touchsystems, Inc. | Dual sensor touchscreen utilizing projective-capacitive and force touch sensors |
US6424338B1 (en) * | 1999-09-30 | 2002-07-23 | Gateway, Inc. | Speed zone touchpad |
US7565680B1 (en) * | 2000-06-30 | 2009-07-21 | Comcast Ip Holdings I, Llc | Advanced set top terminal having a video call feature |
US20020171689A1 (en) | 2001-05-15 | 2002-11-21 | International Business Machines Corporation | Method and system for providing a pre-selection indicator for a graphical user interface (GUI) widget |
TW508526B (en) * | 2001-06-15 | 2002-11-01 | Compal Electronics Inc | Personal digital assistant with a power-saving external image output port |
KR100474724B1 (en) * | 2001-08-04 | 2005-03-08 | 삼성전자주식회사 | Apparatus having touch screen and external display device using method therefor |
US7796122B2 (en) * | 2001-12-29 | 2010-09-14 | Taiguen Technology (Shen—Zhen) Co., Ltd. | Touch control display screen with a built-in electromagnet induction layer of septum array grids |
US7109975B2 (en) * | 2002-01-29 | 2006-09-19 | Meta4Hand Inc. | Computer pointer control |
JP3925297B2 (en) | 2002-05-13 | 2007-06-06 | ソニー株式会社 | Video display system and video display control device |
US20030231168A1 (en) | 2002-06-18 | 2003-12-18 | Jory Bell | Component for use as a portable computing device and pointing device in a modular computing system |
KR100480823B1 (en) * | 2002-11-14 | 2005-04-07 | 엘지.필립스 엘시디 주식회사 | touch panel for display device |
US20050071761A1 (en) | 2003-09-25 | 2005-03-31 | Nokia Corporation | User interface on a portable electronic device |
US8164573B2 (en) * | 2003-11-26 | 2012-04-24 | Immersion Corporation | Systems and methods for adaptive interpretation of input from a touch-sensitive input device |
KR100539904B1 (en) | 2004-02-27 | 2005-12-28 | 삼성전자주식회사 | Pointing device in terminal having touch screen and method for using it |
EP1596538A1 (en) * | 2004-05-10 | 2005-11-16 | Sony Ericsson Mobile Communications AB | Method and device for bluetooth pairing |
US20050256923A1 (en) | 2004-05-14 | 2005-11-17 | Citrix Systems, Inc. | Methods and apparatus for displaying application output on devices having constrained system resources |
US7245502B2 (en) * | 2004-06-07 | 2007-07-17 | Broadcom Corporation | Small form factor USB bluetooth dongle |
DE602005023897D1 (en) | 2004-08-02 | 2010-11-11 | Koninkl Philips Electronics Nv | TOUCH SCREEN WITH PRESSURE-RELATED VISUAL RETURN |
US20060103871A1 (en) * | 2004-11-16 | 2006-05-18 | Erwin Weinans | Methods, apparatus and computer program products supporting display generation in peripheral devices for communications terminals |
US20060135865A1 (en) * | 2004-11-23 | 2006-06-22 | General Electric Company | Method and apparatus for synching of images using regions of interest mapped by a user |
US7629966B2 (en) * | 2004-12-21 | 2009-12-08 | Microsoft Corporation | Hard tap |
US8717301B2 (en) | 2005-08-01 | 2014-05-06 | Sony Corporation | Information processing apparatus and method, and program |
US7835505B2 (en) * | 2005-05-13 | 2010-11-16 | Microsoft Corporation | Phone-to-monitor connection device |
EP1724955A3 (en) * | 2005-05-17 | 2007-01-03 | Samsung Electronics Co.,Ltd. | Method for taking a telephone call while receiving a broadcast service, and digital multimedia broadcasting terminal using this method |
US20080115073A1 (en) * | 2005-05-26 | 2008-05-15 | ERICKSON Shawn | Method and Apparatus for Remote Display of Drawn Content |
US8495514B1 (en) * | 2005-06-02 | 2013-07-23 | Oracle America, Inc. | Transparency assisted window focus and selection |
US8018440B2 (en) | 2005-12-30 | 2011-09-13 | Microsoft Corporation | Unintentional touch rejection |
US20090024721A1 (en) | 2006-02-27 | 2009-01-22 | Kyocera Corporation | Image Information Sharing System |
KR101128803B1 (en) | 2006-05-03 | 2012-03-23 | 엘지전자 주식회사 | A mobile communication terminal, and method of processing input signal in a mobile communication terminal with touch panel |
US9063647B2 (en) | 2006-05-12 | 2015-06-23 | Microsoft Technology Licensing, Llc | Multi-touch uses, gestures, and implementation |
KR100816286B1 (en) * | 2006-05-18 | 2008-03-24 | 삼성전자주식회사 | Display apparatus and support method using the portable terminal and the external device |
US20080079757A1 (en) * | 2006-09-29 | 2008-04-03 | Hochmuth Roland M | Display resolution matching or scaling for remotely coupled systems |
US7890863B2 (en) * | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
US8330773B2 (en) * | 2006-11-21 | 2012-12-11 | Microsoft Corporation | Mobile data and handwriting screen capture and forwarding |
KR100881186B1 (en) * | 2007-01-04 | 2009-02-05 | 삼성전자주식회사 | Touch screen display device |
US8665225B2 (en) * | 2007-01-07 | 2014-03-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture |
US20080273015A1 (en) * | 2007-05-02 | 2008-11-06 | GIGA BYTE Communications, Inc. | Dual function touch screen module for portable device and opeating method therefor |
US20080278454A1 (en) * | 2007-05-08 | 2008-11-13 | Samsung Electronics Co. Ltd. | Method for setting touch sensitivity in portable terminal |
JP2009100246A (en) * | 2007-10-17 | 2009-05-07 | Hitachi Ltd | Display device |
JP4533421B2 (en) | 2007-11-22 | 2010-09-01 | シャープ株式会社 | Display device |
US8184096B2 (en) | 2007-12-04 | 2012-05-22 | Apple Inc. | Cursor transitions |
US9857872B2 (en) * | 2007-12-31 | 2018-01-02 | Apple Inc. | Multi-touch display screen with localized tactile feedback |
US20090225043A1 (en) * | 2008-03-05 | 2009-09-10 | Plantronics, Inc. | Touch Feedback With Hover |
KR101513023B1 (en) | 2008-03-25 | 2015-04-22 | 엘지전자 주식회사 | Terminal and method of displaying information therein |
US8525802B2 (en) * | 2008-03-31 | 2013-09-03 | Lg Electronics Inc. | Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same |
US20090251422A1 (en) * | 2008-04-08 | 2009-10-08 | Honeywell International Inc. | Method and system for enhancing interaction of a virtual keyboard provided through a small touch screen |
KR101513025B1 (en) | 2008-05-08 | 2015-04-17 | 엘지전자 주식회사 | Mobile terminal and method of notifying information therein |
CN102027450B (en) * | 2008-05-20 | 2015-05-13 | 思杰系统有限公司 | Methods and systems for using external display devices with a mobile computing device |
US8594740B2 (en) * | 2008-06-11 | 2013-11-26 | Pantech Co., Ltd. | Mobile communication terminal and data input method |
US9030418B2 (en) * | 2008-06-24 | 2015-05-12 | Lg Electronics Inc. | Mobile terminal capable of sensing proximity touch |
US8446372B2 (en) * | 2008-07-09 | 2013-05-21 | Lenovo (Singapore) Pte. Ltd. | Apparatus, system, and method for automated touchpad adjustments |
KR101436608B1 (en) * | 2008-07-28 | 2014-09-01 | 삼성전자 주식회사 | Mobile terminal having touch screen and method for displaying cursor thereof |
KR101474452B1 (en) | 2008-08-04 | 2014-12-19 | 엘지전자 주식회사 | Method for controlling touch input of mobile terminal |
KR20100023981A (en) | 2008-08-23 | 2010-03-05 | 정윤경 | Credit card having contents and drviving method thereof |
KR101256008B1 (en) | 2008-09-05 | 2013-04-18 | 에스케이플래닛 주식회사 | Apparatus and method for providing virtual PC using broadcasting receiver and mobile terminal |
US8750938B2 (en) * | 2008-09-29 | 2014-06-10 | Microsoft Corporation | Glow touch feedback for virtual input devices |
KR20100064840A (en) | 2008-12-05 | 2010-06-15 | 엘지전자 주식회사 | Mobile terminal and operation method thereof |
KR101030398B1 (en) * | 2008-12-15 | 2011-04-20 | 삼성전자주식회사 | Touch sensitivity changeable touch sensor depending on user interface mode of terminal and the providing method thereof |
EP2389622A1 (en) * | 2009-01-26 | 2011-11-30 | Zrro Technologies (2009) Ltd. | Device and method for monitoring an object's behavior |
KR20100095951A (en) | 2009-02-23 | 2010-09-01 | 주식회사 팬택 | Portable electronic equipment and control method thereof |
US8339372B2 (en) * | 2009-04-20 | 2012-12-25 | Broadcom Corporation | Inductive touch screen with integrated antenna for use in a communication device and methods for use therewith |
KR101491169B1 (en) | 2009-05-07 | 2015-02-06 | 현대자동차일본기술연구소 | Device and method for controlling AVN of vehicle |
KR101533691B1 (en) * | 2009-06-01 | 2015-07-03 | 삼성전자 주식회사 | System for connecting a device with bluetooth module and method thereof |
US8378798B2 (en) * | 2009-07-24 | 2013-02-19 | Research In Motion Limited | Method and apparatus for a touch-sensitive display |
US20130009907A1 (en) * | 2009-07-31 | 2013-01-10 | Rosenberg Ilya D | Magnetic Stylus |
KR20110015745A (en) * | 2009-08-10 | 2011-02-17 | 삼성전자주식회사 | Appratus and method for controlling sensitivity of touch in a portable terminal |
KR20110027117A (en) | 2009-09-09 | 2011-03-16 | 삼성전자주식회사 | Electronic apparatus with touch panel and displaying method thereof |
KR101179466B1 (en) | 2009-09-22 | 2012-09-07 | 에스케이플래닛 주식회사 | Mobile terminal and method for displaying object using approach sensing of touch tool thereof |
KR101651128B1 (en) | 2009-10-05 | 2016-08-25 | 엘지전자 주식회사 | Mobile terminal and method for controlling application execution thereof |
KR101500008B1 (en) | 2009-11-25 | 2015-03-18 | 현대자동차주식회사 | System for connecting car handset unit with external device |
KR20110067559A (en) | 2009-12-14 | 2011-06-22 | 삼성전자주식회사 | Display device and control method thereof, display system and control method thereof |
KR101210298B1 (en) | 2009-12-18 | 2012-12-10 | 에스케이플래닛 주식회사 | User interface method for using touch input and terminal |
US20120287090A1 (en) * | 2009-12-29 | 2012-11-15 | Sanford, L.P. | Interactive Whiteboard with Wireless Remote Control |
US10048725B2 (en) * | 2010-01-26 | 2018-08-14 | Apple Inc. | Video out interface for electronic device |
KR20110100121A (en) | 2010-03-03 | 2011-09-09 | 삼성전자주식회사 | Method and apparatus for inputting character in mobile terminal |
US8384683B2 (en) * | 2010-04-23 | 2013-02-26 | Tong Luo | Method for user input from the back panel of a handheld computerized device |
KR20110119464A (en) | 2010-04-27 | 2011-11-02 | 터치유아이(주) | Touch screen device and methods of operating terminal using the same |
CN102860034B (en) * | 2010-04-28 | 2016-05-18 | Lg电子株式会社 | The method of image display and operation image display |
KR101099838B1 (en) | 2010-06-02 | 2011-12-27 | 신두일 | Remote a/s method using video phone call between computer and mobile phone |
US8836643B2 (en) * | 2010-06-10 | 2014-09-16 | Qualcomm Incorporated | Auto-morphing adaptive user interface device and methods |
EP2605611B1 (en) * | 2010-08-13 | 2020-12-23 | LG Electronics Inc. | Mobile/portable terminal device for displaying and method for controlling same |
JP5510185B2 (en) | 2010-08-20 | 2014-06-04 | ソニー株式会社 | Information processing apparatus, program, and display control method |
US8358596B2 (en) * | 2010-09-20 | 2013-01-22 | Research In Motion Limited | Communications system providing mobile wireless communications device application module associations for respective wireless communications formats and related methods |
US9851849B2 (en) * | 2010-12-03 | 2017-12-26 | Apple Inc. | Touch device communication |
US8369893B2 (en) * | 2010-12-31 | 2013-02-05 | Motorola Mobility Llc | Method and system for adapting mobile device to accommodate external display |
US8963858B2 (en) * | 2011-02-28 | 2015-02-24 | Semtech Corporation | Use of resistive touch screen as a proximity sensor |
US9152373B2 (en) * | 2011-04-12 | 2015-10-06 | Apple Inc. | Gesture visualization and sharing between electronic devices and remote displays |
CA2838280C (en) | 2011-06-15 | 2017-10-10 | Smart Technologies Ulc | Interactive surface with user proximity detection |
KR101286358B1 (en) * | 2011-08-11 | 2013-07-15 | 엘지전자 주식회사 | Display method and apparatus |
US8966131B2 (en) * | 2012-01-06 | 2015-02-24 | Qualcomm Incorporated | System method for bi-directional tunneling via user input back channel (UIBC) for wireless displays |
US20130234959A1 (en) * | 2012-03-06 | 2013-09-12 | Industry-University Cooperation Foundation Hanyang University | System and method for linking and controlling terminals |
2013
- 2013-03-05 US US13/785,370 patent/US20130234959A1/en not_active Abandoned
- 2013-03-05 US US13/785,498 patent/US8913026B2/en not_active Expired - Fee Related
- 2013-03-05 US US13/785,302 patent/US20130241854A1/en not_active Abandoned
- 2013-03-05 US US13/785,600 patent/US20130234984A1/en not_active Abandoned

2014
- 2014-01-14 US US14/155,204 patent/US20140125622A1/en not_active Abandoned

2016
- 2016-02-24 US US15/052,803 patent/US20160170703A1/en not_active Abandoned

2018
- 2018-09-26 US US16/142,998 patent/US10656895B2/en active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150212621A1 (en) * | 2014-01-24 | 2015-07-30 | Industrial Technology Research Institute | Touch panel controlling method and system |
US9766849B2 (en) | 2014-11-03 | 2017-09-19 | Samsung Electronics Co., Ltd. | User terminal device and method for control thereof and system for providing contents |
US10353656B2 (en) | 2014-11-03 | 2019-07-16 | Samsung Electronics Co., Ltd. | User terminal device and method for control thereof and system for providing contents |
US9335862B1 (en) * | 2014-11-14 | 2016-05-10 | International Business Machines Corporation | Virtual multi-device navigation in surface computing system |
US9830030B2 (en) | 2015-05-07 | 2017-11-28 | Industrial Technology Research Institute | Flexible touch panel, touch control device and operating method using the same |
CN105898462A (en) * | 2015-12-11 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | Method and device capable of operating video display device |
Also Published As
Publication number | Publication date |
---|---|
US20190026059A1 (en) | 2019-01-24 |
US10656895B2 (en) | 2020-05-19 |
US20130234984A1 (en) | 2013-09-12 |
US20130234959A1 (en) | 2013-09-12 |
US20130234983A1 (en) | 2013-09-12 |
US20160170703A1 (en) | 2016-06-16 |
US8913026B2 (en) | 2014-12-16 |
US20140125622A1 (en) | 2014-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20130241854A1 (en) | Image sharing system and user terminal for the system | |
US10642449B2 (en) | Identifying applications on which content is available | |
US10120454B2 (en) | Gesture recognition control device | |
US9621434B2 (en) | Display apparatus, remote control apparatus, and method for providing user interface using the same | |
KR102284108B1 (en) | Electronic apparatus and method for screen sharing with external display apparatus | |
US10228810B2 (en) | Method of transmitting inquiry message, display device for the method, method of sharing information, and mobile terminal | |
KR20170036786A (en) | Mobile device input controller for secondary display | |
US20200142495A1 (en) | Gesture recognition control device | |
KR20160139481A (en) | User terminal apparatus and control method thereof | |
US20130244730A1 (en) | User terminal capable of sharing image and method for controlling the same | |
KR20150134674A (en) | User terminal device, and Method for controlling for User terminal device, and multimedia system thereof | |
KR101872272B1 (en) | Method and apparatus for controlling of electronic device using a control device | |
KR101384493B1 (en) | System for interworking and controlling devices and user device used in the same | |
US9691270B1 (en) | Automatically configuring a remote control for a device | |
WO2022017421A1 (en) | Interaction method, display device, emission device, interaction system, and storage medium | |
KR101515912B1 (en) | User Terminal Capable of Sharing Image and Method for Controlling the Same | |
KR20160050664A (en) | Electronic device having tactile sensor, method for operating thereof and system | |
KR101151549B1 (en) | System for interworking and controlling devices and user device used in the same | |
JP7263576B1 (en) | Program, method and information processing device | |
KR101951484B1 (en) | System for interworking and controlling devices and reception device used in the same | |
KR20130129693A (en) | System for interworking and controlling devices and user device used in the same | |
CN117707397A (en) | Control method and device | |
KR20130129684A (en) | System for interworking and controlling devices and user device used in the same | |
KR20190135958A (en) | User interface controlling device and method for selecting object in image and image input device | |
KR20130101886A (en) | System for interworking and controlling devices and reception device used in the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: YOO, CHANG-SIK; REEL/FRAME: 030505/0980; Effective date: 20130508 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |