US20120054838A1 - Mobile terminal and information security setting method thereof - Google Patents
- Publication number
- US20120054838A1 (Application No. US 13/217,212)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/30—Security of mobile devices; Security of mobile applications
-
- G06T5/73—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/10—Network architectures or network communication protocols for network security for controlling access to devices or network resources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00204—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a digital computer or a digital computer system, e.g. an internet server
- H04N1/00209—Transmitting or receiving image data, e.g. facsimile data, via a computer, e.g. using e-mail, a computer network, the internet, I-fax
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/08—Access security
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/60—Context-dependent security
- H04W12/67—Risk-dependent, e.g. selecting a security level depending on risk profiles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/80—Arrangements enabling lawful interception [LI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2463/00—Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00
- H04L2463/101—Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00 applying security measures for digital rights management
Definitions
- the present invention relates to information security of a mobile terminal and, more specifically, to a mobile terminal capable of setting a security level for information uploaded to a Social Network Service (SNS) site, and to a security setting method thereof.
- a mobile terminal is a device which may be configured to perform various functions. Examples of such functions include data and voice communications, capturing images and video via a camera, recording audio, outputting music via a speaker system, and displaying images and video on a display. Some terminals include additional functionality which supports game playing, while other terminals are also configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals which permit viewing of contents, such as videos and television programs.
- a Social Network Service is a community type web-based service for communication and information sharing with others.
- SNS sites include sites such as Twitter™, Facebook™, and Google+™.
- An SNS allows a user to share information with other users and allows the user to view information regarding other users. However, the user is vulnerable to security risks in that the user's private information may be accessed without the user's permission.
- an information security method for a mobile terminal includes setting security information for a content associated with a first user, uploading the content to a Social Network Service (SNS) site, and uploading the security information to the SNS site to permit the SNS site to register the security information in order to display the content according to the security information when the content is accessed by a second user via the SNS site.
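The method above — setting security information for a content item and uploading both the content and that information to the SNS site — can be sketched as follows. The metadata fields and helper names are illustrative assumptions; the claims do not prescribe a particular schema or transport.

```python
import json

def build_security_info(owner_id, allowed_users, effect="blur", scope="entire"):
    """Assemble illustrative security metadata for a content item.

    The field names here are hypothetical; the patent only requires that
    security information accompany the content to the SNS site.
    """
    return {
        "owner": owner_id,
        "allowed_users": list(allowed_users),  # second users granted access
        "effect": effect,                      # applied when access is denied
        "scope": scope,                        # "entire" or a specific portion
    }

def upload_with_security(content_bytes, security_info):
    """Simulate uploading content plus its security information.

    A real terminal would transmit this via the wireless communication
    unit; here we simply return the payload the SNS site would register.
    """
    return {
        "content": content_bytes,
        "security": json.dumps(security_info),
    }
```

The site registers the security information alongside the content, so it can later display the content according to that information when a second user accesses it.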
- the content comprises a photo, a video, or text. Additionally, the information security method further includes setting content access rights for the second user in the security information.
- the information security method further includes setting content access rights according to at least one of device information, identification (ID) information, internet protocol (IP) information, or position information for a device in the security information.
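A hypothetical check of those four criteria — device, ID, IP, and position information — against allow-lists carried in the security information might look like the following; all field names are assumptions for illustration, and a criterion that is absent from the security information is treated as unrestricted.

```python
def access_allowed(security_info, viewer):
    """Return True if a second user's device may view the content in the clear.

    Checks the four criteria named in the claims (device, ID, IP, and
    position information) against hypothetical allow-lists; a missing
    allow-list means that criterion imposes no restriction.
    """
    checks = [
        ("devices", viewer.get("device")),
        ("ids", viewer.get("id")),
        ("ips", viewer.get("ip")),
        ("positions", viewer.get("position")),
    ]
    for key, value in checks:
        allowed = security_info.get(key)
        if allowed is not None and value not in allowed:
            return False
    return True
```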
- the information security method further includes setting the security information for an entirety of the content or a specific portion of the content.
- the information security method further includes setting the security information as a site-wide rule for an SNS site or for a specific group within an SNS site.
- the information security method further includes setting the security information to pixelate, blur, blacken, or replace the content when accessed by the second user.
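As one example of such an effect, pixelation can be sketched by averaging each tile of an image so that detail is lost for viewers without access rights. The grayscale-grid representation and block size below are illustrative, not part of the claims.

```python
def pixelate(image, block=2):
    """Pixelate a grayscale image (a list of rows of ints, 0-255) by
    replacing each block x block tile with its average value — one way
    an SNS site might obscure content for an unauthorized second user.
    Returns a new image; the input is left unmodified."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Collect the tile, clipping at the image edges.
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out
```

A production implementation would operate on real image data (e.g., via an imaging library), but the averaging idea is the same.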
- an information security setting method for a mobile terminal includes setting security information associated with a first user for a Social Network Service (SNS) site, and uploading the security information to the SNS site to permit the SNS site to apply the security information to content associated with the first user in order to display the content according to the security information when the content is accessed by a second user via the SNS site.
- a mobile terminal includes a controller configured to set security information for content associated with a first user, and a wireless communication unit configured to upload the content and the security information to a Social Network Service (SNS) site to permit the SNS site to display the content according to the security information when a second user accesses the uploaded content via the SNS site.
- a mobile terminal includes a controller configured to set security information for a first user, and a wireless communication unit configured to upload the security information to a Social Network Service (SNS) site to permit the SNS site to apply the security information to content associated with the first user in order to display the content according to the security information when a second user accesses the content associated with the first user via the SNS site.
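On the SNS side, applying the registered security information when a second user requests the content might be sketched as follows. The handler and the placeholder effect function are assumptions, since the patent leaves the site-side implementation open.

```python
def serve_content(content, security_info, viewer_id):
    """Hypothetical SNS-side handler: return the content as-is to
    permitted second users; otherwise return it with the configured
    effect applied, together with the effect name."""
    if viewer_id in security_info.get("allowed_users", []):
        return content, None
    effect = security_info.get("effect", "blacken")
    return apply_effect(content, effect), effect

def apply_effect(content, effect):
    """Placeholder for pixelating, blurring, blackening, or replacing
    the content; a real site would transform the actual pixels."""
    if effect == "replace":
        return "[replacement content]"
    return f"[{effect}] {content}"
```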
- FIG. 1 illustrates a block diagram of a mobile terminal according to an embodiment of the present invention.
- FIG. 2 illustrates a block diagram of a mobile communication system operating with the mobile terminal according to an embodiment of the present invention.
- FIGS. 3A and 3B illustrate a menu for setting a security level for content according to an embodiment of the present invention.
- FIG. 4 illustrates a screen displayed after setting a security level according to an embodiment of the present invention.
- FIG. 5 illustrates an example of applying a security level according to an embodiment of the present invention.
- FIGS. 6A to 6C illustrate examples of a security level menu according to an embodiment of the present invention.
- FIG. 7 illustrates an example for applying a security level according to an embodiment of the present invention.
- FIGS. 8A and 8B illustrate examples of applying a security level according to an embodiment of the present invention.
- FIGS. 9A and 9B illustrate examples of applying a security level according to an embodiment of the present invention.
- FIG. 10 illustrates an example of applying a security level according to an embodiment of the present invention.
- FIG. 11 is a flowchart illustrating a method for setting a security level on a mobile terminal according to an embodiment of the present invention.
- FIG. 12 is a flowchart illustrating a method for setting a security level on a mobile terminal according to an embodiment of the present invention.
- the suffixes “module,” “unit,” and “part” are used for elements in order to facilitate the disclosure only. Therefore, significant meanings or roles are not given to the suffixes themselves and it is understood that the “module,” “unit,” and “part” can be used together or interchangeably.
- Mobile terminals described in this disclosure can include a mobile phone, a smart phone, a laptop computer, a digital broadcast terminal, a PDA (personal digital assistant), a PMP (portable multimedia player), and a navigation system.
- FIG. 1 is a block diagram of a mobile terminal 100 according to one embodiment of the present invention.
- the mobile terminal 100 includes a wireless communication unit 110 , an A/V (audio/video) input unit 120 , a user input unit 130 , a sensing unit 140 , an output unit 150 , a memory 160 , an interface unit 170 , a controller 180 , and a power supply unit 190 .
- FIG. 1 shows the mobile terminal 100 having various components, but it is understood that implementing all of the illustrated components is not a requirement. More or fewer components may be implemented according to various embodiments.
- the wireless communication unit 110 typically includes one or more components which permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal 100 is located.
- the wireless communication unit 110 can include a broadcast receiving module 111 , a mobile communication module 112 , a wireless Internet module 113 , a short-range communication module 114 , and a position-location module 115 .
- the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing server via a broadcast channel.
- the broadcast channel may include a satellite channel and/or a terrestrial channel.
- the broadcast managing server is generally a server which generates and transmits a broadcast signal and/or broadcast associated information or a server which is provided with a previously generated broadcast signal and/or broadcast associated information and then transmits the provided signal or information to a terminal.
- the broadcast signal and/or broadcast associated information received by the broadcast receiving module 111 may be stored in a suitable device, such as a memory 160 .
- the broadcast signal may be implemented as a TV broadcast signal, a radio broadcast signal, and/or a data broadcast signal, among other signals. If desired, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
- the broadcast associated information includes information associated with a broadcast channel, a broadcast program, or a broadcast service provider. Furthermore, the broadcast associated information can be provided via a mobile communication network. In this case, the broadcast associated information can be received by the mobile communication module 112 .
- broadcast associated information can be implemented in various forms.
- broadcast associated information may include an electronic program guide (EPG) of a digital multimedia broadcasting (DMB) system and an electronic service guide (ESG) of a digital video broadcast-handheld (DVB-H).
- the broadcast receiving module 111 may be configured to receive broadcast signals transmitted from various types of broadcast systems.
- broadcasting systems may include a digital multimedia broadcasting-terrestrial (DMB-T) system, a digital multimedia broadcasting-satellite (DMB-S) system, DVB-H, the data broadcasting system known as media forward link only (MediaFLO™), and an integrated services digital broadcast-terrestrial (ISDB-T) system.
- the broadcast receiving module 111 can be configured to be suitable for other broadcasting systems as well as the above-noted digital broadcasting systems.
- the mobile communication module 112 transmits/receives wireless signals to/from one or more network entities (e.g., a base station, an external terminal, and/or a server). Such wireless signals may carry audio, video, and data according to text/multimedia messages.
- the wireless Internet module 113 supports Internet access for the mobile terminal 100 .
- This module may be internally or externally coupled to the mobile terminal 100 .
- the wireless Internet technology can include WLAN (Wireless LAN), Wi-Fi, Wibro™ (Wireless broadband), Wimax™ (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access).
- the short-range communication module 114 facilitates relatively short-range communications. Suitable technologies for implementing this module include radio frequency identification (RFID), infrared data association (IrDA), and ultra-wideband (UWB), as well as the networking technologies commonly referred to as Bluetooth™ and ZigBee™, to name a few.
- the position-location module 115 identifies or otherwise obtains the location of the mobile terminal 100 .
- this module may be implemented with a global positioning system (GPS) module.
- the audio/video (A/V) input unit 120 is configured to provide audio or video signal input to the mobile terminal 100 .
- the A/V input unit 120 includes a camera 121 and a microphone 122 .
- the camera 121 receives and processes (or produces) image frames of still pictures or video, which are obtained by an image sensor in a video call mode or a photographing mode. Furthermore, the processed image frames can be displayed on the display unit 151 .
- the image frames processed by the camera 121 can be stored in the memory 160 or can be transmitted to an external recipient via the wireless communication unit 110 .
- at least two cameras 121 can be provided in the mobile terminal 100 according to the environment of usage.
- the microphone 122 receives an external audio signal while the portable device is in a particular mode, such as phone call mode, recording mode and voice recognition mode. This audio signal is processed and converted into electronic audio data. The processed audio data is transformed into a format transmittable to a mobile communication base station via the mobile communication module 112 in a call mode.
- the microphone 122 typically includes assorted noise removing algorithms to remove noise generated in the course of receiving the external audio signal.
- the user input unit 130 generates input data responsive to user manipulation of an associated input device or devices.
- Examples of such devices include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel, and a jog switch.
- the sensing unit 140 provides sensing signals for controlling operations of the mobile terminal 100 using status measurements of various aspects of the mobile terminal. For instance, the sensing unit 140 may detect an open/closed status of the mobile terminal 100 , the relative positioning of components (e.g., a display and keypad) of the mobile terminal 100 , a change of position (or location) of the mobile terminal 100 or a component of the mobile terminal 100 , a presence or absence of user contact with the mobile terminal 100 , and an orientation or acceleration/deceleration of the mobile terminal 100 .
- a mobile terminal 100 configured as a slide-type mobile terminal is considered.
- the sensing unit 140 may sense whether a sliding portion of the mobile terminal is open or closed.
- the sensing unit 140 senses the presence or absence of power provided by the power supply 190 , and the presence or absence of a coupling or other connection between the interface unit 170 and an external device.
- the sensing unit 140 can include a proximity sensor 141 and a motion sensor 142 .
- the motion sensor 142 detects a body motion of the mobile terminal 100 .
- the motion sensor 142 outputs a signal corresponding to the detected body motion to the controller 180 .
- the output unit 150 generates output relevant to the senses of sight, hearing, and touch. Furthermore, the output unit 150 includes the display unit 151 , an audio output module 152 , an alarm unit 153 , a haptic module 154 , and a projector module 155 .
- the display unit 151 is typically implemented to visually display (output) information associated with the mobile terminal 100 .
- the display will generally provide a user interface (UI) or graphical user interface (GUI) which includes information associated with placing, conducting, and terminating a phone call.
- the display unit 151 may additionally or alternatively display images which are associated with these modes, the UI or the GUI.
- the display module 151 may be implemented using known display technologies. These technologies include, for example, a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode display (OLED), a flexible display and a three-dimensional display.
- the mobile terminal 100 may include one or more of such displays.
- Some of the displays can be implemented as a transparent or optically transmissive type, i.e., a transparent display.
- a representative example of the transparent display is a TOLED (transparent OLED).
- a rear configuration of the display unit 151 can be implemented as the optically transmissive type as well. In this configuration, a user may be able to see an object located at the rear of a terminal body on a portion of the display unit 151 of the terminal body.
- At least two display units 151 can be provided in the mobile terminal 100 in accordance with one embodiment of the mobile terminal 100 .
- a plurality of display units can be arranged to be spaced apart from each other or to form a single body on a single face of the mobile terminal 100 .
- a plurality of display units can be arranged on different faces of the mobile terminal 100 .
- When the display unit 151 and a sensor for detecting a touch action (hereinafter called a ‘touch sensor’) are configured as a mutual layer structure (hereinafter called a ‘touchscreen’), the display unit 151 is usable as an input device as well as an output device.
- the touch sensor can be configured as a touch film, a touch sheet, or a touchpad.
- the touch sensor can be configured to convert pressure applied to a specific portion of the display unit 151 or a variation of capacitance generated from a specific portion of the display unit 151 to an electronic input signal. Moreover, the touch sensor is configurable to detect pressure of a touch as well as a touched position or size.
- a touch input is made to the touch sensor, a signal(s) corresponding to the touch input is transferred to a touch controller.
- the touch controller processes the signal(s) and then transfers the processed signal(s) to the controller 180 . Therefore, the controller 180 is made aware when a prescribed portion of the display unit 151 is touched.
- a proximity sensor 141 can be provided at an internal area of the mobile terminal 100 enclosed by the touchscreen or around the touchscreen.
- the proximity sensor is a sensor that detects a presence or non-presence of an object approaching a prescribed detecting surface or an object existing (or located) around the proximity sensor using an electromagnetic field strength or infrared ray without mechanical contact.
- the proximity sensor 141 is more durable than a contact type sensor and also has broader utility than a contact type sensor.
- the proximity sensor 141 can include one of a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a radio frequency oscillation proximity sensor, an electrostatic capacity proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. If the touchscreen includes the electrostatic capacity proximity sensor, it is configured to detect the proximity of a pointer using a variation of an electric field according to the proximity of the pointer. In this configuration, the touchscreen (touch sensor) can be considered as the proximity sensor 141 .
- A ‘proximity touch’ is an action in which a pointer approaches the touchscreen without contacting the touchscreen; a ‘contact touch’ is an action in which a pointer actually touches the touchscreen.
- the position on the touchscreen that is proximity-touched by the pointer refers to the position of the pointer which vertically opposes the touchscreen when the pointer performs the proximity touch.
- the proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch duration, a proximity touch position, a proximity touch shift state).
- Information corresponding to the detected proximity touch action and the detected proximity touch pattern can be output to the touchscreen.
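The distinction between a proximity touch and a contact touch can be illustrated with a minimal classifier on pointer distance; the 10 mm hover threshold is an assumed value for illustration, not taken from the disclosure.

```python
def classify_touch(distance_mm):
    """Classify a pointer event as the proximity-capable touchscreen
    distinguishes them: zero distance is a contact touch, hovering
    within an assumed threshold is a proximity touch, and anything
    farther away is not detected as a touch at all."""
    if distance_mm <= 0:
        return "contact touch"
    if distance_mm <= 10:  # assumed detection range of the proximity sensor
        return "proximity touch"
    return "none"
```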
- the audio output module 152 functions in various modes including a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, and a broadcast reception mode to output audio data which is received from the wireless communication unit 110 or is stored in the memory 160 . During operation, the audio output module 152 outputs audio relating to a particular function (e.g., call received, message received).
- the audio output module 152 may be implemented using one or more speakers, buzzers, other audio producing devices, and combinations thereof.
- the alarm unit 153 outputs a signal for announcing the occurrence of a particular event associated with the mobile terminal 100 .
- Typical events include a call received, a message received and a touch input received.
- the alarm unit 153 is able to output a signal for announcing the event occurrence by way of vibration as well as a video or audio signal.
- the video or audio signal can be output via the display unit 151 or the audio output unit 152 .
- the display unit 151 or the audio output module 152 can be regarded as a part of the alarm unit 153 .
- the haptic module 154 generates various tactile effects that can be sensed by a user. Vibration is a representative one of the tactile effects generated by the haptic module 154 .
- the strength and pattern of the vibration generated by the haptic module 154 are controllable. For instance, different vibrations can be output in a manner of being synthesized together or can be output in sequence.
- the haptic module 154 is able to generate various tactile effects in addition to the vibration.
- the haptic module 154 may generate an effect attributed to an arrangement of pins vertically moving against a contacted skin surface, an effect attributed to an injection/suction power of air through an injection/suction hole, an effect attributed to skimming over a skin surface, an effect attributed to a contact with an electrode, an effect attributed to an electrostatic force, and an effect attributed to the representation of a hot/cold sense using an endothermic or exothermic device.
- the haptic module 154 can be implemented to enable a user to sense the tactile effect through a muscle sense of a finger or an arm as well as to transfer the tactile effect through direct contact.
- at least two haptic modules 154 can be provided in the mobile terminal 100 in accordance with one embodiment of the mobile terminal 100 .
- the projector module 155 is an element for performing an image projector function using the mobile terminal 100 .
- the projector module 155 is able to display an image, which is identical to or at least partially different from the image displayed on the display unit 151 , on an external screen or wall according to a control signal of the controller 180 .
- the projector module 155 can include a light source generating light (e.g., a laser) for projecting an external image, an image producing means for producing an external image to project using the light generated from the light source, and a lens for enlarging the external image according to a predetermined focal distance. Furthermore, the projector module 155 can further include a device for adjusting an image projection direction by mechanically moving the lens or the whole module.
- the projector module 155 can be a CRT (cathode ray tube) module, an LCD (liquid crystal display) module, or a DLP (digital light processing) module according to a device type.
- the DLP module is operated by enabling the light generated from the light source to reflect on a DMD (digital micro-mirror device) chip and can be advantageous for reducing the size of the projector module 155 .
- the projector module 155 can be provided in a lengthwise direction of a lateral, front or backside direction of the mobile terminal 100 . Furthermore, it is understood that the projector module 155 can be provided in any portion of the mobile terminal 100 as deemed necessary.
- the memory 160 is generally used to store various types of data to support the processing, control, and storage requirements of the mobile terminal 100 .
- Examples of such data include program instructions for applications operating on the mobile terminal 100 , contact data, phonebook data, messages, audio, still pictures, and moving pictures.
- Moreover, a recent use history or a cumulative use frequency of each data item (e.g., a use frequency for each phonebook entry, each message, or each multimedia file) can be stored in the memory 160 .
- data for various patterns of vibration and/or sound output in response to a touch input to the touchscreen can be stored in the memory 160 .
- the memory 160 may be implemented using any type or combination of suitable volatile and non-volatile memory or storage devices including hard disk, random access memory (RAM), static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk, multimedia card micro type memory, card-type memory (e.g., SD memory, or XD memory), or other similar memory or data storage device. Furthermore, the mobile terminal 100 is able to operate in association with a web storage for performing a storage function of the memory 160 on the Internet.
- the interface unit 170 is often implemented to couple the mobile terminal 100 with external devices.
- the interface unit 170 receives data from the external devices or is supplied with the power and then transfers the data or power to the respective elements of the mobile terminal 100 or enables data within the mobile terminal 100 to be transferred to the external devices.
- the interface unit 170 may be configured using a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for coupling to a device having an identity module, audio input/output ports, video input/output ports, and/or an earphone port.
- the identity module is a chip for storing various kinds of information for authenticating a usage authority of the mobile terminal 100 and can include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and/or a Universal Subscriber Identity Module (USIM).
- a device having the identity module (hereinafter called ‘identity device’) can be manufactured as a smart card. Therefore, the identity device is connectable to the mobile terminal 100 via the corresponding port.
- When the mobile terminal 100 is connected to an external cradle, the interface unit 170 becomes a passage for supplying the mobile terminal 100 with power from the cradle or a passage for delivering various command signals input by a user from the cradle to the mobile terminal 100 .
- Each of the various command signals input from the cradle or the power can operate as a signal enabling the mobile terminal 100 to recognize that it is correctly loaded in the cradle.
- the controller 180 controls the overall operations of the mobile terminal 100 .
- the controller 180 performs the control and processing associated with voice calls, data communications, and video calls.
- the controller 180 may include a multimedia module 181 that provides multimedia playback.
- the multimedia module 181 may be configured as part of the controller 180 , or implemented as a separate component.
- the controller 180 is able to perform a pattern recognizing process for recognizing a writing input and a picture drawing input carried out on the touchscreen as characters or images, respectively.
- the power supply unit 190 provides power required by various components of the mobile terminal 100 .
- the power may be internal power, external power, or combinations of internal and external power.
- Various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or some combination of computer software and hardware.
- the embodiments described herein may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a selective combination thereof.
- Such embodiments may also be implemented by the controller 180 .
- the embodiments described herein may be implemented with separate software modules, such as procedures and functions, each of which perform one or more of the functions and operations described herein.
- the software codes can be implemented with a software application written in any suitable programming language and may be stored in memory such as the memory 160 , and executed by a controller or processor, such as the controller 180 .
- a CDMA wireless communication system may include a plurality of mobile terminals 100 , a plurality of base stations (BSs) 270 , base station controllers (BSCs) 275 , and a mobile switching center (MSC) 280 .
- the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290 .
- the system illustrated in FIG. 2 may include a plurality of BSCs 275 .
- the MSC 280 is also configured to interface with the BSCs 275 , which may be coupled to the base stations 270 via backhaul lines.
- the backhaul lines may be configured in accordance with any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL.
- Each BS 270 may serve one or more sectors or regions, each sector or region covered by an omni-directional antenna or an antenna pointed in a particular direction radially away from the BS 270 . Alternatively, each sector or region may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum, such as 1.25 MHz or 5 MHz.
- the intersection of a sector and frequency assignment may be referred to as a CDMA channel.
- the BS 270 may also be referred to as base station transceiver subsystems (BTSs) or other equivalent terms.
- the term “base station” may be used to collectively refer to a single BSC 275 and at least one BS 270 .
- the base station may also be referred to as a “cell site.”
- individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.
- a broadcasting transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system.
- the broadcast receiving module 111 is provided in the mobile terminal 100 to receive broadcast signals transmitted by the BT 295 .
- FIG. 2 illustrates several global positioning system (GPS) satellites 300 .
- the GPS satellites 300 help locate at least one of a plurality of mobile terminals 100 . Although several GPS satellites 300 are depicted in FIG. 2 , it is understood that useful positioning information may be obtained with any number of GPS satellites.
- the location information module 115 is typically configured to cooperate with the GPS satellites 300 to obtain desired positioning information.
- Instead of or in addition to GPS tracking techniques, other technologies to track the location of the mobile terminals 100 may be used.
- at least one of the GPS satellites 300 may selectively or additionally handle satellite DMB transmissions.
- the BSs 270 receive reverse-link signals from various mobile terminals 100 .
- the mobile terminals 100 typically engage in calls, messaging, and other types of communications.
- Each reverse-link signal received by a particular base station 270 is processed within the particular BS 270 .
- the resulting data is forwarded to an associated BSC 275 .
- the BSC 275 provides call resource allocation and mobility management functionality including the coordination of soft handoff procedures between BSs 270 .
- the BSCs 275 also route the received data to the MSC 280 , which provides additional routing services for interfacing with the PSTN 290 .
- the PSTN 290 interfaces with the MSC 280
- the MSC interfaces with the BSCs 275
- the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100 .
- Embodiments of the present invention provide a method for setting a security level on an SNS.
- content which is to be uploaded in an SNS may include information regarding a security level associated with access rights for the content.
- the access rights may be based on user information, device information, location information, or other information associated with an SNS.
- the display of the content may be censored for various users according to the security level associated with the content. For example, the entire content or portions of the content may be uniquely processed, such as being pixelated, blurred, or blacked out. Furthermore, access to the content may be entirely denied.
- the setting for how to display the content, such as pixelated, blurred, or blacked out, may be determined by each SNS site or may be set via the security level.
- tag information may also be registered in an SNS site.
- the tag information may be registered with or without separately setting a security level for content.
- the registered tag information may automatically be stored in an SNS site or a terminal belonging to another user.
- the stored tag information may function as a type of security information. For example, when a user's tag information has been registered with an SNS site, a second user may not be permitted to capture the user's face, or a particular region of the user's face may be displayed differently.
- FIGS. 3A and 3B illustrate menus for setting a security level according to an embodiment of the present invention.
- FIG. 3A illustrates an example of setting a security level for each SNS site.
- FIG. 3B illustrates an example of setting a security level for groups in an SNS site.
- the method of setting a security level for groups in an SNS site allows for a greater level of detail in comparison to a site wide security level ( FIG. 3A ).
- a security setup menu may be utilized to set a security level for an SNS site.
- the security setup menu may include at least one SNS site 301 and a check box 302 for setting a security level.
- an unchecked box may be associated with an unrestricted security level and a checked box may be associated with a restricted security level.
- FIG. 3B illustrates a security setup menu to set a security level for groups within an SNS site.
- the security setup menu may include at least one SNS group 303 and a check box 302 for setting a security level.
- an unchecked box may be associated with an unrestricted security level and a checked box may be associated with a restricted security level.
- the SNS sites and SNS groups included in the security setup menu illustrated in FIGS. 3A and 3B may be added/deleted according to a user selection.
- a security setup menu may include both an SNS site 301 and an SNS group 303 (not shown).
- the security level is set to restricted or unrestricted.
- the security level is not limited to restricted and unrestricted; other security levels may be applied as necessary.
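The check-box model of FIGS. 3A and 3B (unchecked means unrestricted, checked means restricted) can be sketched as follows. This is an illustrative sketch; the class and method names (`SecuritySetup`, `toggle`, `level`) are assumptions and not part of the disclosed terminal.

```python
class SecuritySetup:
    """Maps each SNS site or group name to its check-box state,
    where a checked box means 'restricted' and an unchecked
    (or absent) box means 'unrestricted'."""

    def __init__(self):
        self._checked = {}  # entry name -> bool (check-box state)

    def toggle(self, entry):
        # Flip the check box 302 for an SNS site 301 or group 303.
        self._checked[entry] = not self._checked.get(entry, False)

    def level(self, entry):
        # Unchecked boxes default to the unrestricted level.
        return "restricted" if self._checked.get(entry, False) else "unrestricted"


menu = SecuritySetup()
menu.toggle("Facebook Classmate")            # check the box
print(menu.level("Facebook Classmate"))      # restricted
print(menu.level("Twitter"))                 # unrestricted
```

Other security levels beyond the two shown could be modeled by storing a level name per entry instead of a boolean.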
- the controller 180 may automatically recognize a user's face in the corresponding content and apply the appropriate security level.
- the security level may be associated with content tagged by a user.
- Other methods of tagging content may include recognizing a user's name or other data associated with a user.
- FIG. 4 illustrates an example of a screen displayed after setting a security level for content according to an embodiment of the present invention.
- the controller 180 may automatically recognize the face of the first user and apply the appropriate security level to the content.
- An indicator 410 , such as a circle, may indicate that the security level has been applied to the content or a portion of the content.
- the security level is concurrently registered with the SNS site when the content is uploaded to the SNS site.
- the security level may be registered with the SNS prior to uploading the content or after uploading the content.
- a user may apply a security level to content which has already been uploaded to the SNS site.
- the security level is executed in response to access of the content on the SNS site.
- a server of the SNS site may decode the corresponding content according to the security level set by the first user.
- a specific portion of the content, such as a user's face, may be censored.
- the censorship may include, but is not limited to, pixelating, blurring, or blacking out the specific portion; moreover, the censorship may include superimposing a character or an image on the specific portion.
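As an illustration of one of the censorship effects named above, the following sketch pixelates a rectangular region of a grayscale image (modeled as a list of rows) by replacing each small block with its average value. The function name and block-averaging approach are assumptions for illustration; the patent does not specify how the server's decoding is implemented.

```python
def pixelate(image, region, block=2):
    """Pixelate a rectangular region of a grayscale image.

    region is (top, left, bottom, right), exclusive on bottom/right.
    Every block x block cell inside the region is replaced by the
    average of its pixel values, producing the mosaic effect.
    """
    top, left, bottom, right = region
    out = [row[:] for row in image]  # leave the input image untouched
    for by in range(top, bottom, block):
        for bx in range(left, right, block):
            cells = [(y, x)
                     for y in range(by, min(by + block, bottom))
                     for x in range(bx, min(bx + block, right))]
            avg = sum(image[y][x] for y, x in cells) // len(cells)
            for y, x in cells:
                out[y][x] = avg
    return out


img = [[0, 100, 0, 100] for _ in range(4)]   # tiny 4x4 test pattern
censored = pixelate(img, (0, 0, 4, 4), block=2)
# every 2x2 block in the region now holds its average value (50)
```

A face region returned by facial recognition would supply the `region` argument in practice.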
- the content associated with a user refers to content which includes information regarding the user, such as a picture which includes the user's image or text which mentions the user.
- FIG. 5 illustrates an example of applying a security level to content associated with a user.
- a user may set a security level to a restricted setting for a group on an SNS site; for example, the user may restrict content viewed by a cyworld-friend group ( FIG. 3B ).
- a face of the user associated with the content may be censored.
- the first user's face 501 is not censored in an unrestricted state; however, the first user's face 502 may be censored when a user associated with a restricted group accesses the content associated with the first user ( FIG. 5( b )).
- in the illustrated embodiment, the restricted setting for a security level is configured to censor a user's face via facial recognition methods. However, other methods may be employed in order to set restrictions on content.
- a security level may be set for the entire content or for only specific portions of the content. When multiple security levels are set for a specific content, the content may be decoded according to the priority level for each of the multiple security levels.
- FIG. 6A illustrates a security setup menu 610 for setting a security level for content according to an embodiment of the present invention.
- the security setup menu may include items such as site information 601 , device information 602 , position information 603 , and security level duration 604 .
- the security menu may include items for checking a set security level or for checking a state of the set security level. The form and number of the items included in the security setup menu 610 may be adjusted according to user requirements.
- the site information 601 sets a security level on a site wide basis or according to groups associated with individual SNS sites.
- the site information menu is similar to the menu illustrated in FIGS. 3A and 3B .
- the device information 602 sets a security level according to a device type.
- the security level may vary according to whether a device accessing the content is a mobile terminal or a personal computer (PC).
- the security level may be set according to whether a PC is a private PC or a shared PC.
- the device information security level utilizes device information such as hardware or software characteristics, IP information, and other device specific information.
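A hedged sketch of the device-information rule (item 602 ) follows. The classification heuristics used here — a user-agent substring for mobile terminals and an IP allowlist for shared PCs — are illustrative assumptions; the patent only says that hardware/software characteristics, IP information, and other device-specific information may be utilized.

```python
# Hypothetical set of known shared-PC addresses (e.g. library machines).
SHARED_PC_IPS = {"10.0.0.5"}


def device_level(user_agent, ip, rules):
    """Return the security level for the accessing device.

    rules maps device classes ('mobile', 'private_pc', 'shared_pc')
    to levels, as would be set via the device information menu 602.
    Unlisted classes default to unrestricted.
    """
    if "Mobile" in user_agent:
        cls = "mobile"
    elif ip in SHARED_PC_IPS:
        cls = "shared_pc"
    else:
        cls = "private_pc"
    return rules.get(cls, "unrestricted")


rules = {"shared_pc": "restricted"}
print(device_level("Mozilla/5.0 (Windows NT)", "10.0.0.5", rules))    # restricted
print(device_level("Mozilla/5.0 (Linux; Mobile)", "1.2.3.4", rules))  # unrestricted
```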
- the position information 603 sets a security level according to various zones or locations.
- the security level may vary according to a physical location of a user attempting to access the content.
- the physical location may include a city, town, country, or other geographic location.
- the security level duration 604 may set a duration for a security level. For example, a restricted setting for a content may be set to last only a month.
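The duration rule (item 604 ) could be evaluated as below. The reversion to unrestricted after expiry is an assumption for illustration; the patent states only that a duration may be set.

```python
from datetime import date, timedelta


def effective_level(level, set_on, duration_days, today):
    """Return the level in force on 'today'.

    While the duration window is open the configured level applies;
    after it elapses the content reverts to unrestricted (an assumed
    expiry behaviour).
    """
    if today <= set_on + timedelta(days=duration_days):
        return level
    return "unrestricted"


set_on = date(2010, 9, 1)
print(effective_level("restricted", set_on, 30, date(2010, 9, 15)))   # restricted
print(effective_level("restricted", set_on, 30, date(2010, 10, 15)))  # unrestricted
```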
- the security setup menu 610 may be displayed as a pop-up window in response to tagging the content ( FIG. 6B ). Alternatively, the security setup menu 610 may be included in a user menu. Moreover, the security setup menu 610 may be displayed in response to a specific touch input, such as a long touch or double touch ( FIG. 6C ).
- the set security level may be displayed on a predetermined area of a screen in response to an input on the content.
- the set security level may be displayed in response to an input on a tagged portion 703 of the content ( FIG. 7( a )).
- the set security level may be displayed in a block form (not shown).
- the set security level may be displayed as a text or numeral 701 adjacent to a tagged portion of the content ( FIG. 7( b )).
- a scroll bar may be displayed to allow a user to search the security levels when multiple security levels have been set for the displayed content (not shown).
- a security setup menu 702 may be displayed on a predetermined area ( FIG. 7( c )).
- the security setup menu 702 may display numbers corresponding to different menu items or different security levels. Accordingly, a user may change a preset security level by selecting a number displayed on the security setup menu 702 .
- the process of applying the security level when content is accessed on an SNS site is similar to the process disclosed with regard to FIGS. 4 and 5 .
- FIG. 8A illustrates an example of applying a security level for content according to an embodiment of the present invention.
- a user may set a security level according to a method described in relation to FIGS. 6A to 6C .
- the first user may set a group associated with a specific SNS site to a restricted setting; furthermore, the user may set a security level for the position information 603 .
- the user may set the security level for a Facebook Classmate group to restricted and the security level for the position information as unrestricted for only a domestic location.
- the first user may upload the content to an SNS site, and the security level will be registered within the SNS site. In this example, the first user has uploaded the content to Facebook and the security level has been registered with Facebook.
- a domestic location setting may refer to a user located in the same geographic location, such as a city, state, or country, as the first user.
- setting an unrestricted setting for only a specific location will result in all other locations being set as restricted.
- the position information is not limited to only being unrestricted to a specific location.
- a user may set some locations with a restricted setting while setting other locations with an unrestricted setting.
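The position-information semantics described above — marking only some locations unrestricted leaves every other location restricted — can be sketched as a simple lookup with a restricted default. The rule-map shape is an assumption for illustration.

```python
def location_level(viewer_location, rules, default="restricted"):
    """Resolve the position-information security level (item 603).

    rules maps a location name (city, state, country, ...) to a
    level. Any location absent from the rules falls back to the
    default, modeling 'all other locations are restricted'.
    """
    return rules.get(viewer_location, default)


# Only domestic (here, hypothetically 'KR') access is unrestricted.
rules = {"KR": "unrestricted"}
print(location_level("KR", rules))  # unrestricted
print(location_level("US", rules))  # restricted
```

Mixed configurations, with some locations explicitly restricted and others explicitly unrestricted, are simply additional entries in `rules`.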
- a second user may tag the content 810 .
- the second user may belong to a Facebook group which has an unrestricted security level, and therefore, the second user may have an uncensored view of the content 810 (FIG. 8 A(b)). Since the second user is in an unrestricted group, the second user may tag the content 810 ; in other words, a user associated with a restricted security level may not tag content.
- the second user may set additional security levels for the content 810 .
- the second user 802 may set a security level to designate a third user as having a restricted security level.
- the content 810 uploaded by the first user would have three different security levels.
- the server may censor both the image of the first user and the second user. Specifically, the image of the first user is censored because the first user had set the security level for members of the Facebook Classmate group to restricted. Additionally, the image of the second user is censored because the second user had set the security level for the third user to restricted.
- FIG. 8B illustrates an example of applying a security level set for content according to an embodiment of the present invention.
- content 810 uploaded by the first user may have a security level for a Facebook Classmate group set to restricted and may have a position information as unrestricted for only domestic access (FIG. 8 B(a)).
- a Facebook server decodes the corresponding content according to the set security levels and censors the output of the content 810 . Accordingly, the image of the first user 801 is blurred when viewed by the second user (FIG. 8 B(b)).
- the server identifies the location of the third user and censors the entire content 810 (FIG. 8 B(c)).
- the location of the third user may be determined according to IP information or GPS information of the terminal.
- the entire photo is blurred because the security level for the position information has a higher priority than the security level for groups associated with an SNS.
- the priority for the security levels may be set by a user or preset by each SNS.
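The priority behaviour shown in the FIG. 8B and FIG. 9 examples — the highest-priority rule that is set determines the outcome — can be sketched as follows. The rule-type names and the returned default are illustrative assumptions.

```python
def resolve(levels, priority):
    """Return the security level that applies to a piece of content.

    levels maps a rule type ('position', 'group', 'device') to
    'restricted' or 'unrestricted'; priority lists rule types from
    highest to lowest, as set by a user or preset by the SNS. The
    highest-priority rule actually present in levels wins.
    """
    for rule in priority:
        if rule in levels:
            return levels[rule]
    return "unrestricted"  # assumed default when no rule is set


# Position information outranks the SNS-group setting, so the
# restricted position rule wins (cf. the blurred photo in FIG. 8B).
levels = {"group": "unrestricted", "position": "restricted"}
print(resolve(levels, ["position", "group", "device"]))  # restricted
```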
- FIGS. 9A and 9B illustrate an example of applying a security level according to priority levels. Similar to the embodiment described with regard to FIG. 8A , content 910 uploaded by the first user 901 may have a security level for a Facebook Classmate group set to restricted and a position information set to unrestricted for only domestic access ( FIG. 9A ). Furthermore, the content 910 is set with an additional security level of restricted when the content is accessed by a shared PC ( FIG. 9A ).
- an SNS server decodes the corresponding content according to the set security levels and does not censor the output of the content 910 (FIG. 9 A(b)). Specifically, as illustrated in FIG. 9 A(b), the content 910 is not censored because the security level set for the SNS group has a higher priority in comparison to the security level set for the device information 602 . In other words, the unrestricted SNS group setting takes precedence over the restricted shared PC setting.
- since the second user has unrestricted access, the second user may set a security level which designates a third user as having a restricted security level.
- the content 910 uploaded by the first user would have four different security levels.
- the server may censor the image of the first user and the second user. Specifically, the image of the first user is censored because the first user had set the security level for members of the Facebook Classmate group to restricted. Additionally, the image of the second user is censored because the second user had set the security level for the third user to restricted.
- content 910 uploaded by the first user may be set to have a security level for a Facebook Classmate group set to restricted, position information as unrestricted for only domestic access, and device information as restricted for shared PCs.
- a Facebook server decodes the corresponding content according to the set security levels and censors the output of the content 910 . Accordingly, the image of the first user 901 is blurred when viewed by the second user, since the second user is associated with a restricted SNS group (FIG. 9 B(b)).
- the security level set for the SNS group has a higher priority in comparison to the security level set for the device information.
- the server identifies the location of the third user and censors the entire content (FIG. 9 B(c)).
- the location of the third user may be determined according to IP information or GPS information of the terminal utilized by the third user.
- the entire photo is blurred because the security level for the position information has a higher priority than the security level for groups associated with an SNS.
- priorities for applying the corresponding security levels to the content may be defined and the security levels may be applied according to the defined priorities.
- the method for setting security levels has thus far been described with regard to setting a security level for each content or specific portions of each content. Additionally, it is often useful to provide a method of setting the security levels for all content associated with a user as opposed to setting a security level for each content.
- a first user may register a desired security level with an SNS site. Additionally, the desired security level may be stored on the SNS site or may be stored on a terminal associated with a second user. Thus, when the second user visits the SNS site associated with the first user, the stored security levels may be applied to the content associated with the first user. Additionally, the security levels may be applied when the second user attempts to upload content associated with the first user.
- the security level may be associated with a user's tag information, such as a user's face, or a user's personal data, such as a user's phone number, user ID, email address, or home address.
- the security level sent to the second user may include only the user's tag information, so as to prevent unauthorized access to the user's personal data. Accordingly, when the second user accesses content associated with the first user or uploads content associated with the first user, the tag information and personal data associated with the first user are recognized, and thereafter the information associated with the first user, such as the user's face, may be censored or access to the content may be completely denied.
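For text content, recognizing and masking a first user's registered personal data could be sketched as below. The helper names and the `****` mask are illustrative assumptions; image content would additionally require face recognition, which is out of scope for this sketch.

```python
import re


def mentions_user(text, personal_data):
    """Return True if any registered personal-data string occurs in text."""
    return any(re.search(re.escape(item), text) for item in personal_data)


def censor(text, personal_data, mask="****"):
    """Replace every registered personal-data string with a mask."""
    for item in personal_data:
        text = text.replace(item, mask)
    return text


# Hypothetical registered personal data for the first user.
data = ["010-1234-5678", "first.user@example.com"]
post = "Call me at 010-1234-5678!"
if mentions_user(post, data):
    post = censor(post, data)
print(post)  # Call me at ****!
```

Complete denial of access, the other behaviour mentioned above, would simply reject the upload or request when `mentions_user` returns True.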
- FIG. 10 illustrates an example of applying a security level according to tag information and personal data according to an embodiment of the present invention.
- a first user 950 registers tag information, such as information related to a user's face, and personal data including a phone number, an email address, or other information in an SNS site such as Twitter.
- the registered tag information may be stored on a terminal belonging to the second user.
- a server or the controller 180 determines that the content is associated with the first user according to tag information, such as the first user's face, or personal data and censors the content. As illustrated in FIG. 10( b ) the censorship may comprise blurring the face of the first user 950 .
- the security level according to the tag information or personal data may include the same security level settings disclosed with regard to the embodiments illustrated in FIGS. 6-9 .
- the security level may be set according to site information, device information, or position information.
- the user may set a security level for position information to restricted for a location, such as Las Vegas.
- the mobile terminal 100 first determines that the captured image is associated with the first user via tag information or personal data, and then determines that the image was captured in a restricted location and censors the image.
- the location of the image capture is determined via GPS or other positioning methods.
- the controller 180 determines that the content is associated with the first user according to tag information, such as the first user's face, or personal data, and censors or prevents the capture of the content. As illustrated in FIG. 10( c ), the censorship may comprise blacking out the entire image of the first user 950 .
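The restricted-location test in this capture scenario could be sketched as a bounding-box check on the GPS fix. The coordinates below for Las Vegas are approximate and purely illustrative, as is the zone-table shape.

```python
# Hypothetical restricted zones: name -> (min_lat, max_lat, min_lon, max_lon).
RESTRICTED_ZONES = {
    "Las Vegas": (35.9, 36.4, -115.5, -114.9),
}


def in_restricted_zone(lat, lon):
    """Return True if the capture position falls inside any restricted zone."""
    return any(lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi
               for lat_lo, lat_hi, lon_lo, lon_hi in RESTRICTED_ZONES.values())


print(in_restricted_zone(36.17, -115.14))  # True  (inside the Las Vegas box)
print(in_restricted_zone(37.57, 126.98))   # False (elsewhere)
```

When the check returns True and the image is recognized as associated with the first user, the terminal would censor or block the capture as described above.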
- FIG. 11 is a flowchart illustrating a security setting method for a mobile terminal according to an embodiment of the present invention. In many configurations, the method illustrated in FIG. 11 is applied when a user sets a security level directly to a content prior to uploading the content to an SNS site.
- a first user sets a security level for a content or a portion of the content (S 10 ).
- the security level may be set automatically by the controller 180 via facial recognition or other techniques. Additionally, the security level may be set by the first user. In addition to setting a security level according to tag information or personal data, the first user may also set the security level according to at least, for example, site information, device information, or position information.
- the set security level may be displayed on the display unit 151 to allow a user to check or adjust the security levels.
- the plurality of security levels may be displayable in a graphic form or a numeric form.
- the displayed order of the security levels may indicate the priority levels.
- the wireless communication unit 110 uploads the corresponding content to an SNS site, such as Twitter (S 11 ).
- the security level is then registered with the site.
- a server of the corresponding SNS site applies the security level set by the first user to the content (S 13 ).
- the server of the corresponding SNS site determines if the security level of the content is set to a restricted level or an unrestricted level for the second user (S 14 ). If the security level for access by the second user is set to unrestricted, the content is displayed (S 15 ), and the second user may view the content. However, if the security level for access by the second user is set to restricted, the content is censored (S 16 ). Censorship of the content may refer to modifying the display of the content, such as pixelating or blurring the content. Alternatively, the content may be censored such that the content is not displayed at all.
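The server-side decision in steps S 14 to S 16 can be sketched as a single lookup-and-branch. The per-viewer map and the tuple return shape are assumptions for illustration; a real server would apply one of the censorship effects rather than withhold the content outright.

```python
def serve_content(content, security, viewer):
    """Sketch of steps S14-S16: decide whether to display or censor.

    security maps viewer identifiers to levels, with an optional
    'default' entry covering viewers not explicitly listed.
    """
    level = security.get(viewer, security.get("default", "unrestricted"))
    if level == "unrestricted":
        return ("display", content)   # S15: second user may view the content
    return ("censored", None)         # S16: content is censored / withheld


security = {"second_user": "restricted", "default": "unrestricted"}
print(serve_content("photo.jpg", security, "second_user"))  # ('censored', None)
print(serve_content("photo.jpg", security, "friend"))       # ('display', 'photo.jpg')
```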
- FIG. 12 is a flowchart illustrating a security level setting method for a mobile terminal according to an embodiment of the present invention.
- the method illustrated in FIG. 12 is applied when a user sets a security level on an SNS site or sends the security level directly to another user.
- a first user sets tag information and personal data in a mobile terminal (S 20 ).
- the controller 180 then registers the set tag information and the personal data in an SNS site, such as Facebook, via the wireless communication unit 110 (S 21 ).
- the SNS site may send the tag information to be stored in a terminal of the second user (S 23 ).
- the server or controller 180 automatically generates tags for information associated with the first user, such as the user's face, a phone number, or an address corresponding to the tag information and the personal data (S 24 ).
- the server of the corresponding SNS site determines if the security level of the content is set to a restricted level or an unrestricted level for the second user (S 25 ). If the security level for access by the second user is set to unrestricted, access to the content is allowed (S 26 ), and the second user may view the content. However, if the security level for access by the second user is set to restricted, the content is censored (S 27 ). Censorship of the content may refer to modifying the display of the content, such as pixelating or blurring the content. Alternatively, the content may be censored such that the content is not displayed at all.
- determining the security level of the content is not limited to determining the security level for a specific user; it may also include determining whether the security level for the content is set to a site-wide restricted or unrestricted setting.
- when a user executes a content capturing menu, such as a camera, the controller 180 receives a preview image from a camera and displays the preview image on the display unit 151 .
- the controller 180 may detect at least one face image from the preview image and store the detected face image or characteristic information related to the detected face image.
- the characteristic information related to the face image may be encoded by terminal-specific information, such as at least one of an electronic serial number (ESN), a phone number, or subscriber identity module (SIM) information.
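One way to encode the face-characteristic information by terminal-specific information is a keyed digest, as sketched below. HMAC-SHA256 keyed by the ESN is an illustrative choice; the patent says only that the information may be encoded by at least one of an ESN, a phone number, or SIM information.

```python
import hashlib
import hmac


def encode_face_signature(face_features, esn):
    """Digest the detected face-characteristic bytes, keyed by the
    terminal's ESN, so the stored information is bound to this
    terminal. The algorithm choice is an assumption."""
    return hmac.new(esn.encode(), face_features, hashlib.sha256).hexdigest()


features = b"\x01\x02\x03"            # placeholder face descriptor bytes
sig = encode_face_signature(features, esn="8031A47B")
print(len(sig))  # 64 (hex characters of a SHA-256 digest)
```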
- the various embodiments of the present invention illustrate the display in a horizontal orientation; however, the display is not limited to a horizontal orientation.
- a vertical display may also be used in the same manner.
- the disclosed methods can be implemented in a program recorded medium as computer-readable codes.
- the computer-readable media may include all types of recording devices in which data readable by a computer system are stored.
- the computer-readable media may include ROM, RAM, CD-ROM, magnetic tapes, floppy discs, and optical data storage devices, and may also include carrier-wave type implementations, such as transmission via the Internet.
Abstract
Description
- Pursuant to 35 U.S.C. §119(a), this application claims the benefit of earlier filing date and right of priority to Korean Application No. 10-2010-0085646, filed on Sep. 1, 2010, the contents of which are incorporated by reference herein in its entirety.
- 1. Field of the Invention
- The present invention relates to information security of a mobile terminal, and more specifically, to a mobile terminal capable of setting a security level for information uploaded to a Social Network Service (SNS) site, and a security setting method thereof.
- 2. Discussion of the Related Art
- A mobile terminal is a device which may be configured to perform various functions. Examples of such functions include data and voice communications, capturing images and video via a camera, recording audio, outputting music via a speaker system, and displaying images and video on a display. Some terminals include additional functionality which supports game playing, while other terminals are also configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals which permit viewing of contents, such as videos and television programs.
- Efforts have been made to support or enhance various functions of mobile terminals. Such efforts include changes and improvements to the structural components implementing a mobile terminal in addition to software or hardware improvement.
- A Social Network Service (SNS) is a community-type web-based service for communication and information sharing with others. SNS sites include sites such as Twitter™, Facebook™, and Google+™. An SNS allows a user to share information with other users and allows the user to view information regarding other users. However, a user is vulnerable to security risks in that the user's private information may be accessed without the user's permission.
- According to an embodiment, an information security method for a mobile terminal is presented. The method includes setting security information for a content associated with a first user, uploading the content to a Social Network Service (SNS) site, and uploading the security information to the SNS site to permit the SNS site to register the security information in order to display the content according to the security information when the content is accessed by a second user via the SNS site.
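The three steps above (setting security information, uploading the content, and uploading the security information for registration) can be sketched as follows; the field names and level values are illustrative assumptions, not terminology from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class SecurityInfo:
    """Security information attached to uploaded content.
    (Field names are assumptions for illustration.)"""
    level: str = "friends-only"           # e.g. "public", "friends-only", "private"
    allowed_users: list = field(default_factory=list)
    display_mode: str = "blur"            # how restricted viewers see the content

def build_upload_payload(content_id, content_bytes, security):
    # The terminal uploads the content together with its security
    # information so the SNS site can register both and later display
    # the content according to the security information.
    return {
        "content_id": content_id,
        "content": content_bytes,
        "security": {
            "level": security.level,
            "allowed_users": security.allowed_users,
            "display_mode": security.display_mode,
        },
    }

payload = build_upload_payload("photo-001", b"...jpeg...",
                               SecurityInfo(allowed_users=["user2"]))
```

In practice the payload would be sent over the wireless communication unit to the SNS site's upload endpoint; the transport is omitted here.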
- According to a feature, the content comprises a photo, a video, or text. Additionally, the information security method further includes setting content access rights for the second user in the security information.
- According to another feature, the information security method further includes setting content access rights according to at least one of device information, identification (ID) information, Internet protocol (IP) information, or position information for a device in the security information.
- According to yet another feature, the information security method further includes setting the security information for an entirety of the content or a specific portion of the content.
- According to still yet another feature, the information security method further includes setting the security information as a site-wide rule for an SNS site or for a specific group within an SNS site.
- According to another feature, the information security method further includes setting the security information to pixelate, blur, blacken, or replace the content when accessed by the second user.
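As a rough illustration of one such display treatment (a from-scratch block-averaging pixelation; the patent does not prescribe any particular algorithm), a grayscale image could be pixelated as follows:

```python
def pixelate(image, block=2):
    """Pixelate a grayscale image (a list of rows of ints) by replacing
    each block x block tile with the average of its pixels."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = [image[j][i]
                    for j in range(y, min(y + block, h))
                    for i in range(x, min(x + block, w))]
            avg = sum(tile) // len(tile)
            for j in range(y, min(y + block, h)):
                for i in range(x, min(x + block, w)):
                    out[j][i] = avg
    return out

img = [[0, 100], [50, 150]]
print(pixelate(img))  # [[75, 75], [75, 75]] -- the 2x2 tile collapses to its average
```

Blurring, blackening, or replacement would follow the same pattern: the SNS site applies the chosen treatment to the whole content or to the restricted region before serving it to the second user.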
- According to another embodiment, an information security setting method for a mobile terminal is presented. The method includes setting security information associated with a first user for a Social Network Service (SNS) site, and uploading the security information to the SNS site to permit the SNS site to apply the security information to content associated with the first user in order to display the content according to the security information when the content is accessed by a second user via the SNS site.
- According to yet another embodiment, a mobile terminal is presented. The mobile terminal includes a controller configured to set security information for content associated with a first user, and a wireless communication unit configured to upload the content and the security information to a Social Network Service (SNS) site to permit the SNS site to display the content according to the security information when a second user accesses the uploaded content via the SNS site.
- According to still yet another embodiment, a mobile terminal is presented. The mobile terminal includes a controller configured to set security information for a first user, and a wireless communication unit configured to upload the security information to a Social Network Service (SNS) site to permit the SNS site to apply the security information to content associated with the first user in order to display the content according to the security information when a second user accesses the content associated with the first user via the SNS site.
- These and other embodiments will also become readily apparent to those skilled in the art from the following detailed description of the embodiments having reference to the attached figures, the invention not being limited to any particular embodiment disclosed.
- The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments and together with the description serve to explain the principles of the invention.
- FIG. 1 illustrates a block diagram of a mobile terminal according to an embodiment of the present invention.
- FIG. 2 illustrates a block diagram of a mobile communication system operating with the mobile terminal according to an embodiment of the present invention.
- FIGS. 3A and 3B illustrate a menu for setting a security level for content according to an embodiment of the present invention.
- FIG. 4 illustrates a screen displayed after setting a security level according to an embodiment of the present invention.
- FIG. 5 illustrates an example of applying a security level according to an embodiment of the present invention.
- FIGS. 6A to 6C illustrate examples of a security level menu according to an embodiment of the present invention.
- FIG. 7 illustrates an example for applying a security level according to an embodiment of the present invention.
- FIGS. 8A and 8B illustrate examples of applying a security level according to an embodiment of the present invention.
- FIGS. 9A and 9B illustrate examples of applying a security level according to an embodiment of the present invention.
- FIG. 10 illustrates an example of applying a security level according to an embodiment of the present invention.
- FIG. 11 is a flowchart illustrating a method for setting a security level on a mobile terminal according to an embodiment of the present invention.
- FIG. 12 is a flowchart illustrating a method for setting a security level on a mobile terminal according to an embodiment of the present invention.
- In the following detailed description, reference is made to the accompanying drawing figures which form a part hereof, and which show by way of illustration specific embodiments of the invention. It is to be understood by those of ordinary skill in this technological field that other embodiments may be utilized, and structural, electrical, as well as procedural changes may be made without departing from the scope of the present invention. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or similar parts.
- As used herein, the suffixes “module,” “unit,” and “part” are used for elements in order to facilitate the disclosure only. Therefore, significant meanings or roles are not given to the suffixes themselves and it is understood that the “module,” “unit,” and “part” can be used together or interchangeably.
- Mobile terminals described in this disclosure can include a mobile phone, a smart phone, a laptop computer, a digital broadcast terminal, a PDA (personal digital assistant), a PMP (portable multimedia player), and a navigation system.
- Except where applicable to a mobile terminal only, it will be appreciated by those skilled in the art that features described herein with reference to one or more embodiments may be applicable to a stationary terminal such as a digital TV, or a desktop computer.
- FIG. 1 is a block diagram of a mobile terminal 100 according to one embodiment of the present invention. Referring to FIG. 1, the mobile terminal 100 includes a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. FIG. 1 shows the mobile terminal 100 having various components, but it is understood that implementing all of the illustrated components is not a requirement. More or fewer components may be implemented according to various embodiments.
- The wireless communication unit 110 typically includes one or more components which permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal 100 is located. For instance, the wireless communication unit 110 can include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a position-location module 115.
- The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel.
- The broadcast managing server is generally a server which generates and transmits a broadcast signal and/or broadcast associated information, or a server which is provided with a previously generated broadcast signal and/or broadcast associated information and then transmits the provided signal or information to a terminal. The broadcast signal and/or broadcast associated information received by the broadcast receiving module 111 may be stored in a suitable device, such as a memory 160.
- The broadcast signal may be implemented as a TV broadcast signal, a radio broadcast signal, and/or a data broadcast signal, among other signals. If desired, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
- The broadcast associated information includes information associated with a broadcast channel, a broadcast program, or a broadcast service provider. Furthermore, the broadcast associated information can be provided via a mobile communication network. In this case, the broadcast associated information can be received by the mobile communication module 112.
- The broadcast associated information can be implemented in various forms. For instance, broadcast associated information may include an electronic program guide (EPG) of a digital multimedia broadcasting (DMB) system and an electronic service guide (ESG) of a digital video broadcast-handheld (DVB-H).
- The broadcast receiving module 111 may be configured to receive broadcast signals transmitted from various types of broadcast systems. As a non-limiting example, such broadcasting systems may include a digital multimedia broadcasting-terrestrial (DMB-T) system, a digital multimedia broadcasting-satellite (DMB-S) system, DVB-H, the data broadcasting system known as media forward link only (MediaFLO™), and an integrated services digital broadcast-terrestrial (ISDB-T) system. Optionally, the broadcast receiving module 111 can be configured to be suitable for other broadcasting systems as well as the above-noted digital broadcasting systems.
- The mobile communication module 112 transmits/receives wireless signals to/from one or more network entities (e.g., a base station, an external terminal, and/or a server). Such wireless signals may carry audio, video, and data according to text/multimedia messages.
- The wireless Internet module 113 supports Internet access for the mobile terminal 100. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet technology can include WLAN (Wireless LAN), Wi-Fi, Wibro™ (Wireless broadband), Wimax™ (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access).
- The short-range communication module 114 facilitates relatively short-range communications. Suitable technologies for implementing this module include radio frequency identification (RFID), infrared data association (IrDA), ultra-wideband (UWB), as well as the networking technologies commonly referred to as Bluetooth™ and ZigBee™, to name a few.
- The position-location module 115 identifies or otherwise obtains the location of the mobile terminal 100. According to one embodiment, this module may be implemented with a global positioning system (GPS) module.
- Referring to FIG. 1, the audio/video (A/V) input unit 120 is configured to provide audio or video signal input to the mobile terminal 100. As shown, the A/V input unit 120 includes a camera 121 and a microphone 122. The camera 121 receives and processes (or produces) image frames of still pictures or video, which are obtained by an image sensor in a video call mode or a photographing mode. Furthermore, the processed image frames can be displayed on the display unit 151.
- The image frames processed by the camera 121 can be stored in the memory 160 or can be transmitted to an external recipient via the wireless communication unit 110. Optionally, at least two cameras 121 can be provided in the mobile terminal 100 according to the environment of usage.
- The
user input unit 130 generates input data responsive to user manipulation of an associated input device or devices. Examples of such devices include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel, and a jog switch. - The
sensing unit 140 provides sensing signals for controlling operations of themobile terminal 100 using status measurements of various aspects of the mobile terminal. For instance, thesensing unit 140 may detect an open/closed status of themobile terminal 100, the relative positioning of components (e.g., a display and keypad) of themobile terminal 100, a change of position (or location) of themobile terminal 100 or a component of themobile terminal 100, a presence or absence of user contact with themobile terminal 100, and an orientation or acceleration/deceleration of themobile terminal 100. - As an example, a
mobile terminal 100 configured as a slide-type mobile terminal is considered. In this configuration, thesensing unit 140 may sense whether a sliding portion of the mobile terminal is open or closed. According to other examples, thesensing unit 140 senses the presence or absence of power provided by thepower supply 190, and the presence or absence of a coupling or other connection between the interface unit 170 and an external device. According to one embodiment, thesensing unit 140 can include aproximity sensor 141 and a motion sensor 142. - The motion sensor 142 detects a body motion of the
mobile terminal 100. The motion sensor 142 outputs a signal corresponding to the detected body motion to thecontroller 180. - The
output unit 150 generates output relevant to the senses of sight, hearing, and touch. Furthermore, theoutput unit 150 includes the display unit 151, an audio output module 152, analarm unit 153, ahaptic module 154, and a projector module 155. - The display unit 151 is typically implemented to visually display (output) information associated with the
mobile terminal 100. For instance, if the mobile terminal is operating in a phone call mode, the display will generally provide a user interface (UI) or graphical user interface (GUI) which includes information associated with placing, conducting, and terminating a phone call. As another example, if themobile terminal 100 is in a video call mode or a photographing mode, the display unit 151 may additionally or alternatively display images which are associated with these modes, the UI or the GUI. - The display module 151 may be implemented using known display technologies. These technologies include, for example, a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode display (OLED), a flexible display and a three-dimensional display. The
mobile terminal 100 may include one or more of such displays. - Some of the displays can be implemented as a transparent or optically transmissive type, i.e., a transparent display. A representative example of the transparent display is a TOLED (transparent OLED). A rear configuration of the display unit 151 can be implemented as the optically transmissive type as well. In this configuration, a user may be able to see an object located at the rear of a terminal body on a portion of the display unit 151 of the terminal body.
- At least two display units 151 can be provided in the
mobile terminal 100 in accordance with one embodiment of themobile terminal 100. For instance, a plurality of display units can be arranged to be spaced apart from each other or to form a single body on a single face of themobile terminal 100. Alternatively, a plurality of display units can be arranged on different faces of themobile terminal 100. - If the display unit 151 and a sensor for detecting a touch action (hereinafter called ‘touch sensor’) is configured as a mutual layer structure (hereinafter called ‘touchscreen’), the display unit 151 is usable as an input device as well as an output device. In this case, the touch sensor can be configured as a touch film, a touch sheet, or a touchpad.
- The touch sensor can be configured to convert pressure applied to a specific portion of the display unit 151 or a variation of capacitance generated from a specific portion of the display unit 151 to an electronic input signal. Moreover, the touch sensor is configurable to detect pressure of a touch as well as a touched position or size.
- If a touch input is made to the touch sensor, a signal(s) corresponding to the touch input is transferred to a touch controller. The touch controller processes the signal(s) and then transfers the processed signal(s) to the controller 180. Therefore, the controller 180 is made aware when a prescribed portion of the display unit 151 is touched.
- Referring to FIG. 1, a proximity sensor 141 can be provided at an internal area of the mobile terminal 100 enclosed by the touchscreen or around the touchscreen. The proximity sensor is a sensor that detects a presence or non-presence of an object approaching a prescribed detecting surface or an object existing (or located) around the proximity sensor using an electromagnetic field strength or infrared ray without mechanical contact. Hence, the proximity sensor 141 is more durable than a contact type sensor and also has utility broader than the contact type sensor.
- The proximity sensor 141 can include one of a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a radio frequency oscillation proximity sensor, an electrostatic capacity proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. If the touchscreen includes the electrostatic capacity proximity sensor, it is configured to detect the proximity of a pointer using a variation of an electric field according to the proximity of the pointer. In this configuration, the touchscreen (touch sensor) can be considered as the proximity sensor 141.
- In the following description, for purposes of clarity, an action in which a pointer approaches the touchscreen without contacting the touchscreen will sometimes be referred to as a “proximity touch.” Furthermore, an action in which a pointer actually touches the touchscreen will sometimes be referred to as a “contact touch.” The position on the touchscreen that is proximity-touched by the pointer refers to the position of the pointer which vertically opposes the touchscreen when the pointer performs the proximity touch.
- The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch duration, a proximity touch position, a proximity touch shift state). Information corresponding to the detected proximity touch action and the detected proximity touch pattern can be output to the touchscreen.
- The audio output module 152 functions in various modes, including a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, and a broadcast reception mode, to output audio data which is received from the wireless communication unit 110 or is stored in the memory 160. During operation, the audio output module 152 outputs audio relating to a particular function (e.g., call received, message received). The audio output module 152 may be implemented using one or more speakers, buzzers, other audio producing devices, and combinations thereof.
- The alarm unit 153 outputs a signal for announcing the occurrence of a particular event associated with the mobile terminal 100. Typical events include a call received, a message received, and a touch input received. The alarm unit 153 is able to output a signal for announcing the event occurrence by way of vibration as well as a video or audio signal. The video or audio signal can be output via the display unit 151 or the audio output module 152. Hence, the display unit 151 or the audio output module 152 can be regarded as a part of the alarm unit 153.
- The haptic module 154 generates various tactile effects that can be sensed by a user. Vibration is a representative one of the tactile effects generated by the haptic module 154. The strength and pattern of the vibration generated by the haptic module 154 are controllable. For instance, different vibrations can be output in a manner of being synthesized together or can be output in sequence.
- The haptic module 154 is able to generate various tactile effects in addition to the vibration. For instance, the haptic module 154 may generate an effect attributed to an arrangement of pins vertically moving against a contacted skin surface, an effect attributed to an injection/suction power of air through an injection/suction hole, an effect attributed to a skim over a skin surface, an effect attributed to contact with an electrode, an effect attributed to an electrostatic force, and an effect attributed to the representation of a hot/cold sense using an endothermic or exothermic device.
- The haptic module 154 can be implemented to enable a user to sense the tactile effect through a muscle sense of a finger or an arm as well as to transfer the tactile effect through direct contact. Optionally, at least two haptic modules 154 can be provided in the mobile terminal 100 in accordance with one embodiment of the mobile terminal 100.
- The projector module 155 is an element for performing an image projector function using the mobile terminal 100. The projector module 155 is able to display an image, which is identical to or at least partially different from the image displayed on the display unit 151, on an external screen or wall according to a control signal of the controller 180.
- In particular, the projector module 155 can include a light source generating light (e.g., a laser) for projecting an external image, an image producing means for producing an external image to project using the light generated from the light source, and a lens for enlarging the external image according to a predetermined focal distance. Furthermore, the projector module 155 can further include a device for adjusting an image projection direction by mechanically moving the lens or the whole module.
- The projector module 155 can be a CRT (cathode ray tube) module, an LCD (liquid crystal display) module, or a DLP (digital light processing) module according to a device type. In particular, the DLP module is operated by enabling the light generated from the light source to reflect on a DMD (digital micro-mirror device) chip and can be advantageous for reducing the size of the projector module 155.
- In many configurations, the projector module 155 can be provided in a lengthwise direction of a lateral, front, or backside direction of the mobile terminal 100. Furthermore, it is understood that the projector module 155 can be provided in any portion of the mobile terminal 100 as deemed necessary.
- The memory 160 is generally used to store various types of data to support the processing, control, and storage requirements of the mobile terminal 100. Examples of such data include program instructions for applications operating on the mobile terminal 100, contact data, phonebook data, messages, audio, still pictures, and moving pictures. Furthermore, a recent use history or a cumulative use frequency of each data (e.g., use frequency for each phonebook, each message, or each multimedia file) can be stored in the memory 160. Moreover, data for various patterns of vibration and/or sound output in response to a touch input to the touchscreen can be stored in the memory 160.
- The memory 160 may be implemented using any type or combination of suitable volatile and non-volatile memory or storage devices, including a hard disk, random access memory (RAM), static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk, multimedia card micro type memory, card-type memory (e.g., SD memory or XD memory), or other similar memory or data storage device. Furthermore, the mobile terminal 100 is able to operate in association with web storage for performing a storage function of the memory 160 on the Internet.
- The interface unit 170 is often implemented to couple the mobile terminal 100 with external devices. The interface unit 170 receives data from the external devices or is supplied with power and then transfers the data or power to the respective elements of the mobile terminal 100, or enables data within the mobile terminal 100 to be transferred to the external devices. The interface unit 170 may be configured using a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for coupling to a device having an identity module, audio input/output ports, video input/output ports, and/or an earphone port.
- The identity module is a chip for storing various kinds of information for authenticating a usage authority of the mobile terminal 100 and can include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and/or a Universal Subscriber Identity Module (USIM). A device having the identity module (hereinafter called an ‘identity device’) can be manufactured as a smart card. Therefore, the identity device is connectable to the mobile terminal 100 via the corresponding port.
- When the mobile terminal 100 is connected to an external cradle, the interface unit 170 becomes a passage for supplying the mobile terminal 100 with power from the cradle or a passage for delivering various command signals input from the cradle by a user to the mobile terminal 100. Each of the various command signals input from the cradle or the power can operate as a signal enabling the mobile terminal 100 to recognize that it is correctly loaded in the cradle.
- The controller 180 controls the overall operations of the mobile terminal 100. For example, the controller 180 performs the control and processing associated with voice calls, data communications, and video calls. The controller 180 may include a multimedia module 181 that provides multimedia playback. The multimedia module 181 may be configured as part of the controller 180, or implemented as a separate component. Moreover, the controller 180 is able to perform a pattern recognizing process for recognizing a writing input and a picture drawing input carried out on the touchscreen as characters or images, respectively.
- The power supply unit 190 provides power required by the various components of the mobile terminal 100. The power may be internal power, external power, or combinations of internal and external power.
- Various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or some combination of computer software and hardware. For a hardware implementation, the embodiments described herein may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a selective combination thereof. Such embodiments may also be implemented by the controller 180.
- For a software implementation, the embodiments described herein may be implemented with separate software modules, such as procedures and functions, each of which performs one or more of the functions and operations described herein. The software codes can be implemented with a software application written in any suitable programming language and may be stored in memory such as the memory 160, and executed by a controller or processor, such as the controller 180. - As illustrated in
FIG. 2 , a CDMA wireless communication system may include a plurality ofmobile terminals 100, a plurality of base stations (BSs) 270, base station controllers (BSCs) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a public switch telephone network (PSTN) 290. The system as shown illustrated inFIG. 2 may include a plurality of BSCs 275. - The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the
base stations 270 via backhaul lines. The backhaul lines may be configured in accordance with any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. - Each
BS 270 may serve one or more sectors or regions, each sector or region covered by an omni-directional antenna or an antenna pointed in a particular direction radially away from theBS 270. Alternatively, each sector or region may be covered by two or more antennas for diversity reception. EachBS 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum, such as 1.25 MHz or 5 MHz - The intersection of a sector and frequency assignment may be referred to as a CDMA channel. The
BS 270 may also be referred to as base station transceiver subsystems (BTSs) or other equivalent terms. - The term “base station” may be used to collectively refer to a single BSC 275 and at least one
BS 270. The base station may also be referred to as a “cell site.” Alternatively, individual sectors of aparticular BS 270 may be referred to as a plurality of cell sites. - As illustrated in
FIG. 2 , a broadcasting transmitter (BT) 295 transmits a broadcast signal to themobile terminals 100 operating within the system. The broadcast receiving module 111 is provided in themobile terminal 100 to receive broadcast signals transmitted by theBT 295. -
FIG. 2 illustrates several global positioning system (GPS)satellites 300. TheGPS satellites 300 help locate at least one of a plurality ofmobile terminals 100. Althoughseveral GPS satellites 300 are depicted inFIG. 2 , it is understood that useful positioning information may be obtained with any number of GPS satellites. TheLocation information module 115 is typically configured to cooperate with theGPS satellites 300 to obtain desired positioning information. - Instead of or in addition to GPS tracking techniques, other technologies to track the location of the
mobile terminals 100 may be used. In addition, at least one of theGPS satellites 300 may selectively or additionally handle satellite DMB transmissions. - As one typical operation of the wireless communication system, the
BSs 270 receive reverse-link signals from variousmobile terminals 100. Themobile terminals 100 typically engage in calls, messaging, and other types of communications. Each reverse-link signal received by aparticular base station 270 is processed within theparticular BS 270. The resulting data is forwarded to an associated BSC 275. - The BSC 275 provides call resource allocation and mobility management functionality including the coordination of soft handoff procedures between
BSs 270. The BSCs 275 also route the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100. - Embodiments of the present invention provide a method for setting a security level on an SNS. Accordingly, content which is to be uploaded to an SNS may include information regarding a security level associated with access rights for the content. The access rights may be based on user information, device information, location information, or other information associated with an SNS. The display of the content may be censored for various users according to the security level associated with the content. For example, the entire content or portions of the content may be uniquely processed, such as being pixelated, blurred, or blacked out. Furthermore, access to the content may be entirely denied. The setting for how to display the content, such as pixelated, blurred, or blacked out, may be determined by each SNS site or may be set via the security level.
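The per-viewer censorship described above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation; the function name, the treatment strings, and the site-default parameter are all assumptions:

```python
RESTRICTED, UNRESTRICTED = "restricted", "unrestricted"

def resolve_treatment(security_level: str, site_default: str = "blur") -> str:
    """Return how censored regions should be rendered for a viewer.

    The concrete treatment (pixelate / blur / black out / deny) may be
    chosen by the SNS site or by the security-level setting; here the
    site default is used whenever the level is restricted.
    """
    if security_level == UNRESTRICTED:
        return "show"            # no censorship for unrestricted viewers
    if security_level == RESTRICTED:
        return site_default      # e.g. "pixelate", "blur", "black_out", "deny"
    raise ValueError(f"unknown security level: {security_level}")
```

A site could swap the default for "deny" to refuse access entirely rather than obscuring the region.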
- According to other embodiments of the present invention, tag information, sometimes referred to as characteristic information, may also be registered in an SNS site. The tag information may be registered with or without separately setting a security level for content. The registered tag information may automatically be stored in an SNS site or in a terminal belonging to another user. The stored tag information may function as a type of security information. For example, when a user's tag information has been registered with an SNS site, a second user may not be permitted to capture the user's face, or a particular region of the user's face may be displayed differently.
-
FIGS. 3A and 3B illustrate menus for setting a security level according to an embodiment of the present invention. FIG. 3A illustrates an example of setting a security level for each SNS site. FIG. 3B illustrates an example of setting a security level for groups in an SNS site. The method of setting a security level for groups in an SNS site (FIG. 3B) allows for a greater level of detail in comparison to a site-wide security level (FIG. 3A). - As illustrated in
FIG. 3A, a security setup menu may be utilized to set a security level for an SNS site. The security setup menu may include at least one SNS site 301 and a check box 302 for setting a security level. For example, an unchecked box may be associated with an unrestricted security level and a checked box may be associated with a restricted security level. - Similar to
FIG. 3A, FIG. 3B illustrates a security setup menu to set a security level for groups within an SNS site. The security setup menu may include at least one SNS group 303 and a check box 302 for setting a security level. For example, an unchecked box may be associated with an unrestricted security level and a checked box may be associated with a restricted security level. The SNS sites and SNS groups included in the security setup menu illustrated in FIGS. 3A and 3B may be added or deleted according to a user selection. Furthermore, a security setup menu may include both an SNS site 301 and an SNS group 303 (not shown). - In many configurations the security level is set to restricted or unrestricted. However, the security level is not limited to restricted and unrestricted; other security levels may be applied as necessary.
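The check-box menus of FIGS. 3A and 3B amount to a mapping from an SNS site or group to a checked (restricted) flag. A minimal sketch under hypothetical names; the real menu is a UI element, not a class:

```python
class SecuritySetupMenu:
    """Hypothetical model of the check-box menu: checked means restricted."""

    def __init__(self):
        self.entries = {}                    # SNS site/group name -> checked

    def add(self, name, checked=False):      # boxes start unchecked
        self.entries[name] = checked

    def toggle(self, name):                  # user taps the check box
        self.entries[name] = not self.entries[name]

    def level(self, name):
        return "restricted" if self.entries.get(name) else "unrestricted"

menu = SecuritySetupMenu()
menu.add("cyworld-friend")                   # unchecked: unrestricted
menu.toggle("cyworld-friend")                # checked: restricted
```

Entries default to unrestricted, matching the unchecked box in the figures; additional security levels beyond the two could replace the boolean with an enumeration.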
- Upon setting the security level via the security setup menu, the
controller 180 may automatically recognize a user's face in the corresponding content and apply the appropriate security level. Alternatively, the security level may be associated with content tagged by a user. Other methods of tagging content may include recognizing a user's name or other data associated with a user. -
FIG. 4 illustrates an example of a screen displayed after setting a security level for content according to an embodiment of the present invention. As illustrated in FIG. 4, when a first user 401 selects a restricted security level for a group in the security setup menu (FIG. 3B), the controller 180 may automatically recognize the face of the first user and apply the appropriate security level to the content. An indicator 410, such as a circle, may indicate that the security level has been applied to the content or a portion of the content. - In many configurations the security level is registered with the SNS site concurrently with the content being uploaded to the SNS site. However, the security level may be registered with the SNS prior to uploading the content or after uploading the content. Alternatively, a user may apply a security level to content which has already been uploaded to the SNS site. In any case, the security level is applied in response to access of the content on the SNS site.
- For example, when a second user accesses content, such as a photo, associated with the first user, a server of the SNS site may decode the corresponding content according to the security level set by the first user. Thus, when the decoded content associated with the first user is displayed, a specific portion, such as a user's face, may be censored according to the set security level. The censorship may include, but is not limited to, pixelating, blurring, or blacking out the specific portion; moreover, the censorship may include superimposing a character or an image on the specific portion. Content associated with a user refers to content which includes information regarding the user, such as a picture which includes the user's image or text which mentions the user.
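As a concrete illustration of censoring only a specific portion of decoded content, the sketch below blacks out a rectangular face region of a grayscale image held as a list of rows. The function and coordinates are hypothetical; a real server would obtain the region from facial recognition or a tag:

```python
def black_out(image, top, left, bottom, right):
    """Return a copy of `image` with the given rectangle set to 0 (black)."""
    out = [row[:] for row in image]          # copy so the original survives
    for y in range(top, bottom):
        for x in range(left, right):
            out[y][x] = 0
    return out

photo = [[255] * 4 for _ in range(4)]        # 4x4 all-white image
censored = black_out(photo, 0, 0, 2, 2)      # censor the top-left "face"
```

Pixelating or blurring would replace the constant 0 with a block average or a smoothing filter over the same rectangle.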
-
FIG. 5 illustrates an example of applying a security level to content associated with a user. A user may set a security level to a restricted setting for a group on an SNS site; for example, the user may restrict content viewed by a cyworld-friend group (FIG. 3B). When a second user belonging to the group associated with a restricted setting accesses the content, a face of the user associated with the content may be censored. As illustrated in FIG. 5(a), the first user's face 501 is not censored in an unrestricted state; however, the first user's face 502 may be censored when a user associated with a restricted group accesses the content associated with the first user (FIG. 5(b)). - According to an embodiment of the present invention, the restricted setting for a security level is configured to censor a user's face via facial recognition methods. However, other methods may be employed in order to set restrictions on content. For example, according to another embodiment of the present invention, a security level may be set for the entire content or for only specific portions of the content. When multiple security levels are set for specific content, the content may be decoded according to the priority level for each of the multiple security levels.
-
FIG. 6A illustrates a security setup menu 610 for setting a security level for content according to an embodiment of the present invention. As illustrated in FIG. 6A, the security setup menu may include items such as site information 601, device information 602, position information 603, and security level duration 604. Furthermore, although not shown, the security menu may include items for checking a set security level or for checking a state of the set security level. The form and number of the items included in the security setup menu 610 may be adjusted according to user requirements. - The
site information 601 sets a security level on a site-wide basis or according to groups associated with individual SNS sites. The site information menu is similar to the menus illustrated in FIGS. 3A and 3B. - The
device information 602 sets a security level according to a device type. For example, the security level may vary according to whether a device accessing the content is a mobile terminal or a personal computer (PC). Moreover, the security level may be set according to whether a PC is a private PC or a shared PC. The device information security level utilizes device information such as hardware or software characteristics, IP information, and other device-specific information. - The
position information 603 sets a security level according to various zones or locations. For example, the security level may vary according to a physical location of a user attempting to access the content. The physical location may include a city, town, country, or other geographic location. - The security
level duration 604 may set a duration for a security level. For example, a restricted setting for content may be set for only a month. - The
security setup menu 610 may be displayed as a pop-up window in response to tagging the content (FIG. 6B). Alternatively, the security setup menu 610 may be included in a user menu. Moreover, the security setup menu 610 may be displayed in response to a specific touch input, such as a long touch or double touch (FIG. 6C). - Referring to
FIG. 7, the set security level may be displayed on a predetermined area of a screen in response to an input on the content. For example, the set security level may be displayed in response to an input on a tagged portion 703 of the content (FIG. 7(a)). The set security level may be displayed in a block form (not shown). Alternatively, the set security level may be displayed as text or a numeral 701 adjacent to a tagged portion of the content (FIG. 7(b)). Finally, a scroll bar may be displayed to allow a user to search the security levels when multiple security levels have been set for the displayed content (not shown). - Furthermore, a
security setup menu 702 may be displayed on a predetermined area (FIG. 7(c)). The security setup menu 702 may display numbers corresponding to different menu items or different security levels. Accordingly, a user may change a preset security level by selecting a number displayed on the security setup menu 702. - The process of applying the security level when content is accessed on an SNS site is similar to the process disclosed with regard to
FIGS. 4 and 5. -
FIG. 8A illustrates an example of applying a security level for content according to an embodiment of the present invention. In response to a first user tagging content 810 (FIG. 8A(a)), the user may set a security level according to a method described in relation to FIGS. 6A to 6C. For example, the first user may set a group associated with a specific SNS site to a restricted setting; furthermore, the user may set a security level for the position information 603. - For example, the user may set the security level for a Facebook Classmate group to restricted and the security level for the position information as unrestricted for only a domestic location. Once the various security levels have been set, the first user may upload the content to an SNS site, and the security level will be registered with the SNS site. For purposes of this example, assume the first user has uploaded the content to Facebook and the security level has been registered with Facebook.
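The settings combined in this example (a restricted SNS group plus position information, drawn from the FIG. 6A menu items) can be bundled into a single record. A hedged sketch; the field names and the expiry check are assumptions, not the patent's data format:

```python
from dataclasses import dataclass
import time

@dataclass
class SecurityRule:
    site_group: str      # site information, e.g. "Facebook Classmate"
    device_type: str     # device information, e.g. "shared_pc" or "any"
    location: str        # position information, e.g. "domestic"
    level: str           # "restricted" or "unrestricted"
    expires_at: float    # security level duration, as an epoch timestamp

    def active(self, now=None):
        """A rule only applies while its duration has not elapsed."""
        return (now if now is not None else time.time()) < self.expires_at

# Restrict the Facebook Classmate group for roughly one month:
rule = SecurityRule("Facebook Classmate", "any", "any",
                    "restricted", time.time() + 30 * 24 * 3600)
```

Registering the level with the SNS site would then amount to uploading such a record alongside the content.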
- In this example, a domestic location setting may refer to a user located in the same geographic location, such as a city, state, or country, as the first user. In many configurations, setting an unrestricted setting for only a specific location will result in all other locations being set as restricted. However, the position information is not limited to being unrestricted for only a specific location. In some embodiments a user may set some locations with a restricted setting while setting other locations with an unrestricted setting.
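The domestic-only behavior described above amounts to a default-restricted rule with explicit exceptions. A minimal sketch under that assumption; all names are illustrative:

```python
def location_level(viewer_location, unrestricted_locations, explicit=None):
    """Resolve the position-information level for a viewer's location.

    Marking some locations unrestricted implicitly restricts all others;
    explicit per-location settings, when given, take precedence.
    """
    explicit = explicit or {}
    if viewer_location in explicit:
        return explicit[viewer_location]
    if viewer_location in unrestricted_locations:
        return "unrestricted"
    return "restricted"
```

The `explicit` mapping covers the embodiments where individual locations are restricted or unrestricted independently rather than via one domestic exception.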
- Once the
content 810 has been uploaded to Facebook, other Facebook users may view the content. Additionally, other users may tag the content 810 via Facebook. For example, a second user may tag the content 810. In this example, the second user may belong to a Facebook group which has an unrestricted security level, and therefore, the second user may have an uncensored view of the content 810 (FIG. 8A(b)). Since the second user is in an unrestricted group, the second user may tag the content 810; in other words, a user associated with a restricted security level may not tag content. - Accordingly, in response to setting a
tag 802 on the content 810, the second user may set additional security levels for the content 810. For example, the second user may set a security level to designate a third user as having a restricted security level. Thus, according to the present example, the content 810 uploaded by the first user would have three different security levels. - Thus, for example, when the third user, who is a member of a Facebook Classmate group, attempts to view the
content 810, the server may censor both the image of the first user and the image of the second user. Specifically, the image of the first user is censored because the first user had set the security level for members of the Facebook Classmate group to restricted. Additionally, the image of the second user is censored because the second user had set the security level for the third user to restricted. -
FIG. 8B illustrates an example of applying a security level set for content according to an embodiment of the present invention. As discussed with regard to FIG. 8A, content 810 uploaded by the first user may have a security level for a Facebook Classmate group set to restricted and may have position information set to unrestricted for only domestic access (FIG. 8B(a)). - According to the present example, when a second user associated with the Facebook Classmate group accesses the
content 810, a Facebook server decodes the corresponding content according to the set security levels and censors the output of the content 810. Accordingly, the image of the first user 801 is blurred when viewed by the second user (FIG. 8B(b)). - Furthermore, when a third user who is not located in the same domestic region as the first user accesses the
content 810, the server identifies the location of the third user and censors the entire content 810 (FIG. 8B(c)). The location of the third user may be determined according to IP information or GPS information of the terminal. In this example, the entire photo is blurred because the security level for the position information has a higher priority than the security level for groups associated with an SNS. The priority for the security levels may be set by a user or preset by each SNS. -
FIGS. 9A and 9B illustrate an example of applying a security level according to priority levels. Similar to the embodiment described with regard to FIG. 8A, content 910 uploaded by the first user 901 may have a security level for a Facebook Classmate group set to restricted and position information set to unrestricted for only domestic access (FIG. 9A). Furthermore, the content 910 is set with an additional security level of restricted when the content is accessed by a shared PC (FIG. 9A). - According to the present example, when a second user using a shared PC who is associated with an unrestricted group accesses the
content 910, an SNS server decodes the corresponding content according to the set security levels and does not censor the output of the content 910 (FIG. 9A(b)). Specifically, as illustrated in FIG. 9A(b), the content 910 is not censored because the security level set for the SNS group has a higher priority in comparison to the security level set for the device information 602. In other words, the unrestricted SNS group setting trumps the restricted shared PC setting. - Additionally, since the second user has unrestricted access, the second user may set a security level which designates a third user as having a restricted security level. Thus, according to the present example, the
content 910 uploaded by the first user would have four different security levels. - Accordingly, when the third user, who is a member of a Facebook Classmate group, attempts to view the
content 910, the server may censor the images of the first user and the second user. Specifically, the image of the first user is censored because the first user had set the security level for members of the Facebook Classmate group to restricted. Additionally, the image of the second user is censored because the second user had set the security level for the third user to restricted. - With regard to
FIG. 9B, content 910 uploaded by the first user may be set to have a security level for a Facebook Classmate group set to restricted, position information set to unrestricted for only domestic access, and device information set to restricted for shared PCs. - According to the present example, when a second user associated with the Facebook Classmate group accesses the
content 910 via a terminal which is set with unrestricted access, such as a private PC, a Facebook server decodes the corresponding content according to the set security levels and censors the output of the content 910. Accordingly, the image of the first user 901 is blurred when viewed by the second user, since the second user is associated with a restricted SNS group (FIG. 9B(b)). As previously discussed, the security level set for the SNS group has a higher priority in comparison to the security level set for the device information. - Furthermore, when a third user associated with an unrestricted group and located in an area which is not in the same domestic region as the first user accesses the
content 910, the server identifies the location of the third user and censors the entire content (FIG. 9B(c)). The location of the third user may be determined according to IP information or GPS information of the terminal utilized by the third user. In this example, the entire photo is blurred because the security level for the position information has a higher priority than the security level for groups associated with an SNS. - As discussed with regard to the various embodiments, when several security levels are set for specific content, priorities for applying the corresponding security levels to the content may be defined, and the security levels may be applied according to the defined priorities.
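The priority behavior in FIGS. 8 and 9 (position information outranking the SNS-group setting, which in turn outranks device information) can be sketched as follows. The ordering and names are assumptions; as noted above, the priorities may be set by a user or preset by each SNS:

```python
PRIORITY = ("position", "group", "device")   # highest priority first

def effective_level(levels):
    """Resolve several simultaneously applicable security levels.

    `levels` maps a category to "restricted" or "unrestricted"; the
    highest-priority category that is present decides the outcome.
    """
    for category in PRIORITY:
        if category in levels:
            return levels[category]
    return "unrestricted"                    # nothing set: show the content

# FIG. 9A: an unrestricted group setting outranks a restricted shared-PC one.
shown = effective_level({"group": "unrestricted", "device": "restricted"})
# FIG. 9B(c): a restricted position setting outranks an unrestricted group.
blurred = effective_level({"position": "restricted", "group": "unrestricted"})
```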
- The method for setting security levels has thus far been described with regard to setting a security level for each content item or for specific portions of each content item. Additionally, it is often useful to provide a method of setting the security levels for all content associated with a user, as opposed to setting a security level for each content item.
- According to an embodiment of the present invention, a first user may register a desired security level with an SNS site. Additionally, the desired security level may be stored on the SNS site or may be stored on a terminal associated with a second user. Thus, when the second user visits the SNS site associated with the first user, the stored security levels may be applied to the content associated with the first user. Additionally, the security levels may be applied when the second user attempts to upload content associated with the first user.
- In this example the security level may be associated with a user's tag information, such as a user's face, or a user's personal data, such as a user's phone number, user ID, email address, or home address. The security level sent to the second user may contain only the user's tag information, so as to prevent unauthorized access to the user's personal data. Accordingly, when the second user accesses content associated with the first user or uploads content associated with the first user, the tag information and personal data associated with the first user are recognized, and thereafter the information associated with the first user, such as the user's face, may be censored, or access to the content may be completely denied.
-
FIG. 10 illustrates an example of applying a security level according to tag information and personal data according to an embodiment of the present invention. As illustrated in FIG. 10, a first user 950 registers tag information, such as information related to a user's face, and personal data, including a phone number, an email address, or other information, in an SNS site such as Twitter. When a second user visits the SNS site associated with the first user, the registered tag information may be stored on a terminal belonging to the second user. - In this example, assuming that the second user has restricted access to content associated with the first user, when the second user downloads or views content associated with the first user via the SNS site, a server or the
controller 180 determines that the content is associated with the first user according to tag information, such as the first user's face, or personal data, and censors the content. As illustrated in FIG. 10(b), the censorship may comprise blurring the face of the first user 950. - According to this example, the security level according to the tag information or personal data may include the same security level settings disclosed with regard to the embodiments illustrated in
FIGS. 6-9. For example, the security level may be set according to site information, device information, or position information. - In one example, the user may set a security level for position information to restricted for a location, such as Las Vegas. Thus, when the second user captures an image of the first user in the restricted location (Las Vegas), the
mobile terminal 100 first determines that the captured image is associated with the first user via tag information or personal data, then determines that the image was captured in a restricted location, and censors the image. The location of the image capture is determined via GPS or other positioning methods. - Additionally, when the second user attempts to upload content associated with the first user or capture content associated with the first user, the
controller 180 determines that the content is associated with the first user according to tag information, such as the first user's face, or personal data, and censors or prevents the capture of the content. As illustrated in FIG. 10(c), the censorship may comprise blacking out the entire image of the first user 950. -
FIG. 11 is a flowchart illustrating a security setting method for a mobile terminal according to an embodiment of the present invention. In many configurations, the method illustrated in FIG. 11 is applied when a user sets a security level directly on content prior to uploading the content to an SNS site. - As illustrated in
FIG. 11, a first user sets a security level for content or a portion of the content (S10). The security level may be set automatically by the controller 180 via facial recognition or other techniques. Additionally, the security level may be set by the first user. In addition to setting a security level according to tag information or personal data, the first user may also set the security level according to, for example, site information, device information, or position information. - The set security level may be displayed on the display unit 151 to allow a user to check or adjust the security levels. A plurality of security levels may be displayed in a graphic form or a numeric form. The displayed order of the security levels may indicate the priority levels.
- Once the security level has been set, the
wireless communication unit 110 uploads the corresponding content to an SNS site, such as Twitter (S11). The security level is then registered with the site. - Once the content has been uploaded, when a second user attempts to access the content associated with the first user via the SNS site (S12), a server of the corresponding SNS site applies the security level set by the first user to the content (S13).
- The server of the corresponding SNS site determines whether the security level of the content is set to a restricted level or an unrestricted level for the second user (S14). If the security level for access by the second user is set to unrestricted, the content is displayed (S15), and the second user may view the content. However, if the security level for access by the second user is set to restricted, the content is censored (S16). Censorship of the content may refer to modifying the display of the content, such as pixelating or blurring the content. Alternatively, the content may be censored such that it is not displayed.
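The S14-S16 branch above reduces to a single check on the server. A hedged sketch, with hypothetical names and a placeholder marker standing in for whatever censorship the site applies:

```python
def serve_content(content, level_for_viewer):
    """Apply a registered security level when a viewer accesses content.

    Unrestricted viewers see the content as-is (S15); restricted viewers
    receive a censored form (S16), here just a placeholder marker.
    """
    if level_for_viewer == "unrestricted":
        return content                       # S15: display the content
    return "[censored]"                      # S16: censor the content

visible = serve_content("photo.jpg", "unrestricted")
hidden = serve_content("photo.jpg", "restricted")
```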
-
FIG. 12 is a flowchart illustrating a security level setting method for a mobile terminal according to an embodiment of the present invention. In many configurations, the method illustrated in FIG. 12 is applied when a user sets a security level on an SNS site or sends the security level directly to another user. - As illustrated in
FIG. 12, a first user sets tag information and personal data in a mobile terminal (S20). The controller 180 then registers the set tag information and the personal data in an SNS site, such as Facebook, via the wireless communication unit 110 (S21). - When a second user attempts to access content associated with the first user via the SNS site, the SNS site may send the tag information to be stored in a terminal of the second user (S23).
- In this example, when the second user downloads or views content associated with the first user, the server or
controller 180 automatically generates tags for information associated with the first user, such as the user's face, a phone number, or an address corresponding to the tag information and the personal data (S24). - The server of the corresponding SNS site determines whether the security level of the content is set to a restricted level or an unrestricted level for the second user (S25). If the security level for access by the second user is set to unrestricted, access to the content is allowed (S26), and the second user may view the content. However, if the security level for access by the second user is set to restricted, the content is censored (S27). Censorship of the content may refer to modifying the display of the content, such as pixelating or blurring the content. Alternatively, the content may be censored such that it is not displayed.
- Moreover, determining the security level of the content (S25) may also determine whether the security level for the content is set to a site-wide restricted or unrestricted setting, and is not limited to determining the security level for a specific user.
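Steps S23-S27 above amount to matching registered identifiers against the content and then branching on the level. A minimal sketch under hypothetical names; real tag information would be a face signature rather than a plain string:

```python
def auto_tag(content_fields, registered_tags):
    """S24: find which registered identifiers (face signature, phone
    number, address, ...) appear among the content's fields."""
    return sorted(t for t in registered_tags if t in content_fields)

def handle_access(content_fields, registered_tags, level):
    """S25-S27: allow access unless restricted content matches a tag."""
    if not auto_tag(content_fields, registered_tags):
        return "allowed"                     # content unrelated to the user
    return "allowed" if level == "unrestricted" else "censored"

outcome = handle_access({"face:user1", "caption"},
                        {"face:user1", "phone:555-0100"}, "restricted")
```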
- In many configurations, when a user executes a content capturing menu, such as a camera, the
controller 180 receives a preview image from a camera and displays the preview image on the display unit 151. The controller 180 may detect at least one face image from the preview image and store the detected face image or characteristic information related to the detected face image. The characteristic information related to the face image may be encoded with terminal-specific information, such as at least one of an electronic serial number (ESN), a phone number, or subscriber identity module (SIM) information.
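The paragraph above leaves the encoding of the characteristic information unspecified. One plausible sketch binds it to the terminal with a keyed hash; the use of HMAC-SHA256 here is purely an assumption for illustration, not the patent's scheme:

```python
import hashlib
import hmac

def encode_characteristics(face_features: bytes, terminal_key: bytes) -> str:
    """Bind face-characteristic data to one terminal via an HMAC tag,
    keyed by terminal-specific data such as an ESN, phone number, or
    SIM identifier."""
    return hmac.new(terminal_key, face_features, hashlib.sha256).hexdigest()

tag = encode_characteristics(b"face-feature-vector", b"ESN:12345678")
```

With such a construction, the stored record is only reproducible by a terminal holding the same terminal-specific key.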
- The disclosed methods can be implemented in a program recorded medium as computer-readable codes. The computer-readable media may include all types of recording devices in which data readable by a computer system are stored. The computer-readable media may include ROM, RAM, CD-ROM, magnetic tapes, floppy discs, and optical data storage devices, and may also include carrier-wave type implementations, such as transmission via the Internet.
- It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the present invention. Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Claims (22)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2010-0085646 | 2010-09-01 | ||
KR1020100085646A KR101788598B1 (en) | 2010-09-01 | 2010-09-01 | Mobile terminal and information security setting method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120054838A1 true US20120054838A1 (en) | 2012-03-01 |
US8813193B2 US8813193B2 (en) | 2014-08-19 |
Family
ID=44651180
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/217,212 Expired - Fee Related US8813193B2 (en) | 2010-09-01 | 2011-08-24 | Mobile terminal and information security setting method thereof |
Country Status (4)
Country | Link |
---|---|
US (1) | US8813193B2 (en) |
EP (1) | EP2426969B1 (en) |
KR (1) | KR101788598B1 (en) |
CN (1) | CN102387133B (en) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140091984A1 (en) * | 2012-09-28 | 2014-04-03 | Nokia Corporation | Method and apparatus for providing an indication regarding content presented to another user |
US20150104003A1 (en) * | 2013-10-10 | 2015-04-16 | Elwha Llc | Methods, systems, and devices for handling image data from captured images |
US9235711B1 (en) | 2014-06-24 | 2016-01-12 | Voxience S.A.R.L. | Systems, methods and devices for providing visual privacy to messages |
KR20160131886A (en) * | 2015-05-08 | 2016-11-16 | 삼성전자주식회사 | Terminal device and method for protecting information thereof |
WO2016182272A1 (en) * | 2015-05-08 | 2016-11-17 | Samsung Electronics Co., Ltd. | Terminal device and method for protecting information thereof |
US20160337673A1 (en) * | 2013-12-20 | 2016-11-17 | Siemens Aktiengesellschaft | Protection of privacy in a video stream by means of a redundant slice |
WO2017053214A1 (en) * | 2015-09-26 | 2017-03-30 | Microsoft Technology Licensing, Llc | Providing access to non-obscured content items based on triggering events |
US9794200B2 (en) | 2012-09-20 | 2017-10-17 | DeNA Co., Ltd. | Server device, method, and system |
US9799036B2 (en) | 2013-10-10 | 2017-10-24 | Elwha Llc | Devices, methods, and systems for managing representations of entities through use of privacy indicators |
US10013564B2 (en) | 2013-10-10 | 2018-07-03 | Elwha Llc | Methods, systems, and devices for handling image capture devices and captured images |
US10102543B2 (en) | 2013-10-10 | 2018-10-16 | Elwha Llc | Methods, systems, and devices for handling inserted data into captured images |
US10185841B2 (en) | 2013-10-10 | 2019-01-22 | Elwha Llc | Devices, methods, and systems for managing representations of entities through use of privacy beacons |
US20190028896A1 (en) * | 2013-11-26 | 2019-01-24 | At&T Intellectual Property I, L.P. | Security management on a mobile device |
US20190087608A1 (en) * | 2017-09-15 | 2019-03-21 | Paypal, Inc. | Providing privacy protection for data capturing devices |
US10346624B2 (en) | 2013-10-10 | 2019-07-09 | Elwha Llc | Methods, systems, and devices for obscuring entities depicted in captured images |
WO2019175685A1 (en) * | 2018-03-14 | 2019-09-19 | Sony Mobile Communications Inc. | Method, electronic device and social media server for controlling content in a video media stream using face detection |
US10834290B2 (en) | 2013-10-10 | 2020-11-10 | Elwha Llc | Methods, systems, and devices for delivering image data from captured images to devices |
US11350270B2 (en) * | 2016-06-22 | 2022-05-31 | Saronikos Trading And Services, Unipessoal Lda | Method, software, apparatus, electronic device, server and storage medium for ensuring privacy of communication |
US11546329B2 (en) * | 2019-03-25 | 2023-01-03 | Casio Computer Co., Ltd. | Portable communication terminal control system, portable communication terminal and recording medium |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9679606B2 (en) * | 2011-09-14 | 2017-06-13 | Cable Television Laboratories, Inc. | Method of modifying play of an original content form |
CN103023944B (en) * | 2011-09-27 | 2015-11-25 | 腾讯科技(深圳)有限公司 | The method and system of associated user are pushed in a kind of SNS network |
US9501702B2 (en) * | 2012-12-11 | 2016-11-22 | Unify Gmbh & Co. Kg | Method of processing video data, device, computer program product, and data construct |
KR101988319B1 (en) | 2013-09-09 | 2019-06-12 | 엘지전자 주식회사 | Mobile terminal and controlling method thereof |
KR102219464B1 (en) * | 2014-05-23 | 2021-02-25 | 삼성전자주식회사 | Operating method and Electronic device for security |
US9665697B2 (en) | 2015-03-17 | 2017-05-30 | International Business Machines Corporation | Selectively blocking content on electronic displays |
EP3110161B1 (en) * | 2015-06-23 | 2019-10-09 | Nokia Technologies Oy | Method, apparatus and computer program product for controlling access to concurrently captured images |
US10284558B2 (en) | 2015-08-12 | 2019-05-07 | Google Llc | Systems and methods for managing privacy settings of shared content |
CN105141507A (en) * | 2015-08-26 | 2015-12-09 | 努比亚技术有限公司 | Method and device for displaying head portrait for social application |
US9979684B2 (en) | 2016-07-13 | 2018-05-22 | At&T Intellectual Property I, L.P. | Apparatus and method for managing sharing of content |
KR102148937B1 (en) * | 2018-10-26 | 2020-08-31 | 서울여자대학교 산학협력단 | Cloud server and method of managing image file |
FR3114893B1 (en) * | 2020-10-05 | 2024-02-23 | Commissariat Energie Atomique | Method for administering the uploading of personal content to a web platform |
KR20220119970A (en) * | 2021-02-22 | 2022-08-30 | 삼성전자주식회사 | Electronic device displaying safety image and method of operating the same |
KR102444070B1 (en) * | 2022-04-05 | 2022-09-19 | 윤예영 | Apparatus and method for censoring images |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6937730B1 (en) | 2000-02-16 | 2005-08-30 | Intel Corporation | Method and system for providing content-specific conditional access to digital content |
US20050197922A1 (en) * | 2004-03-04 | 2005-09-08 | Peter Pezaris | Method and system for accessing and printing access-controlled photographs using a public computer network |
US7832003B2 (en) * | 2005-04-28 | 2010-11-09 | Microsoft Corporation | Walled gardens |
- 2010-09-01 KR KR1020100085646A patent/KR101788598B1/en active IP Right Grant
- 2011-08-24 US US13/217,212 patent/US8813193B2/en not_active Expired - Fee Related
- 2011-08-25 EP EP11178841.0A patent/EP2426969B1/en not_active Not-in-force
- 2011-08-31 CN CN201110255608.5A patent/CN102387133B/en not_active Expired - Fee Related
Patent Citations (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6158009A (en) * | 1997-10-17 | 2000-12-05 | Fujitsu Limited | Communication monitoring and controlling apparatus |
US20070220540A1 (en) * | 2000-06-12 | 2007-09-20 | Walker Jay S | Methods and systems for facilitating the provision of opinions to a shopper from a panel of peers |
US20040044894A1 (en) * | 2001-12-13 | 2004-03-04 | Lofgren Neil E. | Transforming data files into logical storage units for auxiliary data through reversible watermarks |
US20050021625A1 (en) * | 2002-01-18 | 2005-01-27 | Matsushita Elec. Ind. Co.Ltd. | Communication apparatus |
US20040204060A1 (en) * | 2002-03-20 | 2004-10-14 | Takumi Makinouchi | Communication terminal device capable of transmitting visage information |
US20030237006A1 (en) * | 2002-06-24 | 2003-12-25 | International Business Machines Corporation | Security objects controlling access to resources |
US20080034052A1 (en) * | 2002-07-02 | 2008-02-07 | International Business Machines Corporation | Application Prioritization in a Stateless Protocol |
US20040145654A1 (en) * | 2003-01-21 | 2004-07-29 | Nec Corporation | Mobile videophone terminal |
JP2005050012A (en) * | 2003-07-31 | 2005-02-24 | Casio Comput Co Ltd | Image outputting device, image outputting method, and image output processing program, and image distributing server, and image distribution processing program |
US20050055579A1 (en) * | 2003-08-21 | 2005-03-10 | Mitsuru Kanda | Server apparatus, and method of distributing a security policy in communication system |
US20050091518A1 (en) * | 2003-10-23 | 2005-04-28 | Agarwal Sameet H. | System and methods providing enhanced security model |
JP2005258679A (en) * | 2004-03-10 | 2005-09-22 | Advanced Telecommunication Research Institute International | Image photographing device |
US20080219493A1 (en) * | 2004-03-30 | 2008-09-11 | Yoav Tadmor | Image Processing System |
US20050246772A1 (en) * | 2004-04-30 | 2005-11-03 | Microsoft Corporation | System and method for zone transition mitigation with relation to a network browser |
US20060010155A1 (en) * | 2004-07-09 | 2006-01-12 | Microsoft Corporation | System that facilitates maintaining business calendars |
US7433473B2 (en) * | 2004-09-10 | 2008-10-07 | Nagracard S.A. | Data transmission method between a broadcasting center and a multimedia unit |
US20060064384A1 (en) * | 2004-09-15 | 2006-03-23 | Sharad Mehrotra | Apparatus and method for privacy protection of data collection in pervasive environments |
US20080117295A1 (en) * | 2004-12-27 | 2008-05-22 | Touradj Ebrahimi | Efficient Scrambling Of Regions Of Interest In An Image Or Video To Preserve Privacy |
US20060181547A1 (en) * | 2005-02-12 | 2006-08-17 | Patrick Loo | Method and system for image editing in a mobile multimedia processor |
US8073866B2 (en) * | 2005-03-17 | 2011-12-06 | Claria Innovations, Llc | Method for providing content to an internet user based on the user's demonstrated content preferences |
US20060238380A1 (en) * | 2005-04-21 | 2006-10-26 | Microsoft Corporation | Maintaining user privacy in a virtual earth environment |
US20090300480A1 (en) * | 2005-07-01 | 2009-12-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media segment alteration with embedded markup identifier |
US20100016003A1 (en) * | 2005-09-28 | 2010-01-21 | Ontela, Inc. | System and method for allowing a user to opt for automatic or selectively sending of media |
US20070150340A1 (en) * | 2005-12-28 | 2007-06-28 | Cartmell Brian R | Advertising technique |
US8407093B2 (en) * | 2005-12-28 | 2013-03-26 | Brian R. Cartmell | Advertising technique |
US7874013B2 (en) * | 2006-04-10 | 2011-01-18 | Sawteeth, Inc. | Secure and granular index for information retrieval |
US20080045189A1 (en) * | 2006-08-18 | 2008-02-21 | Samsung Electronics Co., Ltd. | Method of sharing information in mobile terminal using local wireless communication |
US20100128923A1 (en) * | 2007-03-06 | 2010-05-27 | Hitoshi Kiya | Image processing method and image inspecting method |
US20080229215A1 (en) * | 2007-03-14 | 2008-09-18 | Samuel Pierce Baron | Interaction In A Virtual Social Environment |
US20090016615A1 (en) * | 2007-07-11 | 2009-01-15 | Ricoh Co., Ltd. | Invisible Junction Feature Recognition For Document Security or Annotation |
US20090085918A1 (en) * | 2007-10-02 | 2009-04-02 | Crawford Adam Hollingworth | Method and device for creating movies from still image data |
US7747680B2 (en) * | 2007-10-30 | 2010-06-29 | Yahoo! Inc. | Community-based web filtering |
US8108359B1 (en) * | 2007-12-14 | 2012-01-31 | Symantec Corporation | Methods and systems for tag-based object management |
US20090216769A1 (en) * | 2008-02-26 | 2009-08-27 | Bellwood Thomas A | Digital Rights Management of Captured Content Based on Criteria Regulating a Combination of Elements |
US20090222766A1 (en) * | 2008-02-29 | 2009-09-03 | Lg Electronics Inc. | Controlling access to features of a mobile communication terminal |
US20090228486A1 (en) * | 2008-03-05 | 2009-09-10 | Kuehr-Mclaren David Gerard | Using social networking thresholds in access control decisions |
US20090259932A1 (en) * | 2008-04-14 | 2009-10-15 | International Business Machines Corporation | User-selectable hide option for a user interface, which is not persisted, and which is not dependent upon intra-document controls |
US20090292814A1 (en) * | 2008-05-22 | 2009-11-26 | Yahoo! Inc. | Federation and interoperability between social networks |
US20090323087A1 (en) * | 2008-06-30 | 2009-12-31 | Konica Minolta Systems Laboratory, Inc. | Systems and Methods for Document Redaction |
US8385971B2 (en) * | 2008-08-19 | 2013-02-26 | Digimarc Corporation | Methods and systems for content processing |
US20100199340A1 (en) * | 2008-08-28 | 2010-08-05 | Jonas Lawrence A | System for integrating multiple im networks and social networking websites |
US20100088364A1 (en) * | 2008-10-08 | 2010-04-08 | International Business Machines Corporation | Social networking architecture in which profile data hosting is provided by the profile owner |
US20100153848A1 (en) * | 2008-10-09 | 2010-06-17 | Pinaki Saha | Integrated branding, social bookmarking, and aggregation system for media content |
US20100110080A1 (en) * | 2008-11-05 | 2010-05-06 | Clive Goodinson | System and method for comic creation and editing |
US20100174722A1 (en) * | 2009-01-08 | 2010-07-08 | International Business Machines Corporation | Filters for shared content in an online community |
US20100246965A1 (en) * | 2009-03-31 | 2010-09-30 | Microsoft Corporation | Tagging video using character recognition and propagation |
US20100306815A1 (en) * | 2009-05-29 | 2010-12-02 | Embarq Holdings Company, Llc | System and method for sharing user content through a set-top box |
US20110161999A1 (en) * | 2009-12-30 | 2011-06-30 | Rovi Technologies Corporation | Systems and methods for selectively obscuring portions of media content using a widget |
US20110202968A1 (en) * | 2010-02-18 | 2011-08-18 | Nokia Corporation | Method and apparatus for preventing unauthorized use of media items |
US20110208572A1 (en) * | 2010-02-22 | 2011-08-25 | ASC Information Technology, Inc. | Systems and methods for providing a referral reward incentive for an item via a networking website |
US20120075464A1 (en) * | 2010-09-23 | 2012-03-29 | Stryker Corporation | Video monitoring system |
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9794200B2 (en) | 2012-09-20 | 2017-10-17 | DeNA Co., Ltd. | Server device, method, and system |
US20140091984A1 (en) * | 2012-09-28 | 2014-04-03 | Nokia Corporation | Method and apparatus for providing an indication regarding content presented to another user |
US10620902B2 (en) * | 2012-09-28 | 2020-04-14 | Nokia Technologies Oy | Method and apparatus for providing an indication regarding content presented to another user |
US10289863B2 (en) | 2013-10-10 | 2019-05-14 | Elwha Llc | Devices, methods, and systems for managing representations of entities through use of privacy beacons |
US10346624B2 (en) | 2013-10-10 | 2019-07-09 | Elwha Llc | Methods, systems, and devices for obscuring entities depicted in captured images |
US10834290B2 (en) | 2013-10-10 | 2020-11-10 | Elwha Llc | Methods, systems, and devices for delivering image data from captured images to devices |
US20150104003A1 (en) * | 2013-10-10 | 2015-04-16 | Elwha Llc | Methods, systems, and devices for handling image data from captured images |
US10185841B2 (en) | 2013-10-10 | 2019-01-22 | Elwha Llc | Devices, methods, and systems for managing representations of entities through use of privacy beacons |
US10102543B2 (en) | 2013-10-10 | 2018-10-16 | Elwha Llc | Methods, systems, and devices for handling inserted data into captured images |
US9799036B2 (en) | 2013-10-10 | 2017-10-24 | Elwha Llc | Devices, methods, and systems for managing representations of entities through use of privacy indicators |
US10013564B2 (en) | 2013-10-10 | 2018-07-03 | Elwha Llc | Methods, systems, and devices for handling image capture devices and captured images |
US20190028896A1 (en) * | 2013-11-26 | 2019-01-24 | At&T Intellectual Property I, L.P. | Security management on a mobile device |
US10820204B2 (en) * | 2013-11-26 | 2020-10-27 | At&T Intellectual Property I, L.P. | Security management on a mobile device |
US20210006978A1 (en) * | 2013-11-26 | 2021-01-07 | At&T Intellectual Property I, L.P. | Security management on a mobile device |
US11641581B2 (en) * | 2013-11-26 | 2023-05-02 | At&T Intellectual Property I, L.P. | Security management on a mobile device |
US20160337673A1 (en) * | 2013-12-20 | 2016-11-17 | Siemens Aktiengesellschaft | Protection of privacy in a video stream by means of a redundant slice |
US9235711B1 (en) | 2014-06-24 | 2016-01-12 | Voxience S.A.R.L. | Systems, methods and devices for providing visual privacy to messages |
KR20160131886A (en) * | 2015-05-08 | 2016-11-16 | 삼성전자주식회사 | Terminal device and method for protecting information thereof |
KR102468268B1 (en) * | 2015-05-08 | 2022-11-18 | 삼성전자주식회사 | Terminal device and method for protecting information thereof |
WO2016182272A1 (en) * | 2015-05-08 | 2016-11-17 | Samsung Electronics Co., Ltd. | Terminal device and method for protecting information thereof |
US10572674B2 (en) | 2015-05-08 | 2020-02-25 | Samsung Electronics Co., Ltd. | Terminal device and method for protecting information thereof |
US20170094019A1 (en) * | 2015-09-26 | 2017-03-30 | Microsoft Technology Licensing, Llc | Providing Access to Non-Obscured Content Items based on Triggering Events |
WO2017053214A1 (en) * | 2015-09-26 | 2017-03-30 | Microsoft Technology Licensing, Llc | Providing access to non-obscured content items based on triggering events |
US11350270B2 (en) * | 2016-06-22 | 2022-05-31 | Saronikos Trading And Services, Unipessoal Lda | Method, software, apparatus, electronic device, server and storage medium for ensuring privacy of communication |
US10754996B2 (en) * | 2017-09-15 | 2020-08-25 | Paypal, Inc. | Providing privacy protection for data capturing devices |
US20190087608A1 (en) * | 2017-09-15 | 2019-03-21 | Paypal, Inc. | Providing privacy protection for data capturing devices |
US20200364460A1 (en) * | 2018-03-14 | 2020-11-19 | Sony Corporation | Method, electronic device and social media server for controlling content in a video media stream using face detection |
WO2019175685A1 (en) * | 2018-03-14 | 2019-09-19 | Sony Mobile Communications Inc. | Method, electronic device and social media server for controlling content in a video media stream using face detection |
US11600066B2 (en) * | 2018-03-14 | 2023-03-07 | Sony Group Corporation | Method, electronic device and social media server for controlling content in a video media stream using face detection |
US11546329B2 (en) * | 2019-03-25 | 2023-01-03 | Casio Computer Co., Ltd. | Portable communication terminal control system, portable communication terminal and recording medium |
Also Published As
Publication number | Publication date |
---|---|
KR101788598B1 (en) | 2017-11-15 |
US8813193B2 (en) | 2014-08-19 |
EP2426969B1 (en) | 2017-11-01 |
CN102387133A (en) | 2012-03-21 |
KR20120066085A (en) | 2012-06-22 |
EP2426969A1 (en) | 2012-03-07 |
CN102387133B (en) | 2016-08-31 |
Similar Documents
Publication | Title |
---|---|
US8813193B2 (en) | Mobile terminal and information security setting method thereof |
EP2466921B1 (en) | Mobile terminal and screen data sharing application controlling method thereof |
US8433370B2 (en) | Mobile terminal and controlling method thereof |
US8996999B2 (en) | Mobile terminal determining whether to transmit display data according to privacy property, and controlling method thereof |
US8041390B2 (en) | Mobile terminal and data uploading method thereof |
US9622056B2 (en) | Mobile terminal and controlling method thereof for extracting available personal information corresponding to recognized faces |
US8611819B2 (en) | Mobile terminal and controlling method thereof |
US8515398B2 (en) | Mobile terminal and method for managing phone book data thereof |
US9195882B2 (en) | Mobile terminal and group generating method therein |
US20120170089A1 (en) | Mobile terminal and hologram controlling method thereof |
US10063679B2 (en) | Mobile terminal and incoming screen display method thereof |
US9021378B2 (en) | Mobile terminal and method for controlling virtual key pad thereof |
US8560973B2 (en) | Mobile terminal and method of displaying a plurality of objects by the mobile terminal |
US20110289394A1 (en) | Mobile terminal and controlling method thereof |
US8515461B2 (en) | Mobile terminal and controlling method thereof |
US8483708B2 (en) | Mobile terminal and corresponding method for transmitting new position information to counterpart terminal |
US20110111769A1 (en) | Mobile terminal and controlling method thereof |
US8265706B2 (en) | Mobile terminal and display controlling method thereof |
US20150146071A1 (en) | Mobile terminal and method for controlling the same |
US9648157B2 (en) | Mobile terminal and method of controlling information publication via a website in a mobile terminal |
US20110185290A1 (en) | Mobile terminal and controlling method thereof |
KR20100099419A (en) | Method for displaying image data and mobile terminal using the same |
KR20110080689A (en) | Terminal and method for sharing contents thereof |
KR20120047515A (en) | Terminal and method for controlling media access thereof |
KR20140067364A (en) | Mobile device and control method for the same |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: LG ELECTRONICS INC., KOREA, REPUBLIC OF. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, MOONJU;SONG, SUYEON;KWON, YUNMI;AND OTHERS;REEL/FRAME:026807/0170. Effective date: 20110816 |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551). Year of fee payment: 4 |
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20220819 |