WO2015063551A1 - Method and apparatus for filtering pictures - Google Patents

Method and apparatus for filtering pictures

Info

Publication number
WO2015063551A1
Authority
WO
WIPO (PCT)
Prior art keywords
filtering
pictures
picture
instruction
filtering condition
Prior art date
Application number
PCT/IB2014/000101
Other languages
French (fr)
Inventor
Arlan JIA
Zhangbin YIN
Yan JIAO
Nana ZHENG
Yijiao CHEN
Bizhong Ye
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2015063551A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/0044Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H04N1/00442Simultaneous viewing of a plurality of images, e.g. using a mosaic display arrangement of thumbnails
    • H04N1/00453Simultaneous viewing of a plurality of images, e.g. using a mosaic display arrangement of thumbnails arranged in a two dimensional array
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/0044Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H04N1/00458Sequential viewing of a plurality of images, e.g. browsing or scrolling
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/0044Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H04N1/00461Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet marking or otherwise tagging one or more displayed image, e.g. for selective reproduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/00464Display of information to the user, e.g. menus using browsers, i.e. interfaces based on mark-up languages
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3214Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3215Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3253Position information, e.g. geographical position at time of capture, GPS data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3261Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of multimedia information, e.g. a sound signal
    • H04N2201/3264Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of multimedia information, e.g. a sound signal of sound signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3261Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of multimedia information, e.g. a sound signal
    • H04N2201/3266Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of multimedia information, e.g. a sound signal of text or character information, e.g. text accompanying an image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3273Display

Definitions

  • the disclosure relates to an electronic device, and particularly, to a method and an apparatus for filtering pictures which are applicable to an electronic device, and the electronic device.
  • Electronic devices having the functions of image displaying and image processing have been widely used, such as a digital photo frame, a digital camera, a smart mobile terminal, etc.
  • Those electronic devices store digital pictures by a storage unit, process the digital pictures by a picture processing unit, and display the digital pictures on a display unit.
  • digital pictures are usually classified into corresponding folders for being managed according to a certain classification condition.
  • different folders may be set according to the place or time at which the pictures are taken, or the major characters in the pictures, so as to classify the pictures into corresponding folders.
  • the inventor of the present disclosure finds that when the pictures are viewed, each time only a folder meeting a specific classification condition (e.g., the place at which the pictures are taken is "Tokyo") can be entered, and if wanting to enter a folder meeting other classification conditions (e.g., the major character is "A"), he has to exit the current folder before entering another folder.
  • the embodiments of the present disclosure provide a method and an apparatus for filtering pictures, and an electronic device.
  • the object is to flexibly filter the pictures, and to filter the pictures according to multiple classification conditions.
  • a method for filtering pictures which is adopted to filter pictures in an electronic device, may include: determining a filtering condition according to user's selection, the filtering condition including a filtering type and a range corresponding thereto; judging whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and displaying selected pictures meeting the filtering condition.
  • method may further include: determining another filtering condition according to the user's selection, the another filtering condition including another filtering type and a range corresponding thereto; judging whether each picture in the selected pictures meets the another filtering condition according to attributes of the selected pictures; and displaying other selected pictures meeting the another filtering condition.
  • the determining a filtering condition according to user's selection may include: determining the filtering type of the filtering condition according to a first instruction; and determining the range of the filtering type according to a second instruction.
  • the determining a filtering condition according to user's selection may include: determining a filtering mode according to a third instruction, the filtering mode corresponding to at least one preset filtering condition; and determining the filtering condition according to the filtering mode.
  • the method for filtering pictures may further include storing the selected pictures to a preset location according to a fourth instruction.
  • the number of the filtering condition may be at least one, each including one filtering type and one range thereof.
  • the filtering type may include a place of the picture, time of the picture, a character of the picture, a color of the picture, text of the picture and/or audio corresponding to the picture.
  • an apparatus for filtering pictures which is adopted to filter pictures in an electronic device, including: a determining unit configured to determine a filtering condition according to user's selection, the filtering condition including a filtering type and a range corresponding thereto; a judging unit configured to judge whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and a displaying unit configured to display selected pictures meeting the filtering condition.
  • the determining unit may include: a filtering type determining unit configured to determine the filtering type of the filtering condition according to a first instruction; and a range determining unit configured to determine the range of the filtering type according to a second instruction.
  • the determining unit may include: a filtering mode determining unit configured to determine a filtering mode according to a third instruction, the filtering mode corresponding to at least one preset filtering condition; and a filtering condition determining unit configured to determine the filtering condition according to the filtering mode.
  • the apparatus for filtering pictures may further include a post-processing unit configured to store the selected pictures to a preset location according to a fourth instruction.
  • the embodiments of the present disclosure have the following beneficial effect: the mode of classifying the pictures through inherent folders is replaced by screening the pictures according to the filtering condition, thus the pictures can be filtered flexibly and conveniently, and multiple filtrations of the pictures can be realized easily.
  • the pictures in the electronic device can be conveniently classified and consolidated.
  • FIG. 1 is a flowchart of a method for filtering pictures according to Embodiment 1 of the present disclosure
  • FIG. 2 is a flowchart of a method for determining a filtering condition according to Embodiment 1 of the present disclosure
  • FIG. 3 is a schematic diagram where filtering type icons are displayed on a display interface according to Embodiment 1 of the present disclosure
  • FIG. 4 is a schematic diagram of setting a time range according to Embodiment 1 of the present disclosure
  • FIG. 5 is a schematic diagram where the setting of a time range is completed according to Embodiment 1 of the present disclosure
  • FIG. 6 is a schematic diagram of setting a character range according to Embodiment 1 of the present disclosure.
  • FIG. 7 is a schematic diagram of setting a place range according to Embodiment 1 of the present disclosure.
  • FIG. 8 is a structure diagram of an apparatus for filtering pictures according to Embodiment 2 of the present disclosure.
  • Fig. 9 is a structure diagram of a determining unit according to Embodiment 2 of the present disclosure.
  • FIG. 10 is a block diagram of a system structure of an electronic device according to Embodiment 3 of the present disclosure.
  • Embodiment 1 of the present disclosure provides a method for filtering pictures, which is adopted to filter pictures in an electronic device.
  • Fig. 1 is a flowchart of a method for filtering pictures according to Embodiment 1 of the present disclosure. As illustrated in Fig. 1, the method for filtering pictures includes:
  • Step 101: determining a filtering condition according to user's selection, the filtering condition including a filtering type and a range thereof;
  • Step 102: judging whether each picture meets the filtering condition according to attributes of pictures in the electronic device.
  • Step 103: displaying selected pictures meeting the filtering condition.
  • the electronic device may be a smart phone, a digital photo frame, a digital camera, etc.
  • the pictures may be those acquired by the electronic device through its own camera, those transmitted to the electronic device by another electronic device in a wired or wireless manner, or those in storage means connected to the electronic device.
  • the pictures of the electronic device in the present disclosure are not limited to those stored in the electronic device, and may be the pictures stored in another electronic device connected to the electronic device.
  • the filtering condition may include a filtering type and a range thereof, wherein the filtering type may be the type of a certain attribute of the picture, such as the place, time, character, color, text or corresponding audio of the picture.
  • the range of the filtering type may be the concrete content of the above information.
  • the range thereof may include the place represented with the name of country, city or region in the picture (e.g., "Shanghai", “Tokyo”, etc.), the place where the landmark or landscape in the picture is located, the geographical coordinates of the camera when the pictures are taken (e.g., longitude and/or latitude), or the distance between the location of the camera when the picture is taken and the residence of the picture-taker.
  • when the filtering type is the character in the picture, the range thereof may include the name of a major object in the picture (e.g., "Tomy", "Jerry", etc.), or the character having certain appearance characteristics (e.g., short hair, a round face, etc.).
  • the range thereof may include the particular text in the picture, or the particular notes added to the attribute of the picture.
  • the range thereof may include the particular rhythm, melody or tune of corresponding audio of the picture, wherein the corresponding audio of the picture for example may be the audio data of the environment when the picture is taken, or the audio data of a particular character, or the audio data set by the user according to the scene or content of the picture (e.g., the audio data of a piece of music).
  • the filtering type and its range are just exemplary, and in the implementation of the present disclosure, different filtering types and ranges may be selected for a picture filtration according to the user's personal preference and demand.
  • the above attribute information of the picture may be acquired in many manners.
  • the attribute information of the picture may be acquired according to the inherent attribute of the picture.
  • the time of the picture may be acquired according to the picture-taken time automatically stored in the picture attribute when the picture is taken or the modification time stored in the picture attribute when the picture is modified.
  • the above attribute information may also be added to the picture attribute when picture is processed.
  • the main color information of the picture is acquired by analyzing the picture colors, and then added to the picture attribute.
  • in step 101, the filtering condition for the picture selection may be determined.
  • the filtering type and the range thereof contained in the filtering condition may be determined in many manners. For example, firstly the filtering type for the picture filtration is determined, and then the range thereof is set. Or, a filtering mode is preset, which corresponds to at least one preset filtering condition, thereby indirectly determining the filtering type corresponding to the filtering mode and the range thereof.
  • a filtering instruction may be received, and it is judged whether the attributes of pictures in the electronic device meet the filtering condition determined in step 101, so as to select the pictures meeting the filtering condition.
  • the picture attributes may be compared with the filtering condition, and it is judged whether the picture meets the filtering condition according to the comparison result, so as to select the pictures meeting the filtering condition.
  • for the details of the picture filtering operation, please refer to the relevant art; a description thereof is omitted herein.
  • in step 103, the pictures meeting the filtering condition are displayed as the selected pictures.
  • the total number of the pictures meeting the filtering condition and/or the detailed information of each picture may be additionally displayed.
  • the method for filtering pictures according to Embodiment 1 may further include step 104: performing a predetermined processing on the selected pictures according to an instruction of the user (herein referred to as a fourth instruction), e.g., while displaying the selected pictures, displaying a virtual function key which corresponds to a preset function of replicating or moving the selected pictures to another folder, so that all the selected pictures are replicated or moved to another folder when the user clicks the virtual function key.
  • the other folder may be established in advance or concurrently with the replication or movement, and the embodiment is not limited thereto.
  • the above preset function is also an example, and the virtual function key may correspond to another preset function in another embodiment, e.g., sending to a designated address, printing, etc., so that once the virtual function key is clicked, the selected pictures are sent to the designated address, or printed when the printer works normally.
  • Fig. 2 is a flowchart of a method for determining a filtering condition according to Embodiment 1 of the present disclosure. As illustrated in Fig. 2, the method includes: Step 201: determining the filtering type of the filtering condition according to a first instruction; and
  • Step 202: determining the range of the filtering type according to a second instruction.
  • the filtering type may be displayed on the display interface according to an instruction from a User Interface (UI).
  • a plurality of filtering type icons may be displayed on the display interface of the electronic device, wherein each filtering type icon indicates one filtering type.
  • Fig. 3 is a schematic diagram where filtering type icons are displayed on a display interface according to Embodiment 1 of the present disclosure.
  • the display interface 300 includes a filtering type icon display area 301 and a picture display area 302, wherein the filtering type icon display area 301 is located at the upper portion of the display interface.
  • the filtering type icon display area 301 may also be located at another portion (e.g., lower portion, or left or right side) of the display interface, or float in front of the picture display area in a semi- transparent display format.
  • a filtering type icon 3011 is displayed in the filtering type icon display area 301.
  • the filtering type indicated by the filtering type icon 3011 for example may be the picture-taken time (TIME), the character (FACE), the picture-taken place (PLACE), the picture color (COLOR), etc.
  • a certain filtering type may be determined from a plurality of filtering types indicated by a plurality of filtering type icons 3011, according to a first instruction from the user or a preset first instruction.
  • the icon 3011 corresponding to a determined filtering type may be highlighted to emphasize the determined filtering type.
  • in step 202, for the determined filtering type, the range thereof is set.
  • a second instruction may be received through the display interface of the electronic device, and the range of the filtering type is determined according to the second instruction.
  • Fig. 4 is a schematic diagram of determining a time range according to Embodiment 1 of the present disclosure.
  • a timeline 401 may be displayed on the display interface 300.
  • the user can generate a second instruction indicating to determine the time range as one or more time points, or one or more time periods, etc.
  • the unit of the time point or time period may be hour, day, month, year, etc., which may be changed according to the user's setting.
  • the timeline 401 covers the filtering type icon display area 301, but the present disclosure is not limited thereto.
  • the timeline 401 may be located in another area of the display interface 300, rather than covering the filtering type icon display area 301.
  • Fig. 4 illustrates how to generate the second instruction for the picture-taken time by using the timeline as an example.
  • the second instruction may be generated in other manners. For example, after the filtering type is determined, an input field may be displayed on the display interface 300, and the user may input specific time information in the input field, so as to generate the second instruction.
  • in the following, how to set the range of the filtering type in step 202 will be described through examples in which the character (FACE), the picture-taken place (PLACE) or the picture color (COLOR) is taken as the filtering type and the range thereof is set, respectively.
  • Fig. 6 is a schematic diagram of setting a character range according to Embodiment 1 of the present disclosure.
  • a character list 601 may be displayed on the display interface 300.
  • the user may generate a second instruction by selecting the name or icon of a particular character in the character list, and the second instruction indicates to determine the character range as one or more characters.
  • the logical relation between the values thereof may be further indicated as "logical AND”, or "logical OR”, etc.
  • if the user selects the relation between the three characters "Angelababy", "Han Geng" and "Wang Likun" to be "logical OR", once the picture contains any one of the three characters, it meets the filtering condition.
  • if the user selects the relation between the three characters to be "logical AND", only a picture simultaneously containing the above three characters meets the filtering condition.
  • the attribute data of each picture may be added with the character attribute of the picture.
  • the character attribute of the picture may be artificially added to the attribute data of the picture in advance, e.g., the names of all characters in the picture or just the name of the major character may be added.
  • the faces in the picture may be automatically recognized according to the face template data of particular characters, and the information of the characters in the picture may be automatically added to the attribute data of the picture according to the recognition result.
  • the electronic device receives the generated second instruction, and determines the range of the filtering type according to the second instruction.
  • the information of the characters may not be added to each picture in advance; instead, after the range of the filtering type is set as one or more particular characters, the face template data of those particular characters is compared with the picture to be filtered, and it is directly judged whether the character attribute of the picture meets the filtering condition according to the comparison result.
  • the time for preprocessing the picture can be reduced, but the time for filtering the picture may be increased accordingly.
  • Fig. 7 is a schematic diagram of setting a place range according to Embodiment 1 of the present disclosure.
  • a distance line 701 may be displayed on the display interface 300 to indicate a distance between the picture-taken place and the user residence.
  • the display interface 300 may directly display a list of picture-taken places.
  • the display interface may also display other information indicating places.
  • the user may generate a second instruction by selecting the distance between the picture-taken place and the user residence or particular places in the list of picture-taken places. The second instruction indicates to determine the place range as a certain distance, or one or more particular places.
  • the attribute data of each picture may be added with the place attribute of the picture.
  • the place information of the picture may be artificially added to the attribute data of the picture in advance, e.g., the city name of the picture-taken place and/or the distance between the picture-taken place and the user residence is added.
  • the positioning function of the camera device may be activated to automatically record the coordinates of the geographical location of the camera device when the picture is taken, thereby acquiring the information of the distance between the picture-taken place and the user residence, or other information of the picture-taken place.
  • the electronic device receives the generated second instruction, and determines the range of the filtering type according to the instruction.
  • a color palette or a color bar may be displayed on the display interface 300.
  • the user generates a second instruction by selecting particular colors, and the second instruction indicates to set the color range as one or more colors.
  • the attribute data of each picture may be added with the color attribute of the picture.
  • the colors in the picture may be analyzed, e.g., using a color picking technology, to determine the main colors of the picture, and information of the main colors is added to the attribute data of the picture.
  • the present disclosure is not limited thereto, and information of other colors may also be used.
  • the electronic device receives the generated second instruction, and determines the value of the filtering type according to the second instruction.
  • the color information may not be added to each picture in advance; instead, after the range of the color filtering type is set as one or more particular colors, the colors of the picture are analyzed and the color attribute of the picture is extracted; then it is judged whether the picture meets the filtering condition by comparing the color attribute of the picture with the range of the color filtering type.
  • the time for preprocessing the picture can be reduced, but the time for filtering the picture may be increased accordingly.
  • the character list 601, the distance line 701 and the color palette may cover the filtering type icon display area 301, or they may be located in another area of the display interface 300, rather than covering the filtering type icon display area 301.
  • an input field may be displayed on the display interface 300, and the user may input specific characters, places, colors, etc. in the input field, so as to generate the second instruction.
  • the character range, the place range and/or the color range may be displayed in the display area of the filtering type icon 3011.
  • the pictures to be filtered may be displayed in the picture display area 302 of the display interface 300, and the selected pictures may be displayed in the picture display area 302 after the pictures are filtered, thereby directly reflecting the difference between the pictures before and after the filtration.
  • the picture display area 302 may not display the pictures to be filtered, but just displays the selected pictures after the pictures are filtered, so as to display a more extensive range of filtering type in the step of determining the filtering condition.
  • the time, character, place or color is used as the filtering type as an example and the present disclosure is not limited thereto.
  • Other filtering types may also be set upon the user's actual demand.
  • a filtering mode may be determined according to a third instruction, so as to determine the filtering condition.
  • the filtering mode(s) of the pictures may be preset, each including one or more filtering conditions, i.e., a preset filtering mode may be used to determine multiple filtering conditions.
  • one or more filtering modes may be set. Thus, by determining a filtering mode, its corresponding filtering condition(s) may be conveniently determined.
  • for example, filtering mode 1 may be preset, which takes "character" as the filtering type and "character A" as the range thereof; then the pictures of character A can be conveniently filtered just by determining filtering mode 1, without needing to determine the filtering type and the range thereof each time, thereby improving the convenience.
  • the filtering type and the range thereof for the multiple filtrations can be determined more conveniently by determining the filtering mode.
  • a new filtering condition may be determined based on the pictures previously selected, i.e., the new filtering type and the range thereof are determined to judge whether each of the selected pictures meets the new filtering condition, and the pictures meeting the new filtering condition are displayed as other selected pictures, thereby performing multiple filtrations on the pictures. For example, after the pictures are filtered by using the time as the filtering type, the selected pictures may be filtered by using the character as the filtering type, then filtered by using the place as the filtering type, and the pictures obtained after the multiple filtrations meet each filtering condition. Thus, the pictures may be filtered by combining several filtering types, thereby improving the flexibility of the picture filtration (see the sketch following this list).
  • alternatively, the filtering step may temporarily not be carried out when a filtering condition is determined, and other filtering conditions may continue to be determined.
  • then, the pictures are filtered according to the at least two determined filtering conditions and a received or preset filtering operation instruction, i.e., the pictures are filtered under multiple filtering conditions through only a single filtering operation.
  • the steps of multiple filtrations are reduced, and the multiple filtrations are more convenient.
  • a subsequent processing icon (the aforementioned virtual function key) 503 may be displayed on the display interface 300.
  • the user may generate a subsequent processing instruction by operating the subsequent processing icon 503, and perform subsequent processing on the selected pictures according to the subsequent processing instruction.
  • classified folders may be established for the selected pictures according to the received subsequent processing instruction.
  • the selected pictures are replicated or moved into the classified folders for storage, or the shortcuts of the selected pictures are stored into those folders.
  • the pictures in the electronic device can be conveniently classified and consolidated, and the pictures can be viewed, transmitted or shared conveniently through the classified folders when the electronic device is connected to another device.
  • the selected pictures may also be directly transmitted, printed, etc.
  • the pictures can be filtered flexibly and conveniently, and multiple filtrations of the pictures can be realized easily.
  • the pictures in the electronic device can be conveniently classified and consolidated.
  • Embodiment 2 of the present disclosure provides an apparatus for filtering pictures, which corresponds to the method for filtering pictures in Embodiment 1, and the same contents are omitted herein.
  • Fig. 8 is a structure diagram of an apparatus for filtering pictures according to Embodiment 2 of the present disclosure.
  • the apparatus 800 for filtering pictures includes: a determining unit 801 configured to determine a filtering condition according to user's selection, the filtering condition including a filtering type and a range thereof; a judging unit 802 configured to judge whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and a displaying unit 803 configured to display pictures meeting the filtering condition as selected pictures.
  • the determining unit 801 may determine at least two filtering conditions, and the judging unit 802 may filter the pictures according to the at least two filtering conditions.
  • the apparatus 800 for filtering pictures may further comprise a post-processing unit 804 configured to store the pictures acquired through the filtration (i.e., the selected pictures) to a preset location, or perform a subsequent processing such as storage, transmission, printing, etc.
  • Fig. 9 is a structure diagram of a determining unit 801 according to Embodiment 2 of the present disclosure.
  • the determining unit 801 may include: a filtering type determining unit 901 configured to determine the filtering type of the filtering condition; and a range determining unit 902 configured to determine the range of the filtering type according to the second instruction.
  • the determining unit 801 may also include a filtering mode determining unit and a filtering condition determining unit, wherein the filtering mode determining unit determines a filtering mode according to a third instruction; and the filtering condition determining unit determines the filtering condition according to the filtering mode.
  • the filtering condition determining unit for example may include a memory which stores a query table of the correspondence between the filtering mode and the filtering condition, and the filtering condition corresponding to the filtering mode can be obtained by inquiring the query table.
  • the apparatus for filtering pictures can select pictures flexibly and conveniently, and multiple filtrations of the pictures can be realized easily.
  • the pictures in the electronic device can be conveniently classified and consolidated
  • the embodiment of the present disclosure further provides an electronic device
  • Fig. 10 is a block diagram of a system structure of an electronic device 1000 according to Embodiment 3 of the present disclosure, including a picture processing and displaying apparatus 160 which includes the apparatus 800 for filtering pictures as described in Embodiment 2.
  • the diagram is schematic, and the structure may be supplemented or replaced with another type of structure to realize the telecom function or other function.
  • the electronic device 1000 may further include a Central Processing Unit (CPU) 100, a communication module 110, an input unit 120, an audio processor 130, a memory 140, a camera 150 and a power supply 170.
  • the CPU 100 (sometimes called a controller or operation control, including a microprocessor or other processor unit and/or logic unit) receives an input, and controls respective parts of the electronic device 1000 and operations thereof.
  • the input unit 120 provides an input to the CPU 100.
  • the input unit 120 for example is a key or touch input means.
  • the camera 150 is configured to take image data, and provide the taken image data to the CPU 100 for a conventional usage, such as storage, transmission, etc.
  • the power supply 170 is configured to supply power to the electronic device 1000.
  • the picture processing and displaying apparatus 160 is configured to process and display an object such as picture, video and text.
  • the memory 140 is coupled to the CPU 100.
  • the memory 140 may be a solid state memory, such as Read Only Memory (ROM), Random Access Memory (RAM), SIM card, etc., and it may also be a memory which retains information even when the power is off, and which can be selectively erased and provided with more data.
  • such a memory is sometimes referred to as an EPROM.
  • the memory 140 also may be of another type.
  • the memory 140 includes a buffer memory 141 (sometimes referred to as buffer);
  • the memory 140 may include an application/function storage section 142 configured to store application programs and function programs, or perform the operation flow of the electronic device 1000 through the CPU 100.
  • the memory 140 may further include a data storage section 143 configured to store data, such as contact person, digital data, pictures, sounds and/or any other data used by the electronic device.
  • a drive program storage section 144 of the memory 140 may include various drive programs of the electronic device for the communication function and/or for performing other functions of the electronic device (e.g., messaging application, address book application, etc.).
  • the communication module 110 is a transmitter/receiver 110 which transmits and receives signals via an antenna 111.
  • the communication module (transmitter/receiver) 110 is coupled to the CPU 100, so as to provide an input signal and receive an output signal, which may be the same as the situation of the conventional mobile communication terminal.
  • the same electronic device may be provided with a plurality of communication modules 110, such as cellular network module, Bluetooth module and/or wireless local area network (WLAN) module.
  • the communication module (transmitter/receiver) 110 is further coupled to a speaker 131 via the audio processor 130, so as to provide an audio output via the speaker 131.
  • the audio processor 130 may include any suitable buffer, decoder, amplifier, etc.
  • the embodiment of the present disclosure further provides a computer readable program, which when being executed in an electronic device, enables a computer to perform the method for filtering pictures as described in Embodiment 1 in the electronic device.
  • the embodiment of the present disclosure further provides a storage medium storing a computer readable program which enables a computer to perform the method for filtering pictures as described in Embodiment 1 in an electronic device.
  • each of the parts of the present disclosure may be implemented by hardware, software, firmware, or combinations thereof.
  • multiple steps or methods may be implemented by software or firmware stored in the memory and executed by an appropriate instruction executing system.
  • if the implementation uses hardware, it may be realized by any one of the following technologies known in the art, or combinations thereof, as in another embodiment: a discrete logic circuit having a logic gate circuit for realizing logic functions of data signals, an application-specific integrated circuit having an appropriate combined logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
  • Any process, method or block in the flowchart or described in other manners herein may be understood as being indicative of including one or more modules, segments or parts for realizing the codes of executable instructions of the steps in specific logic functions or processes, and the scope of the preferred embodiments of the present disclosure includes other implementations, wherein the functions may be executed in manners different from those shown or discussed (e.g., according to the related functions in a substantially simultaneous manner or in a reverse order), which shall be understood by a person skilled in the art.
  • the logic and/or steps shown in the flowcharts or described in other manners herein may be, for example, understood as a sequencing list of executable instructions for realizing logic functions, which may be implemented in any computer readable medium, for use by an instruction executing system, apparatus or device (such as a system based on a computer, a system including a processor, or other systems capable of extracting instructions from an instruction executing system, apparatus or device and executing the instructions), or for use in combination with the instruction executing system, apparatus or device.
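The filtering modes and multiple filtrations described in the items above can be summarized in a short, hypothetical sketch. The Python code below is not part of the disclosure; the names (FilteringCondition, FILTERING_MODES, filter_pictures) and the attribute layout are assumptions made only for illustration. It models a filtering condition as a filtering type paired with a range, lets a preset filtering mode expand into one or more preset filtering conditions, and shows a chained filtration as well as a single filtering operation that applies several conditions at once.

```python
# Illustrative sketch only; names and data layout are hypothetical, not from the patent.
from dataclasses import dataclass
from typing import Any, Callable, Dict, Iterable, List


@dataclass
class FilteringCondition:
    filtering_type: str                  # e.g. "time", "character", "place"
    value_range: Any                     # the concrete range of that type
    matches: Callable[[Any, Any], bool]  # compares a picture attribute with the range


def meets(picture: Dict[str, Any], condition: FilteringCondition) -> bool:
    """Judge one picture (given as an attribute dict) against one filtering condition."""
    attribute = picture.get(condition.filtering_type)
    return attribute is not None and condition.matches(attribute, condition.value_range)


def filter_pictures(pictures: Iterable[Dict[str, Any]],
                    conditions: List[FilteringCondition]) -> List[Dict[str, Any]]:
    """Single filtering operation: keep pictures that meet every determined condition."""
    return [p for p in pictures if all(meets(p, c) for c in conditions)]


# A preset filtering mode maps to one or more preset filtering conditions,
# so selecting the mode indirectly determines the filtering type(s) and range(s).
FILTERING_MODES: Dict[str, List[FilteringCondition]] = {
    "mode_1": [FilteringCondition("character", {"character A"},
                                  lambda names, wanted: bool(set(names) & wanted))],
}

pictures = [
    {"path": "p1.jpg", "time": "2013-05-01", "character": ["character A", "character B"]},
    {"path": "p2.jpg", "time": "2014-02-10", "character": ["character A"]},
]

time_condition = FilteringCondition("time", ("2013-01-01", "2013-12-31"),
                                    lambda t, r: r[0] <= t <= r[1])

# Multiple filtrations: filter by time first, then refine the selected pictures by mode_1;
# the resulting pictures meet each filtering condition.
selected = filter_pictures(pictures, [time_condition])
selected = filter_pictures(selected, FILTERING_MODES["mode_1"])
print([p["path"] for p in selected])  # -> ['p1.jpg']
```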

Abstract

The embodiments of the present disclosure provide a method and an apparatus for filtering pictures and an electronic device. The method is adopted to filter pictures in the electronic device, including: determining a filtering condition according to user's selection, the filtering condition including a filtering type and a range thereof; judging whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and displaying pictures meeting the filtering condition as selected pictures. According to the method and the apparatus for filtering pictures and the electronic device provided by the embodiments of the present invention, the flexibility of the picture classification is improved by filtering the pictures according to the filtering condition, and the pictures in the electronic device can be conveniently classified and consolidated through the subsequent processing of the selected pictures.

Description

Description
METHOD AND APPARATUS FOR FILTERING PICTURES
Technical Field
[0001] The disclosure relates to an electronic device, and particularly, to a method and an apparatus for filtering pictures which are applicable to an electronic device, and the electronic device.
[0002] Cross-Reference to Related Application and Priority Claim
This application claims priority from Chinese patent application No.
201310534552.6, filed November 1, 2013, the entire disclosure of which hereby is incorporated by reference.
Background Art
[0003] Electronic devices having the functions of image displaying and image processing have been widely used, such as a digital photo frame, a digital camera, a smart mobile terminal, etc. Those electronic devices store digital pictures by a storage unit, process the digital pictures by a picture processing unit, and display the digital pictures on a display unit.
[0004] With the improvement of the picture storage and processing capacities, storing and processing more digital pictures in an electronic device have become reality. While the digital pictures stored and processed in the electronic device are continuously increasing, how to classify and select those digital pictures to effectively manage them is a problem to be solved.
[0005] To be noted, the above introduction to the technical background is just made for the convenience of clearly and completely describing the technical solutions of the present disclosure, and to facilitate the understanding of a person skilled in the art. It shall not be deemed that the above technical solutions are known to a person skilled in the art just because they have been illustrated in the Background section of the present disclosure.
Summary
[0006] In the relevant art, digital pictures are usually classified into corresponding folders for being managed according to a certain classification condition. For example, different folders may be set according to the place or time at which the pictures are taken, or the major characters in the pictures, so as to classify the pictures into corresponding folders.
[0007] During the implementation of the relevant art, the inventor of the present disclosure finds that when the pictures are viewed, each time only a folder meeting a specific classification condition (e.g., the place at which the pictures are taken is "Tokyo") can be entered, and if wanting to enter a folder meeting other classification conditions (e.g., the major character is "A"), he has to exit the current folder before entering another folder. Thus, the relevant art cannot flexibly change the picture classification condition, and cannot classify the pictures according to multiple classification conditions, thereby degrading the user experience.
[0008] The embodiments of the present disclosure provide a method and an apparatus for filtering pictures, and an electronic device. The object is to flexibly filter the pictures, and to filter the pictures according to multiple classification conditions.
[0009] According to an aspect of an embodiment of the present disclosure, a method for filtering pictures is provided, which is adopted to filter pictures in an electronic device, may include: determining a filtering condition according to user's selection, the filtering condition including a filtering type and a range corresponding thereto; judging whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and displaying selected pictures meeting the filtering condition.
[0010] According to another aspect of the embodiments of the present disclosure, the
method may further include: determining another filtering condition according to the user's selection, the another filtering condition including another filtering type and a range corresponding thereto; judging whether each picture in the selected pictures meets the another filtering condition according to attributes of the selected pictures; and displaying other selected pictures meeting the another filtering condition.
[0011] According to another aspect of an embodiment of the present disclosure, the determining a filtering condition according to user's selection may include: determining the filtering type of the filtering condition according to a first instruction; and determining the range of the filtering type according to a second instruction.
[0012] According to another aspect of an embodiment of the present disclosure, the determining a filtering condition according to user's selection may include: determining a filtering mode according to a third instruction, the filtering mode corresponding to at least one preset filtering condition; and determining the filtering condition according to the filtering mode.
[0013] According to another aspect of an embodiment of the present disclosure, the method for filtering pictures may further include storing the selected pictures to a preset location according to a fourth instruction.
[0014] According to another aspect of an embodiment of the present disclosure, the number of the filtering condition may be at least one, each including one filtering type and one range thereof.
[0015] According to another aspect of an embodiment of the present disclosure, the filtering type may include a place of the picture, time of the picture, a character of the picture, a color of the picture, text of the picture and/or audio corresponding to the picture.
[0016] According to still another aspect of an embodiment of the present disclosure, an apparatus for filtering pictures is provided, which is adopted to filter pictures in an electronic device, including: a determining unit configured to determine a filtering condition according to user's selection, the filtering condition including a filtering type and a range corresponding thereto; a judging unit configured to judge whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and a displaying unit configured to display selected pictures meeting the filtering condition.
[0017] According to still another aspect of an embodiment of the present disclosure, the determining unit may include: a filtering type determining unit configured to determine the filtering type of the filtering condition according to a first instruction; and a range determining unit configured to determine the range of the filtering type according to a second instruction.
[0018] According to still another aspect of an embodiment of the present disclosure, wherein the determining unit may include: a filtering mode determining unit configured to determine a filtering mode according to a third instruction, the filtering mode corresponding to at least one preset filtering condition; and a filtering condition determining unit configured to determine the filtering condition according to the filtering mode.
[0019] According to still another aspect of an embodiment of the present disclosure, wherein the apparatus for filtering pictures may further include a post-processing unit configured to store the selected pictures to a preset location according to a fourth instruction.
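For orientation only, the unit structure described in paragraphs [0016] to [0019] might be organized roughly as in the following Python sketch. It is an assumption made for illustration: the class names, the attribute-dictionary representation of a picture, and the copy-based storage are not specified by the disclosure.

```python
# Hypothetical sketch of the apparatus units; names and behaviour are illustrative only.
import shutil
from pathlib import Path
from typing import Any, Dict, List, Tuple


class DeterminingUnit:
    def determine(self, filtering_type: str, value_range: Any) -> Tuple[str, Any]:
        # A first instruction selects the filtering type, a second instruction its range.
        return (filtering_type, value_range)


class JudgingUnit:
    def judge(self, picture: Dict[str, Any], condition: Tuple[str, Any]) -> bool:
        # Compare the picture attribute named by the filtering type with the range.
        filtering_type, value_range = condition
        return picture.get(filtering_type) in value_range


class DisplayingUnit:
    def display(self, selected: List[Dict[str, Any]]) -> None:
        print(f"{len(selected)} picture(s) meet the filtering condition")


class PostProcessingUnit:
    def store(self, selected: List[Dict[str, Any]], preset_location: str) -> None:
        # Fourth instruction: replicate the selected pictures to a preset folder.
        target = Path(preset_location)
        target.mkdir(parents=True, exist_ok=True)
        for picture in selected:
            shutil.copy(picture["path"], target / Path(picture["path"]).name)
```

An apparatus object would then simply hold one instance of each unit and invoke them in the order determine, judge, display and, optionally, store.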
[0020] According to yet another aspect of an embodiment of the present disclosure, an
electronic device is provided, including the aforementioned apparatus for filtering pictures.
[0021] The embodiments of the present disclosure have the following beneficial effect: the mode of classifying the pictures through inherent folders is replaced by screening the pictures according to the filtering condition, thus the pictures can be filtered flexibly and conveniently, and multiple filtrations of the pictures can be realized easily. In addition, through the subsequent processing of the selected pictures, the pictures in the electronic device can be conveniently classified and consolidated.
[0022] These and other aspects of the present disclosure will be clear with reference to the following descriptions and drawings. In those descriptions and drawings, the specific embodiments of the present disclosure are concretely disclosed to represent some implementations of the principle of the present disclosure. But it shall be appreciated that the scope of the present disclosure is not limited thereto. On the contrary, the present disclosure includes all changes, modifications and equivalents falling within the scope of the spirit and the connotation of the accompanied claims.
[0023] Features described and/or illustrated with respect to one embodiment can be used in one or more other embodiments in a same or similar way, and/or by being combined with or replacing the features in other embodiments.
[0024] To be noted, the term "comprise/include" used herein specifies the presence of
a feature, element, step or component, not excluding the presence or addition of one or more other features, elements, steps or components.
[0025] Many aspects of the present disclosure will be understood better with reference to the following drawings. The components in the drawings are not surely drafted in proportion, and the emphasis lies in clearly illustrating the principle of the present disclosure. For the convenience of illustrating and describing some portions of the present disclosure, corresponding portions in the drawings may be enlarged, e.g., being more enlarged relative to other portions than the situation in the exemplary device practically manufactured according to the present disclosure. The parts and features illustrated in one drawing or embodiment of the present disclosure may be combined with the parts and features illustrated in one or more other drawings or embodiments. In addition, the same reference signs denote corresponding portions throughout the drawings, and they can be used to denote the same or similar portions in more than one embodiment.
Brief Description of Drawings
[0026] The included drawings are provided for further understanding of the present
disclosure, and they constitute a part of the Specification. The drawings illustrate the preferred embodiments of the present disclosure, and they are used to explain the principle of the present disclosure together with the text, wherein the same or similar element is denoted with the same reference sign, and "primed" reference numerals represent elements that are the same or similar to elements that are designated by the same unprimed reference numeral, and so on.
In the drawings:
[fig. l ]Fig. 1 is a flowchart of a method for filtering pictures according to Embodiment 1 of the present disclosure;
[fig.2]Fig. 2 is a flowchart of a method for determining a filtering condition according to Embodiment 1 of the present disclosure;
[fig.3]Fig. 3 is a schematic diagram where filtering type icons are displayed on a display interface according to Embodiment 1 of the present disclosure;
[fig.4]Fig. 4 is a schematic diagram of setting a time range according to Embodiment 1 of the present disclosure;
[fig.5]Fig. 5 is a schematic diagram where the setting of a time range is completed according to Embodiment 1 of the present disclosure;
[fig.6]Fig. 6 is a schematic diagram of setting a character range according to Embodiment 1 of the present disclosure;
[fig.7]Fig. 7 is a schematic diagram of setting a place range according to Embodiment 1 of the present disclosure;
[fig.8]Fig. 8 is a structure diagram of an apparatus for filtering pictures according to Embodiment 2 of the present disclosure;
[fig.9]Fig. 9 is a structure diagram of a determining unit according to Embodiment 2 of the present disclosure; and
[fig.10]Fig. 10 is a block diagram of a system structure of an electronic device according to Embodiment 3 of the present disclosure.
Description of Embodiments
[0027] The embodiments of the present disclosure will be described as follows with
reference to the drawings. Those embodiments are just exemplary, rather than limitations to the present disclosure.
[0028] Embodiment 1
[0029] Embodiment 1 of the present disclosure provides a method for filtering pictures, which is adapted to filter pictures in an electronic device. Fig. 1 is a flowchart of a method for filtering pictures according to Embodiment 1 of the present disclosure. As illustrated in Fig. 1, the method for filtering pictures includes:
Step 101: determining a filtering condition according to user's selection, the filtering condition including a filtering type and a range thereof;
Step 102: judging whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and
Step 103: displaying selected pictures meeting the filtering condition.
[0030] In this embodiment, the electronic device may be a smart phone, a digital photo
frame, a digital camera, etc. As the filtering object, the pictures may be those acquired by the electronic device through its own camera, those transmitted to the electronic device by another electronic device in a wired or wireless manner, or those in storage means connected to the electronic device. In conclusion, "the pictures of the electronic device" in the present disclosure are not limited to those stored in the electronic device, and may be the pictures stored in another electronic device connected to the electronic device.
[0031] In this embodiment, the filtering condition may include a filtering type and a range thereof, wherein the filtering type may be the type of a certain attribute of the picture, such as the place, time, character, color, text or corresponding audio of the picture. The range of the filtering type may be the concrete content of the above information. For example, when the filtering type is the place of the picture, the range thereof may include the place represented by the name of the country, city or region in the picture (e.g., "Shanghai", "Tokyo", etc.), the place where the landmark or landscape in the picture is located, the geographical coordinates of the camera when the picture is taken (e.g., longitude and/or latitude), or the distance between the location of the camera when the picture is taken and the residence of the picture-taker. When the filtering type is the character in the picture, the range thereof may include the name of a major object in the picture (e.g., "Tomy", "Jerry", etc.), or a character having certain appearance characteristics (e.g., short hair, a round face, etc.). When the filtering type is the text in the picture, the range thereof may include particular text in the picture, or particular notes added to the attributes of the picture. When the filtering type is the audio corresponding to the picture, the range thereof may include a particular rhythm, melody or tune of the corresponding audio of the picture, wherein the corresponding audio of the picture for example may be the audio data of the environment when the picture is taken, the audio data of a particular character, or audio data set by the user according to the scene or content of the picture (e.g., the audio data of a piece of music). The above descriptions of the filtering type and its range are just exemplary, and in the implementation of the present disclosure, different filtering types and ranges may be selected for a picture filtration according to the user's personal preference and demand.
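In implementation terms, each such filtering condition can be held as a small pair of the filtering type and its range. The following is only a minimal, hypothetical sketch (in Python) of that pairing, with example values echoing those used in this description; the names and the tuple form are illustrative and not part of the disclosure.

```python
# One filtering condition pairs a filtering type with the range set for that type.
time_condition = ("TIME", ("2013-04-13", "2013-04-27"))   # a time period
place_condition = ("PLACE", {"Shanghai", "Tokyo"})        # one or more place names
face_condition = ("FACE", {"Tomy", "Jerry"})              # one or more characters
color_condition = ("COLOR", {"red", "white"})             # one or more main colors

# A filtering request may combine several such conditions for multiple filtrations.
filtering_conditions = [time_condition, face_condition]
```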
[0032] In this embodiment, the above attribute information of the picture may be acquired in many manners. In some implementations, the attribute information may be acquired from the inherent attributes of the picture. For example, the time of the picture may be acquired from the picture-taken time automatically stored in the picture attributes when the picture is taken, or from the modification time stored in the picture attributes when the picture is modified. In other implementations, the above attribute information may also be added to the picture attributes when the picture is processed. For example, the main color information of the picture is acquired by analyzing the picture colors, and then added to the picture attributes.
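As an illustration of reading such inherent attributes, the sketch below uses the Pillow library to read the EXIF "DateTime" tag and falls back to the file modification time; the tag choice and the fallback are assumptions of this example, not requirements of the disclosure.

```python
import os
from datetime import datetime

from PIL import Image  # Pillow

EXIF_DATETIME = 306  # the "DateTime" tag in the primary EXIF directory


def picture_time(path: str) -> datetime:
    """Return the time stored in the picture's EXIF data, or the file's modification time."""
    with Image.open(path) as img:
        raw = img.getexif().get(EXIF_DATETIME)
    if raw:
        return datetime.strptime(str(raw), "%Y:%m:%d %H:%M:%S")
    return datetime.fromtimestamp(os.path.getmtime(path))
```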
[0033] In step 101, the filtering condition for the picture selection may be determined
according to the user's demand. In the implementation, the filtering type and the range thereof contained in the filtering condition may be determined in many manners. For example, firstly the filtering type for the picture filtration is determined, and then the range thereof is set. Or, a filtering mode is preset, which corresponds to at least one preset filtering condition, thereby indirectly determining the filtering type corresponding to the filtering mode and the range thereof.
[0034] In step 102, a filtering instruction may be received, and it is judged whether the attributes of pictures in the electronic device meet the filtering condition determined in step 101, so as to select the pictures meeting the filtering condition. When the selecting operation is specifically performed, for example, the picture attributes may be compared with the filtering condition, and whether the picture meets the filtering condition is judged according to the comparison result, so as to select the pictures meeting the filtering condition. For the implementation of the picture filtering operation, please refer to the relevant art; details are omitted herein.
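A minimal sketch of this judging step is given below; it assumes each picture's attributes have already been collected into a dictionary keyed by filtering type, and the helper names are hypothetical rather than taken from the disclosure.

```python
def meets_condition(attributes: dict, filter_type: str, value_range) -> bool:
    """Return True if a picture's attributes satisfy one filtering condition."""
    value = attributes.get(filter_type)
    if value is None:
        return False
    if filter_type == "TIME":
        start, end = value_range
        return start <= value <= end
    # For FACE, PLACE, COLOR, etc., the range is a collection of acceptable values.
    return value in value_range


def filter_pictures(pictures: dict, filter_type: str, value_range) -> list:
    """Steps 102-103: keep only the pictures whose attributes meet the condition."""
    return [path for path, attrs in pictures.items()
            if meets_condition(attrs, filter_type, value_range)]


# Example: pictures keyed by path, attributes keyed by filtering type.
pictures = {
    "a.jpg": {"TIME": "2013-04-15", "PLACE": "Shanghai"},
    "b.jpg": {"TIME": "2013-06-01", "PLACE": "Tokyo"},
}
print(filter_pictures(pictures, "TIME", ("2013-04-13", "2013-04-27")))  # ['a.jpg']
```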
[0035] In step 103, the pictures meeting the filtering condition are displayed as the selected pictures. In addition, optionally the total number of the pictures meeting the filtering condition and/or the detailed information of each picture may be additionally displayed.
[0036] In one implementation, the method for filtering pictures according to Embodiment 1 may further include step 104: performing a predetermined processing on the selected pictures according to an instruction of the user (herein referred to as a fourth instruction), e.g., while displaying the selected pictures, displaying a virtual function key which corresponds to a preset function of replicating or moving the selected pictures to another folder, so that all the selected pictures are replicated or moved to that folder when the user clicks the virtual function key. The other folder may be established in advance or concurrently with the replication or movement, and the embodiment is not limited thereto. In addition, the above preset function is also an example, and the virtual function key may correspond to another preset function in another embodiment, e.g., sending to a designated address, printing, etc., so that the selected pictures are sent to the designated address or printed once the virtual function key is clicked.
[0037] Next, the steps below will be described in detail with reference to the drawings.
[0038] Fig. 2 is a flowchart of a method for determining a filtering condition according to Embodiment 1 of the present disclosure. As illustrated in Fig. 2, the method includes: Step 201: determining the filtering type of the filtering condition according to a first instruction; and
[0039] Step 202: determining the range of the filtering type according to a second instruction.
[0040] When step 201 is carried out, the filtering type may be displayed on the display interface according to an instruction from a User Interface (UI). In an implementation, a plurality of filtering type icons may be displayed on the display interface of the electronic device, wherein each filtering type icon indicates one filtering type.
[0041] Fig. 3 is a schematic diagram where filtering type icons are displayed on a display interface according to Embodiment 1 of the present disclosure. As illustrated in Fig. 3, the display interface 300 includes a filtering type icon display area 301 and a picture display area 302, wherein the filtering type icon display area 301 is located at the upper portion of the display interface. In another embodiment, the filtering type icon display area 301 may also be located at another portion (e.g., the lower portion, or the left or right side) of the display interface, or float in front of the picture display area in a semi-transparent display format.
[0042] In this embodiment, as illustrated in Fig. 3, a filtering type icon 3011 is displayed in the filtering type icon display area 301. The filtering type indicated by the filtering type icon 3011 for example may be the picture-taken time (TIME), the character (FACE), the picture-taken place (PLACE), the picture color (COLOR), etc.
[0043] In this embodiment, a certain filtering type may be determined from a plurality of filtering types indicated by a plurality of filtering type icons 3011, according to a first instruction from the user or a preset first instruction. In an implementation, the icon 3011 corresponding to a determined filtering type may be highlighted to emphasize the determined filtering type.
[0044] In step 202, for the determined filtering type, the range thereof is set. In an implementation, a second instruction may be received through the display interface of the electronic device, and the range of the filtering type is determined according to the second instruction.
[0045] Next, the determination of the range of the filtering type will be described through an example in which the picture-taken time (TIME) is determined as the filtering type.
[0046] Fig. 4 is a schematic diagram of determining a time range according to Embodiment 1 of the present disclosure. As illustrated in Fig. 4, when the filtering type is determined as the picture-taken time (TIME), a timeline 401 may be displayed on the display interface 300. By operating the timeline, the user can generate a second instruction indicating to determine the time range as one or more time points, or one or more time periods, etc. To be noted, the unit of the time point or time period may be hour, day, month, year, etc., which may be changed according to the user's setting.
[0047] In addition, in Fig. 4 the timeline 401 covers the filtering type icon display area 301, but the present disclosure is not limited thereto. In another embodiment, the timeline 401 may be located in another area of the display interface 300, rather than covering the filtering type icon display area 301.
[0048] Next, the generation of the second instruction will be described through an example of the timeline displayed in months as illustrated in Fig. 4. If the user clicks or selects a particular date on the timeline, such as April 1, 2013, an instruction "time range = April 1, 2013" is generated. If the user clicks two dates respectively within a certain operation interval, such as April 13, 2013 and April 27, 2013, an instruction "time range = April 13, 2013 to April 27, 2013" is generated. There are many manners for generating the second instruction according to the user's operation. The above description is just an example, while the present disclosure is not limited thereto, and the relation between the user operation and the second instruction may be defined based on the relevant art.
[0049] In addition, the range of the filtering type may be emphasized on the timeline. As illustrated in Fig. 4, when the second instruction "time range = April 13, 2013 to April 27, 2013" is generated, the time period may be emphasized on the timeline.
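Expressed in code, one or two taps on the timeline reduce to a single (start, end) pair forming the time range. The sketch below is only one hypothetical way of mapping the clicks to that range.

```python
from datetime import date


def time_range_from_clicks(clicks: list) -> tuple:
    """Map one or two dates clicked on the timeline to a time range.

    One click selects a single day; two clicks select the period between them,
    regardless of the order in which they were tapped.
    """
    if len(clicks) == 1:
        return clicks[0], clicks[0]
    first, second = clicks[:2]
    return (first, second) if first <= second else (second, first)


# The example used above: April 13, 2013 and April 27, 2013.
print(time_range_from_clicks([date(2013, 4, 27), date(2013, 4, 13)]))
# (datetime.date(2013, 4, 13), datetime.date(2013, 4, 27))
```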
[0050] Fig. 4 illustrates how to generate the second instruction for the picture-taken time by using the timeline as an example. In addition, the second instruction may be generated in other manners. For example, after the filtering type is determined, an input field may be displayed on the display interface 300, and the user may input specific time information in the input field, so as to generate the second instruction.
[0051] The electronic device receives the generated second instruction, and determines the range of the filtering type according to the second instruction. In an implementation, after the range of the filtering type is determined, it may be displayed in the display area of the filtering type icon 3011. As illustrated in Fig. 5, when the value of the picture-taken time (TIME) is set as "time range = April 13, 2013 to April 27, 2013", "time range = April 13, 2013 to April 27, 2013" may be displayed in the display area corresponding to "picture-taken time (TIME)".
[0052] In the above description, how to set the range of the filtering type is described
through an example in which the picture-taken time (TIME) and the range thereof are taken as the filtering condition. Next, how to set the range of the filtering type in step 202 will be described through an example in which the character (FACE), the picture-taken place (PLACE) or the picture color (COLOR) and the range thereof are taken as the filtering type, respectively.
[0053] Fig. 6 is a schematic diagram of setting a character range according to Embodiment 1 of the present disclosure. As illustrated in Fig. 6, when the character (FACE) is determined as the filtering type, a character list 601 may be displayed on the display interface 300. The user may generate a second instruction by selecting the name or icon of a particular character in the character list, and the second instruction indicates to determine the character range as one or more characters.
[0054] In an implementation, when it is indicated to determine the character range as plural characters, the logical relation between the values thereof may be further indicated as "logical AND", or "logical OR", etc. For example, after the user selects three characters "Angelababy", "Han Geng" and "Wang Likun", he can further select the relation between the three characters to be "logical OR", i.e., once the picture contains any one of the three characters "Angelababy", "Han Geng" and "Wang Likun", it meets the filtering condition. However, if the user selects the relation between the three characters to be "logical AND", only a picture simultaneously containing the above three characters meets the filtering condition.
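The "logical AND"/"logical OR" choice can be checked directly against the character names recorded for a picture. The helper below is a minimal, hypothetical sketch of that check.

```python
def matches_characters(picture_people: set, wanted: set, relation: str = "OR") -> bool:
    """Check a picture's character attribute against the selected character range.

    relation="OR":  any one of the wanted characters is enough.
    relation="AND": the picture must contain all of the wanted characters.
    """
    if relation == "AND":
        return wanted.issubset(picture_people)
    return bool(wanted & picture_people)


wanted = {"Angelababy", "Han Geng", "Wang Likun"}
print(matches_characters({"Han Geng", "Jerry"}, wanted, relation="OR"))   # True
print(matches_characters({"Han Geng", "Jerry"}, wanted, relation="AND"))  # False
```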
[0055] In this embodiment, the attribute data of each picture may be added with the
character attribute of the picture. In an implementation, the character attribute of the picture may be artificially added to the attribute data of the picture in advance, e.g., the names of all characters in the picture or just the name of the major character may be added. In another implementation, the faces in the picture may be automatically recognized according to the face template data of particular characters, and the information of the characters in the picture may be automatically added to the attribute data of the picture according to the recognition result.
[0056] The electronic device receives the generated second instruction, and determines the range of the filtering type according to the second instruction.
[0057] In addition, the information of the characters may not be added to each picture in advance; instead, after the range of the filtering type is set as one or more particular characters, the face template data of those particular characters is compared with the picture to be filtered, and it is directly judged whether the character attribute of the picture meets the filtering condition according to the comparison result. Thus, the time for preprocessing the picture can be reduced, but the time for filtering the picture may be increased accordingly.
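One possible way to perform such an on-the-fly comparison is with an off-the-shelf face library; the sketch below uses the third-party face_recognition package, which is an assumption of this example rather than the face-template technique of the disclosure, and the file names are hypothetical.

```python
import face_recognition  # third-party library; one possible face-matching backend


def picture_contains_person(picture_path: str, template_encoding) -> bool:
    """Compare a stored face template against every face found in the picture."""
    image = face_recognition.load_image_file(picture_path)
    faces_in_picture = face_recognition.face_encodings(image)
    matches = face_recognition.compare_faces(faces_in_picture, template_encoding)
    return any(matches)


# Build the template encoding once from a reference photo of the character.
reference = face_recognition.load_image_file("tomy_reference.jpg")
tomy_encoding = face_recognition.face_encodings(reference)[0]
print(picture_contains_person("holiday_001.jpg", tomy_encoding))
```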
[0058] Fig. 7 is a schematic diagram of setting a place range according to Embodiment 1 of the present disclosure. As illustrated in Fig. 7, when the picture-taken place is determined as the filtering type, a distance line 701 may be displayed on the display interface 300 to indicate a distance between the picture-taken place and the user residence. In another implementation, the display interface 300 may directly display a list of picture-taken places. In addition, the display interface may also display other information indicating places. The user may generate a second instruction by selecting the distance between the picture-taken place and the user residence or particular places in the list of picture-taken places. The second instruction indicates to determine the place range as a certain distance, or one or more particular places.
In this embodiment, the attribute data of each picture may be added with the place attribute of the picture. In an implementation, the place information of the picture may be artificially added to the attribute data of the picture in advance, e.g., the city name of the picture-taken place and/or the distance between the picture-taken place and the user residence is added. In another implementation, the positioning function of the camera device may be activated to automatically record the coordinates of the geographical location of the camera device when the picture is taken, thereby acquiring the information of the distance between the picture-taken place and the user residence, or other information of the picture-taken place.
The electronic device receives the generated second instruction, and determines the range of the filtering type according to the instruction.
[0059] In addition, when the color (COLOR) is selected as the filtering type, a color palette or a color bar may be displayed on the display interface 300. The user generates a second instruction by selecting particular colors, and the second instruction indicates to set the color range as one or more colors.
[0060] In this embodiment, the attribute data of each picture may be added with the color attribute of the picture. In an implementation, the colors in the picture may be analyzed, e.g., using a color picking technology, to determine the main colors of the picture, and information of the main colors is added to the attribute data of the picture. Of course, the present disclosure is not limited thereto, and information of other colors may also be used.
[0061] The electronic device receives the generated second instruction, and determines the value of the filtering type according to the second instruction.
[0062] In addition, in some implementations, the color information may not be added to each picture in advance; instead, after the range of the color filtering type is set as one or more particular colors, the colors of the picture are analyzed and the color attribute of the picture is extracted, then it is judged whether the picture meets the filtering condition by comparing the color attribute of the picture with the range of the color filtering type. Thus, the time for preprocessing the picture can be reduced, but the time for filtering the picture may be increased accordingly.
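One common way to approximate the main colors of a picture is to reduce it to a small palette and keep the most frequent entries. The sketch below does this with Pillow's adaptive quantization; the thumbnail size and palette size are arbitrary choices of this example, not values from the disclosure.

```python
from PIL import Image  # Pillow


def main_colors(path: str, count: int = 3) -> list:
    """Return the `count` most frequent colors after reducing the picture to a small palette."""
    with Image.open(path) as img:
        small = img.convert("RGB").resize((64, 64))   # analyse a thumbnail for speed
        paletted = small.quantize(colors=8)           # adaptive 8-color palette
        palette = paletted.getpalette()
        counts = sorted(paletted.getcolors(), reverse=True)  # (pixel_count, palette_index)
    colors = []
    for _, index in counts[:count]:
        r, g, b = palette[3 * index: 3 * index + 3]
        colors.append((r, g, b))
    return colors
```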
[0063] For the above embodiments, similarly to the situation where the time is determined as the filtering type, the character list 601, the distance line 701 and the color palette may cover the filtering type icon display area 301, or they may be located in another area of the display interface 300, rather than covering the filtering type icon display area 301.
[0064] In addition, after the filtering type is determined, an input field may be displayed on the display interface 300, and the user may input specific characters, places, colors, etc. in the input field, so as to generate the second instruction.
[0065] In addition, after the step of determining the filtering condition is finished according to the second instruction, the character range, the place range and/or the color range may be displayed in the display area of the filtering type icon 3011.
[0066] In addition, in the above embodiment, during the step of determining the filtering condition, the pictures to be filtered may be displayed in the picture display area 302 of the display interface 300, and the selected pictures may be displayed in the picture display area 302 after the pictures are filtered, thereby directly reflecting the difference between the pictures before and after the filtration.
[0067] In another embodiment, the picture display area 302 may not display the pictures to be filtered, but just displays the selected pictures after the pictures are filtered, so as to display a more extensive range of filtering type in the step of determining the filtering condition.
[0068] To be noted, the time, character, place or color is used as the filtering type as an example and the present disclosure is not limited thereto. Other filtering types may also be set upon the user's actual demand.
[0069] In an embodiment of the present disclosure, a filtering mode may be determined according to a third instruction, so as to determine the filtering condition. In an implementation, the filtering mode(s) of the pictures may be preset, each including one or more filtering conditions, i.e., a preset filtering mode may be used to determine multiple filtering conditions. In addition, one or more filtering modes may be set. Thus, by determining a filtering mode, its corresponding filtering condition(s) may be conveniently determined.
[0070] For example, if the pictures of character A in the electronic device are continuously updated, filtering mode 1 may be preset, which takes "character" as the filtering type and "character A" as the range of the filtering type. The pictures of character A can then be conveniently filtered just by selecting filtering mode 1, without needing to determine the filtering type and the range thereof each time, thereby improving the convenience. In addition, when multiple filtrations are performed, the filtering type and the range thereof for the multiple filtrations can be determined more conveniently by determining the filtering mode.
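A filtering mode can therefore be stored simply as a named entry in a query table mapping the mode to its preset filtering condition(s), so that choosing the mode yields every condition at once. A hypothetical sketch, reusing the (type, range) pairing from the earlier example:

```python
# Hypothetical query table from filtering mode to its preset filtering condition(s).
FILTERING_MODES = {
    "mode 1": [("FACE", {"character A"})],
    "April 2013 in Shanghai": [
        ("TIME", ("2013-04-01", "2013-04-30")),
        ("PLACE", {"Shanghai"}),
    ],
}


def conditions_for_mode(mode_name: str) -> list:
    """Third-instruction handling: look the chosen mode up in the table of presets."""
    return FILTERING_MODES[mode_name]
```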
[0071] In the embodiment of the present disclosure, a new filtering condition may be determined based on the pictures previously selected, i.e., the new filtering type and the range thereof are determined to judge whether each of the selected pictures meets the new filtering condition, and the pictures meeting the new filtering condition are displayed as other selected pictures, thereby performing multiple filtrations on the pictures. For example, after the pictures are filtered by using the time as the filtering type, the selected pictures may be filtered by using the character as the filtering type, then filtered by using the place as the filtering type, and the pictures obtained after the multiple filtrations meet each filtering condition. Thus, the pictures may be filtered by combining several filtering types, thereby improving the flexibility of the picture filtration.
[0072] In another implementation of an embodiment of the present disclosure, in a case where multiple filtrations are to be performed, the filtering step may be temporarily deferred when a filtering condition is determined, and other filtering conditions may continue to be determined. When at least two filtering conditions are determined, the pictures are filtered according to the determined at least two filtering conditions and the received or preset filtering operation instruction, i.e., the pictures are filtered under multiple filtering conditions through only a single filtering operation. Thus the steps of multiple filtrations are reduced, and the multiple filtrations become more convenient.
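Filtering under several determined conditions in a single operation then amounts to keeping only the pictures that satisfy every condition, as in the minimal sketch below (the per-condition check mirrors the one sketched for step 102, and all names are illustrative).

```python
def filter_with_all(pictures: dict, conditions: list) -> list:
    """Single filtering operation over several (filter_type, value_range) conditions."""
    def meets(attrs, filter_type, value_range):
        value = attrs.get(filter_type)
        if value is None:
            return False
        if filter_type == "TIME":
            start, end = value_range
            return start <= value <= end
        return value in value_range

    return [path for path, attrs in pictures.items()
            if all(meets(attrs, t, r) for t, r in conditions)]


# Determine two conditions first, then filter once.
conditions = [("TIME", ("2013-04-13", "2013-04-27")), ("PLACE", {"Shanghai"})]
```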
[0073] In an embodiment of the present disclosure, after the filtered pictures are acquired, as illustrated in Fig. 5, a subsequent processing icon (the aforementioned virtual function key) 503 may be displayed on the display interface 300. The user may generate a subsequent processing instruction by operating the subsequent processing icon 503, and perform subsequent processing on the selected pictures according to the subsequent processing instruction. For example, as mentioned previously, classified folders may be established for the selected pictures according to the received subsequent processing instruction. The selected pictures are replicated or moved into the classified folders for storage, or shortcuts to the selected pictures are stored in those folders. Thus, the pictures in the electronic device can be conveniently classified and consolidated, and the pictures can be viewed, transmitted or shared conveniently through the classified folders when the electronic device is connected to another device. In addition, the selected pictures may also be directly transmitted, printed, etc.
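The replication of the selected pictures into a classified folder maps onto ordinary file operations; a minimal sketch is given below, with the folder name and the copy-only behaviour as illustrative assumptions.

```python
import shutil
from pathlib import Path


def replicate_to_folder(selected_pictures: list, folder: str) -> None:
    """Fourth-instruction handling: copy every selected picture into a classified folder."""
    target = Path(folder)
    target.mkdir(parents=True, exist_ok=True)  # create the folder if it does not exist yet
    for picture in selected_pictures:
        shutil.copy2(picture, target / Path(picture).name)


# Example: consolidate the pictures selected by the filtration above.
replicate_to_folder(["a.jpg"], "Shanghai_April_2013")
```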
[0074] As can be seen from the above embodiment, by determining the picture filtering type and the range thereof, the pictures can be filtered flexibly and conveniently, and multiple filtrations of the pictures can be realized easily. In addition, through a subsequent processing of the selected pictures, the pictures in the electronic device can be conveniently classified and consolidated.
[0075] Embodiment 2
[0076] Embodiment 2 of the present disclosure provides an apparatus for filtering pictures, which corresponds to the method for filtering pictures in Embodiment 1, and the same contents are omitted herein.
[0077] Fig. 8 is a structure diagram of an apparatus for filtering pictures according to Embodiment 2 of the present disclosure. As illustrated in Fig. 8, the apparatus 800 for filtering pictures includes: a determining unit 801 configured to determine a filtering condition according to user's selection, the filtering condition including a filtering type and a range thereof; a judging unit 802 configured to judge whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and a displaying unit 803 configured to display pictures meeting the filtering condition as selected pictures.
[0078] In an implementation, the determining unit 801 may determine at least two filtering conditions, and the judging unit 802 may filter the pictures according to the at least two filtering conditions.
[0079] In another implementation, the apparatus 800 for filtering pictures may further
include a post-processing unit 804 configured to store the pictures acquired through the filtration (i.e., the selected pictures) to a preset location, or perform a subsequent processing such as storage, transmission, printing, etc.
[0080] For the specific working modes of the respective units in this embodiment, please refer to those of the corresponding steps in Embodiment 1; details are omitted herein. To be noted, regarding the apparatus 800 for filtering pictures, only the structures of the parts related to the embodiment of the present disclosure are illustrated; please refer to the relevant art for the other parts.
[0081] Fig. 9 is a structure diagram of a determining unit 801 according to Embodiment 2 of the present disclosure. As illustrated in Fig. 9, in Embodiment 2 of the present disclosure, the determining unit 801 may include: a filtering type determining unit 901 configured to determine the filtering type of the filtering condition; and a range determining unit 902 configured to determine the range of the filtering type according to the second instruction.
[0082] For the specific working modes of the respective parts of the determining unit 801 in this embodiment, please refer to those of the corresponding steps in Embodiment 1; details are omitted herein.
[0083] In one embodiment, the determining unit 801 may also include a filtering mode determining unit and a filtering condition determining unit, wherein the filtering mode determining unit determines a filtering mode according to a third instruction, and the filtering condition determining unit determines the filtering condition according to the filtering mode. In an implementation, the filtering condition determining unit for example may include a memory which stores a query table of the correspondence between the filtering mode and the filtering condition, and the filtering condition corresponding to the filtering mode can be obtained by inquiring the query table.
[0084] As can be seen from the above embodiment, by determining the picture filtering
condition, the apparatus for filtering pictures can select pictures flexibly and conveniently, and multiple filtrations of the pictures can be realized easily. In addition, through a subsequent processing of the selected pictures, the pictures in the electronic device can be conveniently classified and consolidated.
[0085] Embodiment 3
[0086] The embodiment of the present disclosure further provides an electronic device
including the apparatus for filtering pictures as described in Embodiment 2.
[0087] Fig. 10 is a block diagram of a system structure of an electronic device 1000
according to Embodiment 3 of the present disclosure, including a picture processing and displaying apparatus 160 which includes the apparatus 800 for filtering pictures as described in Embodiment 2. To be noted, the diagram is schematic, and the structure may be supplemented or replaced with another type of structure to realize the telecom function or other function.
[0088] As illustrated in Fig. 10, the electronic device 1000 may further include a Central Processing Unit (CPU) 100, a communication module 110, an input unit 120, an audio processor 130, a memory 140, a camera 150 and a power supply 170.
[0089] The CPU 100 (sometimes referred to as a controller or operation control, and including a microprocessor or other processor unit and/or logic unit) receives an input, and controls the respective parts of the electronic device 1000 and their operations. The input unit 120 provides an input to the CPU 100. The input unit 120 for example is a key or a touch input means. The camera 150 is configured to capture image data, and provide the captured image data to the CPU 100 for conventional usage, such as storage, transmission, etc.
[0090] The power supply 170 is configured to supply power to the electronic device 1000.
The picture processing and displaying apparatus 160 is configured to process and display an object such as picture, video and text.
[0091] The memory 140 is coupled to the CPU 100. The memory 140 may be a solid state memory, such as a Read Only Memory (ROM), a Random Access Memory (RAM), a SIM card, etc.; it may also be a memory which retains information even when the power is off, which can be selectively erased and provided with more data, and an example of which is sometimes referred to as an EPROM. The memory 140 may also be of another type. The memory 140 includes a buffer memory 141 (sometimes referred to as a buffer). The memory 140 may include an application/function storage section 142 configured to store application programs and function programs, and to perform the operation flow of the electronic device 1000 through the CPU 100.
[0092] The memory 140 may further include a data storage section 143 configured to store data, such as contact persons, digital data, pictures, sounds and/or any other data used by the electronic device. A drive program storage section 144 of the memory 140 may include various drive programs of the electronic device for the communication function and/or for performing other functions of the electronic device (e.g., messaging application, address book application, etc.).
[0093] The communication module 110 is a transmitter/receiver 110 which transmits and receives signals via an antenna 111. The communication module (transmitter/receiver) 110 is coupled to the CPU 100, so as to provide an input signal and receive an output signal, which may be the same as in the case of a conventional mobile communication terminal.
[0094] Based on different communication technologies, the same electronic device may be provided with a plurality of communication modules 110, such as a cellular network module, a Bluetooth module and/or a wireless local area network (WLAN) module. The communication module (transmitter/receiver) 110 is further coupled to a speaker 131 via the audio processor 130, so as to provide an audio output via the speaker 131. The audio processor 130 may include any suitable buffer, decoder, amplifier, etc.
[0095] The embodiment of the present disclosure further provides a computer readable program, which when being executed in an electronic device, enables a computer to perform the method for filtering pictures as described in Embodiment 1 in the electronic device.
[0096] The embodiment of the present disclosure further provides a storage medium storing a computer readable program which enables a computer to perform the method for filtering pictures as described in Embodiment 1 in an electronic device.
[0097] The preferred embodiments of the present disclosure are described above with reference to the drawings. Many features and advantages of those embodiments are apparent from the detailed specification, thus the appended claims are intended to cover all such features and advantages of those embodiments which fall within the true spirit and scope thereof. In addition, since numerous modifications and changes are easily conceivable to a person skilled in the art, the embodiments of the present disclosure are not limited to the exact structure and operation as illustrated and described, but cover all suitable modifications and equivalents falling within the scope thereof.
[0098] It shall be understood that each of the parts of the present disclosure may be implemented by hardware, software, firmware, or combinations thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by an appropriate instruction executing system. For example, if the implementation uses hardware, it may be realized by any one of the following technologies known in the art, or combinations thereof, as in another embodiment: a discrete logic circuit having a logic gate circuit for realizing logic functions of data signals, an application-specific integrated circuit having an appropriate combined logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
[0099] Any process, method or block in the flowchart or described in other manners herein may be understood as being indicative of including one or more modules, segments or parts for realizing the codes of executable instructions of the steps in specific logic functions or processes, and that the scope of the preferred embodiments of the present disclosure include other implementations, wherein the functions may be executed in manners different from those shown or discussed (e.g., according to the related functions in a substantially simultaneous manner or in a reverse order), which shall be understood by a person skilled in the art.
[0100] The logic and/or steps shown in the flowcharts or described in other manners herein may be, for example, understood as a sequencing list of executable instructions for realizing logic functions, which may be implemented in any computer readable medium, for use by an instruction executing system, apparatus or device (such as a system based on a computer, a system including a processor, or other systems capable of extracting instructions from an instruction executing system, apparatus or device and executing the instructions), or for use in combination with the instruction executing system, apparatus or device.
[0101 ] The above literal descriptions and drawings show various features of the present disclosure. It shall be understood that a person of ordinary skill in the art may prepare suitable computer codes to carry out each of the steps and processes described above and illustrated in the drawings. It shall also be understood that the above-described terminals, computers, servers, and networks, etc. may be any type, and the computer codes may be prepared according to the disclosure contained herein to carry out the present disclosure by using the apparatuses.
[0102] Particular embodiments of the present disclosure have been disclosed herein. A
person skilled in the art will readily recognize that the present disclosure is applicable in other environments. In practice, there exist many embodiments and implementations. The appended claims are by no means intended to limit the scope of the present disclosure to the above particular embodiments. Furthermore, any reference to "an apparatus configured to" is an explanation of apparatus plus function for describing elements and claims, and it is not intended that any element not using the expression "an apparatus configured to" be understood as an apparatus-plus-function element, even though the wording "apparatus" is included in that claim.
[0103] Although the present disclosure has been illustrated and described through a
particular preferred embodiment or multiple embodiments, it is obvious that equivalent amendments and modifications are conceivable to a person skilled in the art upon reading and understanding the description and drawings. Especially for various functions executed by the above elements (parts, components, apparatuses, and compositions, etc.), unless otherwise specified, it is desirable that the terms (including the reference to the "apparatus") describing these elements correspond to any element executing the specific functions of these elements (i.e., functional equivalence), even though the element is structurally different from that executing the function in an exemplary embodiment or multiple embodiments illustrated in the present disclosure. Furthermore, although the specific features of the present disclosure have been described with respect to only one or more of the illustrated embodiments, such features may be combined with one or more other features in other embodiments as desired and in consideration of benefiting any given or specific application.

Claims
1. A method for filtering pictures, which is adapted to filter pictures in an electronic device, comprising:
determining a filtering condition according to user's selection, the filtering condition including a filtering type and a range corresponding thereto;
judging whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and
displaying selected pictures meeting the filtering condition.
2. The method for filtering pictures according to claim 1, further comprising:
determining another filtering condition according to the user's selection, the other filtering condition including another filtering type and a range corresponding thereto;
judging whether each picture in the selected pictures meets the other filtering condition according to attributes of the selected pictures; and
displaying other selected pictures meeting the other filtering condition.
3. The method for filtering pictures according to any one of claims 1-2, wherein determining a filtering condition according to user's selection comprises:
determining the filtering type of the filtering condition according to a first instruction; and
determining the range of the filtering type according to a second instruction.
4. The method for filtering pictures according to any one of claims 1-3, wherein determining a filtering condition according to user's selection comprises:
determining a filtering mode according to a third instruction, the filtering mode corresponding to at least one preset filtering condition; and
determining the filtering condition according to the filtering mode.
5. The method for filtering pictures according to any one of claims 1-4, further comprising storing the selected pictures to a preset location according to a fourth instruction.
6. The method for filtering pictures according to any one of claims 1-5, wherein the number of the filtering condition is at least one, each comprising one filtering type and one range thereof.
7. The method for filtering pictures according to any one of claims 1-6, wherein the filtering type comprises at least one of: a place of the picture, time of the picture, a character of the picture, a color of the picture, text of the picture and/or audio corresponding to the picture.
8. An apparatus for filtering pictures, which is adapted to filter pictures in an electronic device, comprising:
a determining unit configured to determine a filtering condition according to user's selection, the filtering condition including a filtering type and a range corresponding thereto;
a judging unit configured to judge whether each picture meets the filtering condition according to attributes of pictures in the electronic device; and
a displaying unit configured to display selected pictures meeting the filtering condition.
9. The apparatus for filtering pictures according to claim 8, wherein the determining unit comprises:
a filtering type determining unit configured to determine the filtering type of the filtering condition according to a first instruction; and
a range determining unit configured to determine the range of the filtering type according to a second instruction.
10. The apparatus for filtering pictures according to any one of claims 8-9, wherein the determining unit comprises:
a filtering mode determining unit configured to determine a filtering mode according to a third instruction, the filtering mode corresponding to at least one preset filtering condition; and
a filtering condition determining unit configured to determine the filtering condition according to the filtering mode.
11. The apparatus for filtering pictures according to any one of claims 8-10, further comprising:
a post-processing unit configured to store the selected pictures to a preset location according to a fourth instruction.
12. An electronic device comprising the apparatus for filtering pictures according to any one of claims 8-11.
PCT/IB2014/000101 2013-11-01 2014-02-04 Method and apparatus for filtering pictures WO2015063551A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310534552.6 2013-11-01
CN201310534552.6A CN104598483A (en) 2013-11-01 2013-11-01 Picture filtering method and device and electronic device

Publications (1)

Publication Number Publication Date
WO2015063551A1 true WO2015063551A1 (en) 2015-05-07

Family

ID=50239689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/000101 WO2015063551A1 (en) 2013-11-01 2014-02-04 Method and apparatus for filtering pictures

Country Status (2)

Country Link
CN (1) CN104598483A (en)
WO (1) WO2015063551A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105262810B (en) * 2015-09-29 2019-04-26 北京金山安全软件有限公司 Cloud backup method and device for picture and electronic equipment
CN106599207A (en) * 2016-12-15 2017-04-26 北京小米移动软件有限公司 Picture filtering method and device
CN106961559B (en) * 2017-03-20 2019-03-05 维沃移动通信有限公司 A kind of production method and electronic equipment of video
CN110019028A (en) * 2017-08-08 2019-07-16 阿里巴巴集团控股有限公司 The methods, devices and systems and equipment of garbled data
CN111143590A (en) * 2019-12-25 2020-05-12 上海云从企业发展有限公司 Image filtering method, system, device and machine readable medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5510329B2 (en) * 2008-09-05 2014-06-04 ソニー株式会社 Content recommendation system, content recommendation method, content recommendation device, program, and information storage medium
CN104239336B (en) * 2013-06-19 2018-03-16 华为技术有限公司 A kind of method for screening images, device and terminal

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050166156A1 (en) * 2004-01-23 2005-07-28 Microsoft Corporation System and method for automatically grouping items
US20050278328A1 (en) * 2004-06-04 2005-12-15 Marston Michael J Sorting and filtering techniques for products, namely posters and artwork
EP1783636A1 (en) * 2004-08-24 2007-05-09 Sony Corporation Image display, image displaying method, and computer program
US7716157B1 (en) * 2006-01-26 2010-05-11 Adobe Systems Incorporated Searching images with extracted objects
US7779358B1 (en) * 2006-11-30 2010-08-17 Adobe Systems Incorporated Intelligent content organization based on time gap analysis
US8533232B1 (en) * 2007-03-30 2013-09-10 Google Inc. Method and system for defining relationships among labels
US20080301128A1 (en) * 2007-06-01 2008-12-04 Nate Gandert Method and system for searching for digital assets
US20100064254A1 (en) * 2008-07-08 2010-03-11 Dan Atsmon Object search and navigation method and system
US20130024801A1 (en) * 2011-07-19 2013-01-24 Disney Enterprises, Inc. Method and System for Providing a Compact Graphical User Interface for Flexible Filtering of Data
US20130202216A1 (en) * 2012-02-08 2013-08-08 International Business Machines Corporation Object tag metadata and image search

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110245246A (en) * 2019-04-30 2019-09-17 维沃移动通信有限公司 A kind of image display method and terminal device
CN110245246B (en) * 2019-04-30 2021-11-16 维沃移动通信有限公司 Image display method and terminal equipment
CN112905812A (en) * 2021-02-01 2021-06-04 上海德拓信息技术股份有限公司 Media file auditing method and system
CN112905812B (en) * 2021-02-01 2023-07-11 上海德拓信息技术股份有限公司 Media file auditing method and system

Also Published As

Publication number Publication date
CN104598483A (en) 2015-05-06

Similar Documents

Publication Publication Date Title
WO2015063551A1 (en) Method and apparatus for filtering pictures
KR101343609B1 (en) Apparatus and Method for Automatically recommending Application using Augmented Reality Data
CN107193944B (en) Theme pushing method, terminal, server and computer-readable storage medium
US9491281B2 (en) Apparatus and method for displaying unchecked messages in a terminal
US10635713B2 (en) Method and device for replacing the application visual control
CN106325674B (en) Message prompting method and device
KR101491592B1 (en) Terminal and method for displaying contents thereof
US20130152024A1 (en) Electronic device and page zooming method thereof
CN106777214B (en) Photo album picture ordering method and mobile terminal
US9251404B2 (en) Name bubble handling
RU2703956C1 (en) Method of managing multimedia files, an electronic device and a graphical user interface
KR20170022967A (en) Method and device for displaying badge of icon
RU2648616C2 (en) Font addition method and apparatus
CN105094549A (en) Method and device for displaying messages
WO2018000643A1 (en) Method and device for sorting photographs
EP2712166B1 (en) Method, information processing apparatus and computer program for visually dividing a file containing multiple images
US20140043255A1 (en) Electronic device and image zooming method thereof
CN106201212A (en) Generation method, device and the mobile terminal of a kind of application icon
CN109085982B (en) Content identification method and device and mobile terminal
EP3242197A1 (en) Desktop sharing method and mobile terminal
CN104991910A (en) Album creation method and apparatus
CN105094975A (en) Method and device for calling application program
CN104182419A (en) Method and device for processing character information in picture
CN106201509A (en) A kind of method for information display, device and mobile terminal
CN110390641B (en) Image desensitizing method, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14708936

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14708936

Country of ref document: EP

Kind code of ref document: A1